You searched for +publisher:"Princeton University" +contributor:("Vanderbei, Robert"). Showing records 1 – 2 of 2 total matches.

No search limiters apply to these results.


Princeton University

1. Pang, Haotian. The Application of Optimization Techniques in Machine Learning.

Degree: PhD, 2017, Princeton University

The past decades have witnessed significant progress in machine learning, and solving the resulting problems requires corresponding advances in optimization techniques. High-dimensional sparse learning has posed a great computational challenge for large-scale data analysis. In this dissertation, the parametric simplex method is applied to solve a broad class of sparse learning problems that can be formulated as linear programs parametrized by a regularization parameter. Existing methods for these problems have a serious drawback: tuning the parameter to reach the desired solution is very inefficient. A customized parametric simplex method is introduced that uses the unknown weighting factor as the parameter, providing a powerful and efficient way to address this shortcoming. Although the simplex method has exponential complexity in the worst case, the parametric simplex method is shown to be well suited to these problems when the expected solution is sparse. An R package named fastclime, which efficiently solves a variety of machine learning problems via the customized parametric simplex method, is developed. A convex optimization method named the Inexact Peaceman-Rachford Splitting Method (IPRSM) is also studied. Like the Alternating Direction Method of Multipliers (ADMM), the strictly contractive Peaceman-Rachford Splitting Method (PRSM) solves convex minimization problems with linear constraints and a separable objective function. In many applications it is expensive to obtain exact solutions to the resulting subproblems; inexact methods handle the iterative subproblems when exact solutions do not exist or are hard to obtain. Finally, a new graph Perceptron algorithm is proposed: a graph estimation method that performs online binary classification. It is a new kernel-based Perceptron, derived from online classification and extended to online graph estimation with a new kernel trick. Advisors/Committee Members: Vanderbei, Robert (advisor).
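
The splitting-method portion of this abstract is easiest to see on a concrete instance. Below is a minimal sketch of the strictly contractive PRSM iteration (scaled dual form) on a toy problem, min ||x||_1 + (1/2)||z - c||^2 subject to x = z, whose solution is soft-thresholding of c. The problem, the parameters beta and alpha, and the iteration count are illustrative assumptions; in particular the sketch solves each subproblem exactly, so it does not reproduce the inexact (IPRSM) variant that is the dissertation's actual subject.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (exact x-subproblem here).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prsm(c, beta=1.0, alpha=0.9, iters=200):
    # Strictly contractive PRSM with scaled dual u = lambda/beta for
    #   min ||x||_1 + 0.5*||z - c||^2  subject to  x - z = 0.
    x = np.zeros_like(c)
    z = np.zeros_like(c)
    u = np.zeros_like(c)
    for _ in range(iters):
        # x-step: prox of (1/beta)*||.||_1 evaluated at z - u.
        x = soft_threshold(z - u, 1.0 / beta)
        # First relaxed dual update: the extra half-step that
        # distinguishes PRSM from ADMM.
        u = u + alpha * (x - z)
        # z-step: minimize 0.5*||z - c||^2 + (beta/2)*||x - z + u||^2.
        z = (c + beta * (x + u)) / (1.0 + beta)
        # Second relaxed dual update.
        u = u + alpha * (x - z)
    return x

c = np.array([3.0, 0.5, -2.0])
print(prsm(c))                  # approx. [2., 0., -1.]
print(soft_threshold(c, 1.0))   # closed-form answer for comparison

The two relaxed dual updates per pass, with alpha in (0, 1), are what make PRSM strictly contractive; dropping the first update and setting alpha = 1 in the second recovers plain ADMM.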

Subjects/Keywords: Convex Optimization; Linear Programming; Machine Learning; Online Learning; Parametric Simplex Method



APA (6th Edition):

Pang, H. (2017). The Application of Optimization Techniques in Machine Learning. (Doctoral dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01fb494c071

Chicago Manual of Style (16th Edition):

Pang, Haotian. “The Application of Optimization Techniques in Machine Learning.” 2017. Doctoral Dissertation, Princeton University. Accessed November 13, 2019. http://arks.princeton.edu/ark:/88435/dsp01fb494c071.

MLA Handbook (7th Edition):

Pang, Haotian. “The Application of Optimization Techniques in Machine Learning.” 2017. Web. 13 Nov 2019.

Vancouver:

Pang H. The Application of Optimization Techniques in Machine Learning. [Internet] [Doctoral dissertation]. Princeton University; 2017. [cited 2019 Nov 13]. Available from: http://arks.princeton.edu/ark:/88435/dsp01fb494c071.

Council of Science Editors:

Pang H. The Application of Optimization Techniques in Machine Learning. [Doctoral dissertation]. Princeton University; 2017. Available from: http://arks.princeton.edu/ark:/88435/dsp01fb494c071

2. Fang, Xingyuan. Some Interactions of Modern Optimization and Statistics.

Degree: PhD, 2016, Princeton University

This dissertation addresses several challenging problems using state-of-the-art modern optimization and statistics. We first consider optimal two-stage adaptive enrichment designs for randomized trials, using sparse linear programming. Adaptive enrichment designs involve preplanned rules for modifying enrollment criteria based on data accruing in a randomized trial; such designs have been proposed as a way to learn which populations benefit from an experimental treatment. Two critical components of adaptive enrichment designs are the decision rule for modifying enrollment and the multiple testing procedure. We provide the first general framework for simultaneously optimizing both components for two-stage adaptive enrichment designs, minimizing expected sample size under constraints on power and the familywise Type I error rate. Next, we consider high-dimensional spatial graphical model estimation under a total cardinality constraint. Although this problem is highly nonconvex, we show that its primal-dual gap diminishes linearly with the dimensionality and provide a convex-geometry justification of this “blessing of massive scale” phenomenon. Motivated by this result, we propose an efficient algorithm to solve the dual problem and prove that the solution achieves optimal statistical properties. Finally, we consider hypothesis testing and confidence intervals under high-dimensional proportional hazards models. Motivated by a geometric projection principle, we propose a unified likelihood-ratio inferential framework, including score, Wald, and partial likelihood ratio statistics for hypothesis testing. Without assuming model selection consistency, we derive the asymptotic distributions of these test statistics, establish their semiparametric optimality, and conduct power analysis under Pitman alternatives. We also develop procedures to construct pointwise confidence intervals for the baseline hazard function and the conditional hazard function. Advisors/Committee Members: Liu, Han (advisor), Vanderbei, Robert (advisor).
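
To give the linear programming framing in this abstract some shape: the design-optimization problem minimizes expected sample size over (randomized) design rules, subject to linear constraints encoding power and familywise Type I error. The dissertation's actual formulation is a large sparse LP over two-stage decision rules; the toy below, with invented numbers and scipy.optimize.linprog, only illustrates the objective/constraint structure.

from scipy.optimize import linprog

# Hypothetical candidate designs: (sample size, power, familywise error).
# All numbers are invented purely for illustration.
designs = [
    (200, 0.70, 0.030),
    (300, 0.82, 0.045),
    (450, 0.93, 0.040),
]
n     = [d[0] for d in designs]
power = [d[1] for d in designs]
fwer  = [d[2] for d in designs]

# Objective: expected sample size sum_i n_i x_i of a randomized rule x.
c = n
# Inequality constraints in A_ub @ x <= b_ub form:
#   -sum_i power_i x_i <= -0.80   (power at least 80%)
#    sum_i fwer_i  x_i <=  0.05   (familywise Type I error at most 5%)
A_ub = [[-p for p in power], fwer]
b_ub = [-0.80, 0.05]
# Equality constraint: x is a probability distribution over designs.
A_eq = [[1.0, 1.0, 1.0]]
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3)
print(res.x, res.fun)  # optimal mixture of designs and its expected sample size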

Subjects/Keywords: Clinical Trial; Graphical Model; High-Dimensional Statistics; Optimization; Survival Analysis



APA (6th Edition):

Fang, X. (2016). Some Interactions of Modern Optimization and Statistics. (Doctoral dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01b2773z11q

Chicago Manual of Style (16th Edition):

Fang, Xingyuan. “Some Interactions of Modern Optimization and Statistics.” 2016. Doctoral Dissertation, Princeton University. Accessed November 13, 2019. http://arks.princeton.edu/ark:/88435/dsp01b2773z11q.

MLA Handbook (7th Edition):

Fang, Xingyuan. “Some Interactions of Modern Optimization and Statistics.” 2016. Web. 13 Nov 2019.

Vancouver:

Fang X. Some Interactions of Modern Optimization and Statistics. [Internet] [Doctoral dissertation]. Princeton University; 2016. [cited 2019 Nov 13]. Available from: http://arks.princeton.edu/ark:/88435/dsp01b2773z11q.

Council of Science Editors:

Fang X. Some Interactions of Modern Optimization and Statistics. [Doctoral dissertation]. Princeton University; 2016. Available from: http://arks.princeton.edu/ark:/88435/dsp01b2773z11q
