Full Record


Title Provable alternating minimization for non-convex learning problems
Discipline/Department Electrical and Computer Engineering
University/Publisher University of Texas at Austin
Abstract Alternating minimization (AltMin) is a generic term for a widely popular approach in non-convex learning: often, it is possible to partition the variables into two (or more) sets so that the problem is convex/tractable in one set if the other is held fixed (and vice versa). This allows for alternating between optimally updating one set of variables and then the other. AltMin methods typically do not have associated global consistency guarantees, even though they are empirically observed to perform better than methods (e.g., those based on convex optimization) that do have guarantees. In this thesis, we obtain rigorous performance guarantees for AltMin in three statistical learning settings: low-rank matrix completion, phase retrieval, and learning sparsely-used dictionaries. The overarching theme behind our results consists of two parts: (i) devising new initialization procedures (as opposed to initializing randomly, as is typical), and (ii) establishing exponential local convergence from this initialization. Our work shows that the pursuit of statistical guarantees can yield algorithmic improvements (initialization, in our case) that perform better in practice.
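For illustration, the alternating scheme described in the abstract can be sketched for the matrix completion setting as alternating least squares: with one factor fixed, each row of the other factor solves a small regularized least-squares problem. This is a minimal sketch, not the thesis's exact algorithm; the SVD-based initialization, the ridge parameter `lam`, and the function name are assumptions for the example.

```python
import numpy as np

def altmin_matrix_completion(M, mask, rank, iters=50, lam=1e-6):
    """Alternating least squares for low-rank matrix completion (illustrative).

    M    : matrix with observed entries (values outside mask are ignored)
    mask : boolean array, True where an entry of M is observed
    rank : target rank of the recovered matrix
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    # Spectral-style initialization: top singular vectors of the zero-filled
    # observed matrix (random initialization is also common in practice).
    U, _, _ = np.linalg.svd(np.where(mask, M, 0.0), full_matrices=False)
    U = U[:, :rank]
    V = rng.standard_normal((n, rank))
    for _ in range(iters):
        # Fix U: each column of V solves a ridge-regularized least squares
        # over the rows where that column of M is observed.
        for j in range(n):
            rows = mask[:, j]
            A = U[rows]
            V[j] = np.linalg.solve(A.T @ A + lam * np.eye(rank), A.T @ M[rows, j])
        # Fix V: symmetrically, each row of U solves its own least squares.
        for i in range(m):
            cols = mask[i]
            B = V[cols]
            U[i] = np.linalg.solve(B.T @ B + lam * np.eye(rank), B.T @ M[i, cols])
    return U @ V.T
```

Each subproblem is convex (ordinary least squares), which is exactly the structure AltMin exploits even though the joint problem over (U, V) is non-convex.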
Subjects/Keywords Alternating minimization; Alternating least squares; Matrix completion; Phase retrieval; Dictionary learning; Sparse dictionaries; Iterative methods; Non-convex optimization
Contributors Sanghavi, Sujay Rajendra, 1979- (advisor)
Language en
Country of Publication us
Record ID handle:2152/25931
Repository texas
Date Indexed 2018-10-22
