Advanced search options


You searched for `subject:(augmented lagrange multiplier)`

Showing records 1 – 3 of 3 total matches.


1. Cabral, Ricardo da Silveira. Unifying Low-Rank Models for Visual Learning.

Degree: 2015, Carnegie Mellon University

URL: http://repository.cmu.edu/dissertations/506

Many problems in signal processing, machine learning and computer vision can be solved by learning low-rank models from data. In computer vision, problems such as rigid structure from motion have been formulated as an optimization over subspaces with fixed rank. These hard-rank constraints have traditionally been imposed by a factorization that parameterizes subspaces as a product of two matrices of fixed rank. Whilst factorization approaches lead to efficient and kernelizable optimization algorithms, they have been shown to be NP-hard in the presence of missing data. Inspired by recent work in compressed sensing, hard-rank constraints have been replaced by soft-rank constraints, such as the nuclear norm regularizer. Vis-à-vis hard-rank approaches, soft-rank models are convex even in the presence of missing data: but how is convex optimization solving an NP-hard problem? This thesis addresses this question by analyzing the relationship between hard- and soft-rank constraints in the unsupervised factorization with missing data problem. Moreover, we extend soft-rank models to weakly supervised and fully supervised learning problems in computer vision. There are four main contributions of our work: (1) The analysis of a new unified low-rank model for matrix factorization with missing data. Our model subsumes soft- and hard-rank approaches and merges advantages from previous formulations, such as efficient algorithms and kernelization. It also provides justifications on the choice of algorithms and regions that guarantee convergence to global minima. (2) A deterministic "rank continuation" strategy for the NP-hard unsupervised factorization with missing data problem, which is highly competitive with the state of the art and often achieves globally optimal solutions. In preliminary work, we show that this optimization strategy is applicable to other NP-hard problems which are typically relaxed to convex semidefinite programs (e.g., MAX-CUT, the quadratic assignment problem).
(3) A new soft-rank fully supervised robust regression model. This convex model is able to deal with noise, outliers and missing data in the input variables. (4) A new soft-rank model for weakly supervised image classification and localization. Unlike existing multiple-instance approaches for this problem, our model is convex.
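The soft-rank relaxation described in this abstract rests on the nuclear norm's proximal operator, singular value thresholding, which shrinks singular values rather than imposing a hard rank. A minimal NumPy sketch of that operator (the function name, matrix sizes, and threshold are illustrative, not the thesis's unified model):

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of
    tau * ||X||_* (nuclear norm). Each singular value is shrunk
    by tau; small ones drop to zero, lowering the effective rank
    without a hard rank constraint."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Illustrative check: a rank-2 matrix plus small dense noise.
# The noise singular values fall below tau and are zeroed, so
# the rank of the result is at most 2.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
noisy = M + 0.01 * rng.standard_normal((20, 15))
L = svt(noisy, tau=0.5)
```

This is the basic building block behind soft-rank solvers; the thesis's contribution concerns how such relaxations relate to the hard-rank factorizations, not this operator itself.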

Subjects/Keywords: Computer vision; Machine learning; Low-rank matrices; Convex optimization; Bilinear factorization; Augmented Lagrange multiplier method


APA (6th Edition):

Cabral, R. d. S. (2015). Unifying Low-Rank Models for Visual Learning. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/506

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cabral, Ricardo da Silveira. “Unifying Low-Rank Models for Visual Learning.” 2015. Thesis, Carnegie Mellon University. Accessed December 07, 2019. http://repository.cmu.edu/dissertations/506.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cabral, Ricardo da Silveira. “Unifying Low-Rank Models for Visual Learning.” 2015. Web. 07 Dec 2019.

Vancouver:

Cabral RdS. Unifying Low-Rank Models for Visual Learning. [Internet] [Thesis]. Carnegie Mellon University; 2015. [cited 2019 Dec 07]. Available from: http://repository.cmu.edu/dissertations/506.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cabral RdS. Unifying Low-Rank Models for Visual Learning. [Thesis]. Carnegie Mellon University; 2015. Available from: http://repository.cmu.edu/dissertations/506

Not specified: Masters Thesis or Doctoral Dissertation

University of Alabama

2. Meng, Ming. Three essays on more powerful unit root tests with non-normal errors.

Degree: 2013, University of Alabama

URL: http://purl.lib.ua.edu/97218

This dissertation is concerned with finding ways to improve the power of unit root tests, and it consists of three essays. In the first essay, we extend the Lagrange Multiplier (LM) unit root tests of Schmidt and Phillips (1992) to utilize information contained in non-normal errors. The new tests adopt the Residual Augmented Least Squares (RALS) estimation procedure of Im and Schmidt (2008). This essay complements the work of Im, Lee and Tieslau (2012), who adopt the RALS procedure for DF-based tests. This essay provides the relevant asymptotic distribution and the corresponding critical values of the new tests. The RALS-LM tests show improved power over the RALS-DF tests. Moreover, the main advantage of the RALS-LM tests lies in the invariance feature that their distribution does not depend on the nuisance parameter in the presence of level breaks. The second essay tests the Prebisch-Singer hypothesis by examining paths of primary commodity prices, which are known to exhibit multiple structural breaks. In order to examine the issue more properly, we first suggest new unit root tests that can allow for structural breaks in both the intercept and the slope. Then, we adopt the RALS procedure to gain much improved power when the error term follows a non-normal distribution. Since the suggested test is more powerful and free of nuisance parameters, rejection of the null can be considered more accurate evidence of stationarity. We apply the new test to the recently extended Grilli and Yang index of 24 commodity series from 1900 to 2007. The empirical findings provide significant evidence to support that primary commodity prices are stationary with one or two trend breaks. However, compared with past studies, they provide even weaker evidence to support the Prebisch-Singer hypothesis. The third essay extends the Fourier Lagrange Multiplier (FLM) unit root tests of Enders and Lee (2012a) by using the RALS estimation procedure of Im and Schmidt (2008). 
While the FLM-type tests can be used to control for smooth structural breaks of an unknown functional form, the RALS procedure can utilize additional higher-moment information contained in non-normal errors. For these new tests, knowledge of the underlying type of non-normal distribution of the error term or the precise functional form of the structural breaks is not required. Our simulation results demonstrate significant power gains over the FLM tests in the presence of non-normal errors. (Published By University of Alabama Libraries)
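The RALS augmentation the abstract describes is mechanical: regress, form higher-moment functions of the residuals, and regress again with those terms added. A hedged sketch applied to a plain Dickey-Fuller regression (the dissertation's tests are LM- and Fourier-LM-based with their own critical values; only the augmentation step of Im and Schmidt (2008) is illustrated, and the function names are mine):

```python
import numpy as np

def _ols(X, y):
    """OLS fit returning coefficients and residuals."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

def rals_df_tstat(y):
    """RALS-augmented Dickey-Fuller regression
    dy_t = a + rho * y_{t-1} + e_t, returning the t-statistic
    on rho. Illustrative sketch, not the dissertation's test."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    _, e = _ols(X, dy)                      # step 1: residuals
    m2, m3 = np.mean(e**2), np.mean(e**3)
    # Step 2: higher-moment terms, informative only when the
    # errors are non-normal.
    w = np.column_stack([e**2 - m2, e**3 - m3 - 3.0 * m2 * e])
    Xa = np.column_stack([X, w])
    beta, resid = _ols(Xa, dy)              # step 3: augmented fit
    sigma2 = resid @ resid / (len(dy) - Xa.shape[1])
    cov = sigma2 * np.linalg.inv(Xa.T @ Xa)
    return beta[1] / np.sqrt(cov[1, 1])     # t-stat on rho
```

For a stationary series the statistic is strongly negative; its null distribution, and hence the critical values one should actually use, must come from the papers cited in the abstract.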
*Advisors/Committee Members: Lee, Junsoo, Reed, Robert, Ma, Jun, Mobbs, Shawn, Sun, Min, University of Alabama. Dept. of Economics, Finance, and Legal Studies.*

Subjects/Keywords: Electronic Thesis or Dissertation; Economics; Lagrange Multiplier; Non-normal Errors; Prebisch-Singer Hypothesis; Residual Augmented Least Squares; Structural Breaks; Unit Root Tests


APA (6th Edition):

Meng, M. (2013). Three essays on more powerful unit root tests with non-normal errors. (Thesis). University of Alabama. Retrieved from http://purl.lib.ua.edu/97218

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Meng, Ming. “Three essays on more powerful unit root tests with non-normal errors.” 2013. Thesis, University of Alabama. Accessed December 07, 2019. http://purl.lib.ua.edu/97218.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Meng, Ming. “Three essays on more powerful unit root tests with non-normal errors.” 2013. Web. 07 Dec 2019.

Vancouver:

Meng M. Three essays on more powerful unit root tests with non-normal errors. [Internet] [Thesis]. University of Alabama; 2013. [cited 2019 Dec 07]. Available from: http://purl.lib.ua.edu/97218.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Meng M. Three essays on more powerful unit root tests with non-normal errors. [Thesis]. University of Alabama; 2013. Available from: http://purl.lib.ua.edu/97218

Not specified: Masters Thesis or Doctoral Dissertation

University of Central Florida

3. Oreifej, Omar. Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition.

Degree: 2013, University of Central Florida

URL: https://stars.library.ucf.edu/etd/2569

In this dissertation, we discuss the problem of robust linear subspace estimation using low-rank optimization and propose three formulations of it. We demonstrate how these formulations can be used to solve fundamental computer vision problems, and provide superior performance in terms of accuracy and running time. Consider a set of observations extracted from images (such as pixel gray values, local features, trajectories, etc.). If the assumption that these observations are drawn from a linear subspace (or can be linearly approximated) is valid, then the goal is to represent each observation as a linear combination of a compact basis, while maintaining a minimal reconstruction error. One of the earliest, yet most popular, approaches to achieve that is Principal Component Analysis (PCA). However, PCA can only handle Gaussian noise, and thus suffers when the observations are contaminated with gross and sparse outliers. To this end, in this dissertation, we focus on estimating the subspace robustly using low-rank optimization, where the sparse outliers are detected and separated through the ℓ1 norm. The robust estimation has a two-fold advantage: First, the obtained basis better represents the actual subspace because it does not include contributions from the outliers. Second, the detected outliers are often of specific interest in many applications, as we will show throughout this thesis. We demonstrate four different formulations and applications for low-rank optimization. First, we consider the problem of reconstructing an underwater sequence by removing the turbulence caused by the water waves. The main drawback of most previous attempts to tackle this problem is that they heavily depend on modelling the waves, which in fact is ill-posed since the actual behavior of the waves along with the imaging process are complicated and include several noise components; therefore, their results are not satisfactory. 
In contrast, we propose a novel approach which outperforms the state of the art. The intuition behind our method is that in a sequence where the water is static, the frames would be linearly correlated. Therefore, in the presence of water waves, we may consider the frames as noisy observations drawn from the subspace of linearly correlated frames. However, the noise introduced by the water waves is not sparse, and thus cannot directly be detected using low-rank optimization. Therefore, we propose a data-driven two-stage approach, where the first stage “sparsifies” the noise, and the second stage detects it. The first stage leverages the temporal mean of the sequence to overcome the structured turbulence of the waves through an iterative registration algorithm. The result of the first stage is a high-quality mean and a better structured sequence; however, the sequence still contains unstructured sparse noise. Thus, we employ a second stage at which we extract the sparse errors from the sequence through rank minimization. Our method converges faster, and drastically outperforms the state of the art on all…
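The low-rank-plus-sparse separation this abstract relies on is commonly solved with an inexact augmented Lagrange multiplier (ALM) iteration that alternates singular value thresholding for the low-rank term with soft thresholding for the sparse term. A minimal sketch under that standard scheme (the function names are mine and the parameter defaults follow common practice, not necessarily this dissertation's formulations):

```python
import numpy as np

def shrink(X, tau):
    """Soft thresholding: proximal operator of the l1 norm,
    which isolates the sparse outliers."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: proximal operator of the
    nuclear norm, which yields the low-rank term."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_ialm(M, lam=None, n_iter=200):
    """Decompose M into L (low rank) + S (sparse) via an inexact
    augmented Lagrange multiplier iteration; Y is the multiplier
    and mu the growing penalty weight."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)
    mu_max, rho = mu * 1e7, 1.2
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)       # dual (multiplier) update
        mu = min(rho * mu, mu_max)
    return L, S
```

On data that fits the model — a genuinely low-rank matrix corrupted by sparse gross errors — this iteration recovers both terms to high accuracy, which is what makes the detected sparse component usable as the signal of interest (turbulence, moving objects) in the applications above.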
*Advisors/Committee Members: Shah, Mubarak.*

Subjects/Keywords: low rank representation; low rank; sparse representation; sparse; activity recognition; turbulence mitigation; video denoising; complex event recognition; nuclear norm; augmented Lagrange multiplier; camera motion estimation; TRECVID; HOHA; water waves; rank; trajectories; particle advection; registration; decomposition; moving object detection; background subtraction; atmospheric turbulence; Computer Engineering; Engineering; Dissertations, Academic – Engineering and Computer Science; Engineering and Computer Science – Dissertations, Academic


APA (6th Edition):

Oreifej, O. (2013). Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition. (Doctoral Dissertation). University of Central Florida. Retrieved from https://stars.library.ucf.edu/etd/2569

Chicago Manual of Style (16th Edition):

Oreifej, Omar. “Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition.” 2013. Doctoral Dissertation, University of Central Florida. Accessed December 07, 2019. https://stars.library.ucf.edu/etd/2569.

MLA Handbook (7th Edition):

Oreifej, Omar. “Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition.” 2013. Web. 07 Dec 2019.

Vancouver:

Oreifej O. Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition. [Internet] [Doctoral dissertation]. University of Central Florida; 2013. [cited 2019 Dec 07]. Available from: https://stars.library.ucf.edu/etd/2569.

Council of Science Editors:

Oreifej O. Robust Subspace Estimation Using Low-rank Optimization. Theory And Applications In Scene Reconstruction, Video Denoising, And Activity Recognition. [Doctoral Dissertation]. University of Central Florida; 2013. Available from: https://stars.library.ucf.edu/etd/2569