
You searched for `subject:(Low rank matrix correction)`. Showing records 1 – 30 of 23,375 total matches.


Search Limiters

Dates

- 2017 – 2021 (6546)
- 2012 – 2016 (9295)
- 2007 – 2011 (5417)
- 2002 – 2006 (1860)
- 1997 – 2001 (628)
- 1992 – 1996 (318)
- 1987 – 1991 (206)
- 1982 – 1986 (123)
- 1977 – 1981 (70)
- 1972 – 1976 (72)

Universities

- Brno University of Technology (1196)
- University of São Paulo (514)
- University of Michigan (423)
- NSYSU (416)
- Virginia Tech (382)
- Texas A&M University (378)
- Delft University of Technology (374)
- National University of Singapore (323)
- Georgia Tech (321)
- University of Florida (301)
- University of Illinois – Urbana-Champaign (283)
- University of Texas – Austin (278)
- University of Manchester (240)
- University of Waterloo (229)
- The Ohio State University (224)

Department

- Electrical Engineering (585)
- Electrical and Computer Engineering (455)
- Mechanical Engineering (361)
- Physics (305)
- Mathematics (164)
- Computer Science (156)
- Civil Engineering (143)
- Biomedical Engineering (142)
- Chemistry (139)
- Materials Science and Engineering (128)
- Chemical Engineering (114)
- Aerospace Engineering (100)
- Petroleum Engineering (96)
- Psychology (95)
- Statistics (89)

Degrees

Levels

- doctoral (8625)
- masters (4128)
- thesis (436)
- 1 (32)
- dissertation (31)
- doctor of philosophy (ph.d.) (19)
- doctor of philosophy ph.d. (16)
- project/capstone (11)

Country

- US (8855)
- France (1586)
- Canada (1582)
- Brazil (1554)
- Czech Republic (1200)
- Sweden (1130)
- UK (920)
- Netherlands (880)
- Australia (738)
- Greece (589)
- South Africa (531)
- Taiwan (416)
- India (404)
- Singapore (323)
- Japan (313)


Delft University of Technology

1. Swart, Wouter (author). Methods for improving the computational performance of sequentially linear analysis.

Degree: 2018, Delft University of Technology

URL: http://resolver.tudelft.nl/uuid:dc35a7e3-beb7-4d46-88c6-36e6f980a597

► The numerical simulation of brittle failure with nonlinear finite element analysis (NLFEA) remains a challenge due to robustness issues. These problems are attributed to the… (more)

Subjects/Keywords: Finite Element Analysis; Preconditioning; Structural analysis; Direct method; Iterative method; Low-rank matrix correction

APA (6th Edition):

Swart, W. (2018). Methods for improving the computational performance of sequentially linear analysis. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:dc35a7e3-beb7-4d46-88c6-36e6f980a597

Chicago Manual of Style (16th Edition):

Swart, Wouter. “Methods for improving the computational performance of sequentially linear analysis.” 2018. Masters Thesis, Delft University of Technology. Accessed January 20, 2021. http://resolver.tudelft.nl/uuid:dc35a7e3-beb7-4d46-88c6-36e6f980a597.

MLA Handbook (7th Edition):

Swart, Wouter. “Methods for improving the computational performance of sequentially linear analysis.” 2018. Web. 20 Jan 2021.

Vancouver:

Swart W. Methods for improving the computational performance of sequentially linear analysis. [Internet] [Masters thesis]. Delft University of Technology; 2018. [cited 2021 Jan 20]. Available from: http://resolver.tudelft.nl/uuid:dc35a7e3-beb7-4d46-88c6-36e6f980a597.

Council of Science Editors:

Swart W. Methods for improving the computational performance of sequentially linear analysis. [Masters Thesis]. Delft University of Technology; 2018. Available from: http://resolver.tudelft.nl/uuid:dc35a7e3-beb7-4d46-88c6-36e6f980a597
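The “low-rank matrix correction” keyword attached to record 1 refers to correcting an easily inverted operator by a low-rank term. A minimal sketch of the Sherman–Morrison–Woodbury identity that such corrected preconditioners rely on; all names, sizes, and matrices below are illustrative, not taken from the thesis:

```python
import numpy as np

# Woodbury identity (illustrative sketch, not code from the thesis):
# (A + U V^T)^{-1} b = A^{-1} b - A^{-1} U (I + V^T A^{-1} U)^{-1} V^T A^{-1} b

def woodbury_solve(A_inv_mv, U, V, b):
    """Solve (A + U V^T) x = b, given a routine that applies A^{-1}."""
    k = U.shape[1]
    Ainv_b = A_inv_mv(b)
    Ainv_U = A_inv_mv(U)
    S = np.eye(k) + V.T @ Ainv_U          # small k x k capacitance matrix
    return Ainv_b - Ainv_U @ np.linalg.solve(S, V.T @ Ainv_b)

rng = np.random.default_rng(0)
n, k = 50, 3
A = np.diag(rng.uniform(1.0, 2.0, n))     # easy-to-invert base operator
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))
b = rng.standard_normal(n)

x = woodbury_solve(lambda Y: np.diag(1.0 / np.diag(A)) @ Y, U, V, b)
assert np.allclose((A + U @ V.T) @ x, b)  # matches a direct solve
```

The point is that only a small k × k system is solved, so a rank-k correction adds little cost on top of applying A⁻¹.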

2. Biradar, Rakesh. Analysis and Prediction of Community Structure Using Unsupervised Learning.

Degree: MS, 2016, Worcester Polytechnic Institute

URL: etd-012616-134431 ; https://digitalcommons.wpi.edu/etd-theses/138

► In this thesis, we perform analysis and prediction for community structures in graphs using unsupervised learning. The methods we use require the data matrices to…
(more)

Subjects/Keywords: eRPCA; Community Prediction; Low Rank; Sparse Matrix


APA (6th Edition):

Biradar, R. (2016). Analysis and Prediction of Community Structure Using Unsupervised Learning. (Thesis). Worcester Polytechnic Institute. Retrieved from etd-012616-134431 ; https://digitalcommons.wpi.edu/etd-theses/138

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Biradar, Rakesh. “Analysis and Prediction of Community Structure Using Unsupervised Learning.” 2016. Thesis, Worcester Polytechnic Institute. Accessed January 20, 2021. etd-012616-134431 ; https://digitalcommons.wpi.edu/etd-theses/138.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Biradar, Rakesh. “Analysis and Prediction of Community Structure Using Unsupervised Learning.” 2016. Web. 20 Jan 2021.

Vancouver:

Biradar R. Analysis and Prediction of Community Structure Using Unsupervised Learning. [Internet] [Thesis]. Worcester Polytechnic Institute; 2016. [cited 2021 Jan 20]. Available from: etd-012616-134431 ; https://digitalcommons.wpi.edu/etd-theses/138.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Biradar R. Analysis and Prediction of Community Structure Using Unsupervised Learning. [Thesis]. Worcester Polytechnic Institute; 2016. Available from: etd-012616-134431 ; https://digitalcommons.wpi.edu/etd-theses/138

Not specified: Masters Thesis or Doctoral Dissertation

Temple University

3. Shank, Stephen David. Low-rank solution methods for large-scale linear matrix equations.

Degree: PhD, 2014, Temple University

URL: http://digital.library.temple.edu/u?/p245801coll10,273331

Mathematics

► We consider low-rank solution methods for certain classes of large-scale linear matrix equations. Our aim is to adapt existing low-rank solution methods based on…
(more)

Subjects/Keywords: Applied mathematics;


APA (6th Edition):

Shank, S. D. (2014). Low-rank solution methods for large-scale linear matrix equations. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,273331

Chicago Manual of Style (16th Edition):

Shank, Stephen David. “Low-rank solution methods for large-scale linear matrix equations.” 2014. Doctoral Dissertation, Temple University. Accessed January 20, 2021. http://digital.library.temple.edu/u?/p245801coll10,273331.

MLA Handbook (7th Edition):

Shank, Stephen David. “Low-rank solution methods for large-scale linear matrix equations.” 2014. Web. 20 Jan 2021.

Vancouver:

Shank SD. Low-rank solution methods for large-scale linear matrix equations. [Internet] [Doctoral dissertation]. Temple University; 2014. [cited 2021 Jan 20]. Available from: http://digital.library.temple.edu/u?/p245801coll10,273331.

Council of Science Editors:

Shank SD. Low-rank solution methods for large-scale linear matrix equations. [Doctoral Dissertation]. Temple University; 2014. Available from: http://digital.library.temple.edu/u?/p245801coll10,273331
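Record 3's approach rests on a standard observation: Lyapunov-type matrix equations with low-rank right-hand sides have solutions whose singular values decay rapidly, so low-rank factors suffice. A toy numerical check of that decay (a hypothetical example solved by the dense Kronecker form; real low-rank solvers never form this n² × n² system):

```python
import numpy as np

# Toy Lyapunov equation A X + X A^T + B B^T = 0 with a stable diagonal A
# and a rank-1 right-hand side, solved via the vectorized (Kronecker) form.
rng = np.random.default_rng(1)
n = 40
A = -np.diag(rng.uniform(1.0, 4.0, n))        # stable: eigenvalues in [-4, -1]
B = rng.standard_normal((n, 1))

# vec(A X + X A^T) = (I kron A + A kron I) vec(X), column-major vec
K = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
X = np.linalg.solve(K, -(B @ B.T).reshape(-1, order="F")).reshape(n, n, order="F")

assert np.allclose(A @ X + X @ A.T + B @ B.T, 0, atol=1e-8)
s = np.linalg.svd(X, compute_uv=False)
assert s[10] / s[0] < 1e-6                    # numerical rank far below n
```

The fast singular-value decay is what makes storing only a thin factor of X, rather than the full n × n solution, feasible at large scale.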

Colorado School of Mines

4. Yang, Dehui. Structured low-rank matrix recovery via optimization methods.

Degree: PhD, Electrical Engineering, 2018, Colorado School of Mines

URL: http://hdl.handle.net/11124/172154

► From single-molecule microscopy in biology, to collaborative filtering in recommendation systems, to quantum state tomography in physics, many scientific discoveries involve solving ill-posed inverse problems,…
(more)

Subjects/Keywords: matrix completion; models; super-resolution; modal analysis; low-rank; optimization


APA (6th Edition):

Yang, D. (2018). Structured low-rank matrix recovery via optimization methods. (Doctoral Dissertation). Colorado School of Mines. Retrieved from http://hdl.handle.net/11124/172154

Chicago Manual of Style (16th Edition):

Yang, Dehui. “Structured low-rank matrix recovery via optimization methods.” 2018. Doctoral Dissertation, Colorado School of Mines. Accessed January 20, 2021. http://hdl.handle.net/11124/172154.

MLA Handbook (7th Edition):

Yang, Dehui. “Structured low-rank matrix recovery via optimization methods.” 2018. Web. 20 Jan 2021.

Vancouver:

Yang D. Structured low-rank matrix recovery via optimization methods. [Internet] [Doctoral dissertation]. Colorado School of Mines; 2018. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/11124/172154.

Council of Science Editors:

Yang D. Structured low-rank matrix recovery via optimization methods. [Doctoral Dissertation]. Colorado School of Mines; 2018. Available from: http://hdl.handle.net/11124/172154

Georgia Tech

5. Rangel Walteros, Pedro Andres. A non-asymptotic study of low-rank estimation of smooth kernels on graphs.

Degree: PhD, Mathematics, 2014, Georgia Tech

URL: http://hdl.handle.net/1853/52988

► This dissertation investigates the problem of estimating a kernel over a large graph based on a sample of noisy observations of linear measurements of the…
(more)

Subjects/Keywords: Low-rank matrix completion; Kernels on graphs; High dimensional probability


APA (6th Edition):

Rangel Walteros, P. A. (2014). A non-asymptotic study of low-rank estimation of smooth kernels on graphs. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/52988

Chicago Manual of Style (16th Edition):

Rangel Walteros, Pedro Andres. “A non-asymptotic study of low-rank estimation of smooth kernels on graphs.” 2014. Doctoral Dissertation, Georgia Tech. Accessed January 20, 2021. http://hdl.handle.net/1853/52988.

MLA Handbook (7th Edition):

Rangel Walteros, Pedro Andres. “A non-asymptotic study of low-rank estimation of smooth kernels on graphs.” 2014. Web. 20 Jan 2021.

Vancouver:

Rangel Walteros PA. A non-asymptotic study of low-rank estimation of smooth kernels on graphs. [Internet] [Doctoral dissertation]. Georgia Tech; 2014. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1853/52988.

Council of Science Editors:

Rangel Walteros PA. A non-asymptotic study of low-rank estimation of smooth kernels on graphs. [Doctoral Dissertation]. Georgia Tech; 2014. Available from: http://hdl.handle.net/1853/52988

Georgia Tech

6. Xia, Dong. Statistical inference for large matrices.

Degree: PhD, Mathematics, 2016, Georgia Tech

URL: http://hdl.handle.net/1853/55632

► This thesis covers two topics on matrix analysis and estimation in machine learning and statistics. The first topic is about density matrix estimation with application…
(more)

Subjects/Keywords: Low rank; Matrix estimation; Singular vectors; Random perturbation


APA (6th Edition):

Xia, D. (2016). Statistical inference for large matrices. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/55632

Chicago Manual of Style (16th Edition):

Xia, Dong. “Statistical inference for large matrices.” 2016. Doctoral Dissertation, Georgia Tech. Accessed January 20, 2021. http://hdl.handle.net/1853/55632.

MLA Handbook (7th Edition):

Xia, Dong. “Statistical inference for large matrices.” 2016. Web. 20 Jan 2021.

Vancouver:

Xia D. Statistical inference for large matrices. [Internet] [Doctoral dissertation]. Georgia Tech; 2016. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1853/55632.

Council of Science Editors:

Xia D. Statistical inference for large matrices. [Doctoral Dissertation]. Georgia Tech; 2016. Available from: http://hdl.handle.net/1853/55632

Princeton University

7. Zhong, Yiqiao. Spectral methods and MLE: a modern statistical perspective.

Degree: PhD, 2019, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01gb19f8728

► Modern statistical analysis often requires the integration of statistical thinking and algorithmic thinking. There are new challenges posed for classical estimation principles. Indeed, in high-dimensional…
(more)

Subjects/Keywords: Eigenvectors; High dimensional statistics; Low rank matrices; Matrix perturbation; Nonconvex optimization; Random matrix theory


APA (6th Edition):

Zhong, Y. (2019). Spectral methods and MLE: a modern statistical perspective . (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01gb19f8728

Chicago Manual of Style (16th Edition):

Zhong, Yiqiao. “Spectral methods and MLE: a modern statistical perspective .” 2019. Doctoral Dissertation, Princeton University. Accessed January 20, 2021. http://arks.princeton.edu/ark:/88435/dsp01gb19f8728.

MLA Handbook (7th Edition):

Zhong, Yiqiao. “Spectral methods and MLE: a modern statistical perspective .” 2019. Web. 20 Jan 2021.

Vancouver:

Zhong Y. Spectral methods and MLE: a modern statistical perspective . [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2021 Jan 20]. Available from: http://arks.princeton.edu/ark:/88435/dsp01gb19f8728.

Council of Science Editors:

Zhong Y. Spectral methods and MLE: a modern statistical perspective . [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp01gb19f8728

University of Manchester

8. Borsdorf, Ruediger. Structured Matrix Nearness Problems: Theory and Algorithms.

Degree: 2012, University of Manchester

URL: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:162521

► In many areas of science one often has a given matrix, representing for example a measured data set, and is required to find a matrix that…
(more)

Subjects/Keywords: correlation matrix; factor structure; matrix embedding; Stiefel manifold; linearly structured matrix; Grassmannian manifold; low rank; optimization over manifolds; matrix nearness problems


APA (6th Edition):

Borsdorf, R. (2012). Structured Matrix Nearness Problems:Theory and Algorithms. (Doctoral Dissertation). University of Manchester. Retrieved from http://www.manchester.ac.uk/escholar/uk-ac-man-scw:162521

Chicago Manual of Style (16th Edition):

Borsdorf, Ruediger. “Structured Matrix Nearness Problems:Theory and Algorithms.” 2012. Doctoral Dissertation, University of Manchester. Accessed January 20, 2021. http://www.manchester.ac.uk/escholar/uk-ac-man-scw:162521.

MLA Handbook (7th Edition):

Borsdorf, Ruediger. “Structured Matrix Nearness Problems:Theory and Algorithms.” 2012. Web. 20 Jan 2021.

Vancouver:

Borsdorf R. Structured Matrix Nearness Problems:Theory and Algorithms. [Internet] [Doctoral dissertation]. University of Manchester; 2012. [cited 2021 Jan 20]. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:162521.

Council of Science Editors:

Borsdorf R. Structured Matrix Nearness Problems:Theory and Algorithms. [Doctoral Dissertation]. University of Manchester; 2012. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:162521

9. Haraldson, Joseph. Matrix Polynomials and their Lower Rank Approximations.

Degree: 2019, University of Waterloo

URL: http://hdl.handle.net/10012/14847

► This thesis is a wide ranging work on computing a “lower-rank” approximation of a matrix polynomial using second-order non-linear optimization techniques. Two notions of rank…
(more)

Subjects/Keywords: numerical linear algebra; optimization; matrix polynomial; eigenvalue; gcd; low rank; low rank approximation; polynomial eigenvalue; matrix pencil; smith form; kronecker form; kernel; approximate gcd


APA (6th Edition):

Haraldson, J. (2019). Matrix Polynomials and their Lower Rank Approximations. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/14847

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Haraldson, Joseph. “Matrix Polynomials and their Lower Rank Approximations.” 2019. Thesis, University of Waterloo. Accessed January 20, 2021. http://hdl.handle.net/10012/14847.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Haraldson, Joseph. “Matrix Polynomials and their Lower Rank Approximations.” 2019. Web. 20 Jan 2021.

Vancouver:

Haraldson J. Matrix Polynomials and their Lower Rank Approximations. [Internet] [Thesis]. University of Waterloo; 2019. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/10012/14847.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Haraldson J. Matrix Polynomials and their Lower Rank Approximations. [Thesis]. University of Waterloo; 2019. Available from: http://hdl.handle.net/10012/14847

Not specified: Masters Thesis or Doctoral Dissertation

Texas A&M University

10. Sun, Ranye. Thresholding Multivariate Regression and Generalized Principal Components.

Degree: PhD, Statistics, 2014, Texas A&M University

URL: http://hdl.handle.net/1969.1/152564

► As high-dimensional data arises from various fields in science and technology, traditional multivariate methods need to be updated. Principal component analysis and reduced rank regression…
(more)

Subjects/Keywords: cross-validation; iterative subspace projections; low-rank matrix approximation; regularization; transposable data.


APA (6th Edition):

Sun, R. (2014). Thresholding Multivariate Regression and Generalized Principal Components. (Doctoral Dissertation). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/152564

Chicago Manual of Style (16th Edition):

Sun, Ranye. “Thresholding Multivariate Regression and Generalized Principal Components.” 2014. Doctoral Dissertation, Texas A&M University. Accessed January 20, 2021. http://hdl.handle.net/1969.1/152564.

MLA Handbook (7th Edition):

Sun, Ranye. “Thresholding Multivariate Regression and Generalized Principal Components.” 2014. Web. 20 Jan 2021.

Vancouver:

Sun R. Thresholding Multivariate Regression and Generalized Principal Components. [Internet] [Doctoral dissertation]. Texas A&M University; 2014. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1969.1/152564.

Council of Science Editors:

Sun R. Thresholding Multivariate Regression and Generalized Principal Components. [Doctoral Dissertation]. Texas A&M University; 2014. Available from: http://hdl.handle.net/1969.1/152564

Université Catholique de Louvain

11. Gillis, Nicolas. Nonnegative matrix factorization: complexity, algorithms and applications.

Degree: 2011, Université Catholique de Louvain

URL: http://hdl.handle.net/2078.1/70744

► Linear dimensionality reduction techniques such as principal component analysis are powerful tools for the analysis of high-dimensional data. In this thesis, we explore a closely… (more)

Subjects/Keywords: Low-rank matrix approximation; Nonnegative matrices; Computational complexity; Optimization; Underapproximation; Data mining; Hyperspectral image analysis


APA (6th Edition):

Gillis, N. (2011). Nonnegative matrix factorization : complexity, algorithms and applications. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/70744

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gillis, Nicolas. “Nonnegative matrix factorization : complexity, algorithms and applications.” 2011. Thesis, Université Catholique de Louvain. Accessed January 20, 2021. http://hdl.handle.net/2078.1/70744.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gillis, Nicolas. “Nonnegative matrix factorization : complexity, algorithms and applications.” 2011. Web. 20 Jan 2021.

Vancouver:

Gillis N. Nonnegative matrix factorization : complexity, algorithms and applications. [Internet] [Thesis]. Université Catholique de Louvain; 2011. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/2078.1/70744.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gillis N. Nonnegative matrix factorization : complexity, algorithms and applications. [Thesis]. Université Catholique de Louvain; 2011. Available from: http://hdl.handle.net/2078.1/70744

Not specified: Masters Thesis or Doctoral Dissertation
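For context on record 11's topic, here is a hedged sketch of the classical Lee–Seung multiplicative updates for nonnegative matrix factorization; the data and sizes are made up, and the thesis itself studies complexity and algorithms well beyond this baseline:

```python
import numpy as np

# Lee-Seung multiplicative updates for NMF: X ~= W H with W, H >= 0.
rng = np.random.default_rng(0)
m, n, r = 30, 20, 4
X = rng.random((m, r)) @ rng.random((r, n))    # nonnegative data of rank r

W = rng.random((m, r)) + 0.1                   # positive random init
H = rng.random((r, n)) + 0.1
err0 = np.linalg.norm(X - W @ H)
eps = 1e-12                                    # guards against division by zero
for _ in range(300):
    H *= (W.T @ X) / (W.T @ W @ H + eps)       # updates preserve nonnegativity
    W *= (X @ H.T) / (W @ H @ H.T + eps)       # and do not increase the error
err1 = np.linalg.norm(X - W @ H)

assert err1 <= err0 and (W >= 0).all() and (H >= 0).all()
```

The multiplicative form is attractive because nonnegativity is maintained automatically, at the price of slow convergence near the optimum.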

12. Zhu, Ziwei. Distributed and Robust Statistical Learning.

Degree: PhD, 2018, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01d217qs22x

► Decentralized and corrupted data are nowadays ubiquitous, which impose fundamental challenges for modern statistical analysis. Illustrative examples are massive and decentralized data produced by distributed…
(more)

Subjects/Keywords: distributed learning; high-dimensional statistics; low-rank matrix recovery; principal component analysis; regression; robust statistics


APA (6th Edition):

Zhu, Z. (2018). Distributed and Robust Statistical Learning . (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01d217qs22x

Chicago Manual of Style (16th Edition):

Zhu, Ziwei. “Distributed and Robust Statistical Learning .” 2018. Doctoral Dissertation, Princeton University. Accessed January 20, 2021. http://arks.princeton.edu/ark:/88435/dsp01d217qs22x.

MLA Handbook (7th Edition):

Zhu, Ziwei. “Distributed and Robust Statistical Learning .” 2018. Web. 20 Jan 2021.

Vancouver:

Zhu Z. Distributed and Robust Statistical Learning . [Internet] [Doctoral dissertation]. Princeton University; 2018. [cited 2021 Jan 20]. Available from: http://arks.princeton.edu/ark:/88435/dsp01d217qs22x.

Council of Science Editors:

Zhu Z. Distributed and Robust Statistical Learning . [Doctoral Dissertation]. Princeton University; 2018. Available from: http://arks.princeton.edu/ark:/88435/dsp01d217qs22x

University of Illinois – Urbana-Champaign

13. Balasubramanian, Arvind. Applications of low-rank matrix recovery methods in computer vision.

Degree: PhD, 2012, University of Illinois – Urbana-Champaign

URL: http://hdl.handle.net/2142/31929

► The ubiquitous availability of high-dimensional data such as images and videos has generated a lot of interest in high-dimensional data analysis. One of the key…
(more)

Subjects/Keywords: Image Alignment; Texture Rectification; Low-Rank Matrix Recovery; Convex Optimization; Photometric Stereo; Principal Component Pursuit


APA (6th Edition):

Balasubramanian, A. (2012). Applications of low-rank matrix recovery methods in computer vision. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/31929

Chicago Manual of Style (16th Edition):

Balasubramanian, Arvind. “Applications of low-rank matrix recovery methods in computer vision.” 2012. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed January 20, 2021. http://hdl.handle.net/2142/31929.

MLA Handbook (7th Edition):

Balasubramanian, Arvind. “Applications of low-rank matrix recovery methods in computer vision.” 2012. Web. 20 Jan 2021.

Vancouver:

Balasubramanian A. Applications of low-rank matrix recovery methods in computer vision. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2012. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/2142/31929.

Council of Science Editors:

Balasubramanian A. Applications of low-rank matrix recovery methods in computer vision. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2012. Available from: http://hdl.handle.net/2142/31929

Georgia Tech

14. Zhou, Fan. Statistical inference for high dimensional data with low rank structure.

Degree: PhD, Mathematics, 2018, Georgia Tech

URL: http://hdl.handle.net/1853/60750

► We study two major topics on statistical inference for high dimensional data with low rank structure occurred in many machine learning and statistics applications. The…
(more)

Subjects/Keywords: Nonparametric statistics; Matrix completion; Low rank; Nuclear norm; Tensor; Singular vector perturbation


APA (6th Edition):

Zhou, F. (2018). Statistical inference for high dimensional data with low rank structure. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/60750

Chicago Manual of Style (16th Edition):

Zhou, Fan. “Statistical inference for high dimensional data with low rank structure.” 2018. Doctoral Dissertation, Georgia Tech. Accessed January 20, 2021. http://hdl.handle.net/1853/60750.

MLA Handbook (7th Edition):

Zhou, Fan. “Statistical inference for high dimensional data with low rank structure.” 2018. Web. 20 Jan 2021.

Vancouver:

Zhou F. Statistical inference for high dimensional data with low rank structure. [Internet] [Doctoral dissertation]. Georgia Tech; 2018. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1853/60750.

Council of Science Editors:

Zhou F. Statistical inference for high dimensional data with low rank structure. [Doctoral Dissertation]. Georgia Tech; 2018. Available from: http://hdl.handle.net/1853/60750

University of Texas – Austin

15. Bhojanapalli, Venkata Sesha Pavana Srinadh. Large scale matrix factorization with guarantees: sampling and bi-linearity.

Degree: PhD, Electrical and Computer Engineering, 2015, University of Texas – Austin

URL: http://hdl.handle.net/2152/32832

► Low rank matrix factorization is an important step in many high dimensional machine learning algorithms. Traditional algorithms for factorization do not scale well with the…
(more)

Subjects/Keywords: Matrix completion; Non-convex optimization; Low rank approximation; Semi-definite optimization; Tensor factorization; Scalable algorithms


APA (6th Edition):

Bhojanapalli, V. S. P. S. (2015). Large scale matrix factorization with guarantees: sampling and bi-linearity. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/32832

Chicago Manual of Style (16th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed January 20, 2021. http://hdl.handle.net/2152/32832.

MLA Handbook (7th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Web. 20 Jan 2021.

Vancouver:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/2152/32832.

Council of Science Editors:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/32832

University of Pennsylvania

16. Yang, Dan. Singular Value Decomposition for High Dimensional Data.

Degree: 2012, University of Pennsylvania

URL: https://repository.upenn.edu/edissertations/595

► Singular value decomposition is a widely used tool for dimension reduction in multivariate analysis. However, when used for statistical estimation in high-dimensional low rank matrix…
(more)

Subjects/Keywords: Cross validation; Denoise; Low rank matrix approximation; PCA; Penalization; Thresholding; Statistics and Probability


APA (6th Edition):

Yang, D. (2012). Singular Value Decomposition for High Dimensional Data. (Thesis). University of Pennsylvania. Retrieved from https://repository.upenn.edu/edissertations/595

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Yang, Dan. “Singular Value Decomposition for High Dimensional Data.” 2012. Thesis, University of Pennsylvania. Accessed January 20, 2021. https://repository.upenn.edu/edissertations/595.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Yang, Dan. “Singular Value Decomposition for High Dimensional Data.” 2012. Web. 20 Jan 2021.

Vancouver:

Yang D. Singular Value Decomposition for High Dimensional Data. [Internet] [Thesis]. University of Pennsylvania; 2012. [cited 2021 Jan 20]. Available from: https://repository.upenn.edu/edissertations/595.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Yang D. Singular Value Decomposition for High Dimensional Data. [Thesis]. University of Pennsylvania; 2012. Available from: https://repository.upenn.edu/edissertations/595

Not specified: Masters Thesis or Doctoral Dissertation
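Record 16 builds on the truncated SVD. The Eckart–Young fact that such estimators start from (the best rank-k approximation in Frobenius norm is the truncated SVD, with error equal to the tail singular values) can be checked in a few lines; sizes here are illustrative, nothing thesis-specific:

```python
import numpy as np

# Rank-k truncated SVD as the best rank-k approximation (Eckart-Young).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 12)) @ rng.standard_normal((12, 30))  # rank <= 12
A += 0.01 * rng.standard_normal((40, 30))                          # small noise

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 12
A_k = (U[:, :k] * s[:k]) @ Vt[:k]        # rank-k reconstruction

# Frobenius error of the truncation equals the norm of the tail singular values
err = np.linalg.norm(A - A_k)
assert np.isclose(err, np.linalg.norm(s[k:]))
```

In the high-dimensional regime the thesis considers, plain truncation is no longer optimal, which motivates the penalization and thresholding schemes listed in its keywords.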

Michigan Technological University

17. Azzam, Joy. Sub-Sampled Matrix Approximations.

Degree: PhD, Department of Mathematical Sciences, 2020, Michigan Technological University

URL: https://digitalcommons.mtu.edu/etdr/1002

► Matrix approximations are widely used to accelerate many numerical algorithms. Current methods sample row (or column) spaces to reduce their computational footprint and approximate…
(more)

Subjects/Keywords: Matrix Approximation; Inverse Matrix Approximation; Low Rank Approximation; Quasi-Newton; Randomized Numerical Linear Algebra; Preconditioner; Sub-Sampled; Other Applied Mathematics


APA (6th Edition):

Azzam, J. (2020). Sub-Sampled Matrix Approximations. (Doctoral Dissertation). Michigan Technological University. Retrieved from https://digitalcommons.mtu.edu/etdr/1002

Chicago Manual of Style (16th Edition):

Azzam, Joy. “Sub-Sampled Matrix Approximations.” 2020. Doctoral Dissertation, Michigan Technological University. Accessed January 20, 2021. https://digitalcommons.mtu.edu/etdr/1002.

MLA Handbook (7th Edition):

Azzam, Joy. “Sub-Sampled Matrix Approximations.” 2020. Web. 20 Jan 2021.

Vancouver:

Azzam J. Sub-Sampled Matrix Approximations. [Internet] [Doctoral dissertation]. Michigan Technological University; 2020. [cited 2021 Jan 20]. Available from: https://digitalcommons.mtu.edu/etdr/1002.

Council of Science Editors:

Azzam J. Sub-Sampled Matrix Approximations. [Doctoral Dissertation]. Michigan Technological University; 2020. Available from: https://digitalcommons.mtu.edu/etdr/1002

18.
Amadeo, Lily.
Large Scale Matrix Completion and Recommender Systems.

Degree: MS, 2015, Worcester Polytechnic Institute

URL: etd-090415-162439 ; https://digitalcommons.wpi.edu/etd-theses/1021

► "The goal of this thesis is to extend the theory and practice of matrix completion algorithms, and how they can be utilized, improved, and scaled…
(more)

Subjects/Keywords: low rank matrix; robust principal component analysis; convex relaxation; principal component analysis; recommender systems; matrix completion


APA (6th Edition):

Amadeo, L. (2015). Large Scale Matrix Completion and Recommender Systems. (Thesis). Worcester Polytechnic Institute. Retrieved from etd-090415-162439 ; https://digitalcommons.wpi.edu/etd-theses/1021

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Amadeo, Lily. “Large Scale Matrix Completion and Recommender Systems.” 2015. Thesis, Worcester Polytechnic Institute. Accessed January 20, 2021. etd-090415-162439 ; https://digitalcommons.wpi.edu/etd-theses/1021.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Amadeo, Lily. “Large Scale Matrix Completion and Recommender Systems.” 2015. Web. 20 Jan 2021.

Vancouver:

Amadeo L. Large Scale Matrix Completion and Recommender Systems. [Internet] [Thesis]. Worcester Polytechnic Institute; 2015. [cited 2021 Jan 20]. Available from: etd-090415-162439 ; https://digitalcommons.wpi.edu/etd-theses/1021.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Amadeo L. Large Scale Matrix Completion and Recommender Systems. [Thesis]. Worcester Polytechnic Institute; 2015. Available from: etd-090415-162439 ; https://digitalcommons.wpi.edu/etd-theses/1021

Not specified: Masters Thesis or Doctoral Dissertation

Georgia Tech

19. Lee, Joonseok. Local approaches for collaborative filtering.

Degree: PhD, Computer Science, 2015, Georgia Tech

URL: http://hdl.handle.net/1853/53846

► Recommendation systems are emerging as an important business application as the demand for personalized services in E-commerce increases. Collaborative filtering techniques are widely used for…
(more)

Subjects/Keywords: Recommendation systems; Collaborative filtering; Machine learning; Local low-rank assumption; Matrix factorization; Matrix approximation; Ensemble collaborative ranking


APA (6th Edition):

Lee, J. (2015). Local approaches for collaborative filtering. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/53846

Chicago Manual of Style (16th Edition):

Lee, Joonseok. “Local approaches for collaborative filtering.” 2015. Doctoral Dissertation, Georgia Tech. Accessed January 20, 2021. http://hdl.handle.net/1853/53846.

MLA Handbook (7th Edition):

Lee, Joonseok. “Local approaches for collaborative filtering.” 2015. Web. 20 Jan 2021.

Vancouver:

Lee J. Local approaches for collaborative filtering. [Internet] [Doctoral dissertation]. Georgia Tech; 2015. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1853/53846.

Council of Science Editors:

Lee J. Local approaches for collaborative filtering. [Doctoral Dissertation]. Georgia Tech; 2015. Available from: http://hdl.handle.net/1853/53846

20.
MIAO WEIMIN.
Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints.

Degree: 2013, National University of Singapore

URL: http://scholarbank.nus.edu.sg/handle/10635/37889

Subjects/Keywords: matrix completion; rank minimization; low rank; error bound; rank consistency; semi-nuclear norm


APA (6th Edition):

WEIMIN, M. (2013). Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints. (Thesis). National University of Singapore. Retrieved from http://scholarbank.nus.edu.sg/handle/10635/37889

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

WEIMIN, MIAO. “Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints.” 2013. Thesis, National University of Singapore. Accessed January 20, 2021. http://scholarbank.nus.edu.sg/handle/10635/37889.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

WEIMIN, MIAO. “Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints.” 2013. Web. 20 Jan 2021.

Vancouver:

WEIMIN M. Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints. [Internet] [Thesis]. National University of Singapore; 2013. [cited 2021 Jan 20]. Available from: http://scholarbank.nus.edu.sg/handle/10635/37889.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

WEIMIN M. Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints. [Thesis]. National University of Singapore; 2013. Available from: http://scholarbank.nus.edu.sg/handle/10635/37889

Not specified: Masters Thesis or Doctoral Dissertation

University of Manchester

21.
Borsdorf, Ruediger.
Structured matrix nearness problems : theory and algorithms.

Degree: PhD, 2012, University of Manchester

URL: https://www.research.manchester.ac.uk/portal/en/theses/structured-matrix-nearness-problemstheory-and-algorithms(554f944d-9a78-4b54-90c2-1ef06866c402).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.554165

► In many areas of science one often has a given matrix, representing for example a measured data set, and is required to find a matrix…
(more)

Subjects/Keywords: 025.04; correlation matrix; factor structure; matrix embedding; Stiefel manifold; linearly structured matrix; Grassmannian manifold; low rank; optimization over manifolds; matrix nearness problems


APA (6th Edition):

Borsdorf, R. (2012). Structured matrix nearness problems : theory and algorithms. (Doctoral Dissertation). University of Manchester. Retrieved from https://www.research.manchester.ac.uk/portal/en/theses/structured-matrix-nearness-problemstheory-and-algorithms(554f944d-9a78-4b54-90c2-1ef06866c402).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.554165

Chicago Manual of Style (16th Edition):

Borsdorf, Ruediger. “Structured matrix nearness problems : theory and algorithms.” 2012. Doctoral Dissertation, University of Manchester. Accessed January 20, 2021. https://www.research.manchester.ac.uk/portal/en/theses/structured-matrix-nearness-problemstheory-and-algorithms(554f944d-9a78-4b54-90c2-1ef06866c402).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.554165.

MLA Handbook (7th Edition):

Borsdorf, Ruediger. “Structured matrix nearness problems : theory and algorithms.” 2012. Web. 20 Jan 2021.

Vancouver:

Borsdorf R. Structured matrix nearness problems : theory and algorithms. [Internet] [Doctoral dissertation]. University of Manchester; 2012. [cited 2021 Jan 20]. Available from: https://www.research.manchester.ac.uk/portal/en/theses/structured-matrix-nearness-problemstheory-and-algorithms(554f944d-9a78-4b54-90c2-1ef06866c402).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.554165.

Council of Science Editors:

Borsdorf R. Structured matrix nearness problems : theory and algorithms. [Doctoral Dissertation]. University of Manchester; 2012. Available from: https://www.research.manchester.ac.uk/portal/en/theses/structured-matrix-nearness-problemstheory-and-algorithms(554f944d-9a78-4b54-90c2-1ef06866c402).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.554165

University of Michigan

22.
Nayar, Himanshu.
Application of Random Matrix Theory to Multimodal Fusion.

Degree: PhD, Electrical Engineering: Systems, 2017, University of Michigan

URL: http://hdl.handle.net/2027.42/140962

► Multimodal data fusion is an interesting problem and its applications can be seen in image processing, signal processing and machine learning. In applications where we…
(more)

Subjects/Keywords: Data driven fusion; Random Matrix Theory; Factor analysis; Low rank decomposition; Clique recovery; Electrical Engineering; Engineering


APA (6th Edition):

Nayar, H. (2017). Application of Random Matrix Theory to Multimodal Fusion. (Doctoral Dissertation). University of Michigan. Retrieved from http://hdl.handle.net/2027.42/140962

Chicago Manual of Style (16th Edition):

Nayar, Himanshu. “Application of Random Matrix Theory to Multimodal Fusion.” 2017. Doctoral Dissertation, University of Michigan. Accessed January 20, 2021. http://hdl.handle.net/2027.42/140962.

MLA Handbook (7th Edition):

Nayar, Himanshu. “Application of Random Matrix Theory to Multimodal Fusion.” 2017. Web. 20 Jan 2021.

Vancouver:

Nayar H. Application of Random Matrix Theory to Multimodal Fusion. [Internet] [Doctoral dissertation]. University of Michigan; 2017. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/2027.42/140962.

Council of Science Editors:

Nayar H. Application of Random Matrix Theory to Multimodal Fusion. [Doctoral Dissertation]. University of Michigan; 2017. Available from: http://hdl.handle.net/2027.42/140962

Wayne State University

23.
Wang, Lijun.
Complex data analytics via sparse, low-rank matrix approximation.

Degree: PhD, Computer Science, 2012, Wayne State University

URL: https://digitalcommons.wayne.edu/oa_dissertations/583

► Today, digital data is accumulated at a faster-than-ever pace in science, engineering, biomedicine, and real-world sensing. Data mining provides us with an effective…
(more)

Subjects/Keywords: abnormal event detection, Big data analytics, clustering, evolutionary clustering, large-scale data analysis, low-rank matrix approximation; Computer Sciences


APA (6th Edition):

Wang, L. (2012). Complex data analytics via sparse, low-rank matrix approximation. (Doctoral Dissertation). Wayne State University. Retrieved from https://digitalcommons.wayne.edu/oa_dissertations/583

Chicago Manual of Style (16th Edition):

Wang, Lijun. “Complex data analytics via sparse, low-rank matrix approximation.” 2012. Doctoral Dissertation, Wayne State University. Accessed January 20, 2021. https://digitalcommons.wayne.edu/oa_dissertations/583.

MLA Handbook (7th Edition):

Wang, Lijun. “Complex data analytics via sparse, low-rank matrix approximation.” 2012. Web. 20 Jan 2021.

Vancouver:

Wang L. Complex data analytics via sparse, low-rank matrix approximation. [Internet] [Doctoral dissertation]. Wayne State University; 2012. [cited 2021 Jan 20]. Available from: https://digitalcommons.wayne.edu/oa_dissertations/583.

Council of Science Editors:

Wang L. Complex data analytics via sparse, low-rank matrix approximation. [Doctoral Dissertation]. Wayne State University; 2012. Available from: https://digitalcommons.wayne.edu/oa_dissertations/583

Northeastern University

24. Shao, Ming. Efficient transfer feature learning and its applications on social media.

Degree: PhD, Department of Electrical and Computer Engineering, 2016, Northeastern University

URL: http://hdl.handle.net/2047/D20213068

► In the era of social media, more and more social characteristics are conveyed by multimedia, i.e., images, videos, audios, and webpages with rich media information.…
(more)

Subjects/Keywords: domain adaptation; kinship verification; low-rank matrix analysis; transfer learning; Computer vision; Machine learning; Social media; Biometric identification; Mathematical models; Matrices


APA (6th Edition):

Shao, M. (2016). Efficient transfer feature learning and its applications on social media. (Doctoral Dissertation). Northeastern University. Retrieved from http://hdl.handle.net/2047/D20213068

Chicago Manual of Style (16th Edition):

Shao, Ming. “Efficient transfer feature learning and its applications on social media.” 2016. Doctoral Dissertation, Northeastern University. Accessed January 20, 2021. http://hdl.handle.net/2047/D20213068.

MLA Handbook (7th Edition):

Shao, Ming. “Efficient transfer feature learning and its applications on social media.” 2016. Web. 20 Jan 2021.

Vancouver:

Shao M. Efficient transfer feature learning and its applications on social media. [Internet] [Doctoral dissertation]. Northeastern University; 2016. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/2047/D20213068.

Council of Science Editors:

Shao M. Efficient transfer feature learning and its applications on social media. [Doctoral Dissertation]. Northeastern University; 2016. Available from: http://hdl.handle.net/2047/D20213068

University of Illinois – Chicago

25. Bhoi, Amlaan. Invariant Kernels for Few-shot Learning.

Degree: 2019, University of Illinois – Chicago

URL: http://hdl.handle.net/10027/23714

► Recent advances in few-shot learning algorithms focus on the development of meta-learning or improvements in distance-based algorithms. However, the majority of these approaches do not…
(more)

Subjects/Keywords: few-shot learning; invariance learning; adversarial learning; low-rank matrix approximation; image classification; deep learning; kernel learning


APA (6th Edition):

Bhoi, A. (2019). Invariant Kernels for Few-shot Learning. (Thesis). University of Illinois – Chicago. Retrieved from http://hdl.handle.net/10027/23714

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bhoi, Amlaan. “Invariant Kernels for Few-shot Learning.” 2019. Thesis, University of Illinois – Chicago. Accessed January 20, 2021. http://hdl.handle.net/10027/23714.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bhoi, Amlaan. “Invariant Kernels for Few-shot Learning.” 2019. Web. 20 Jan 2021.

Vancouver:

Bhoi A. Invariant Kernels for Few-shot Learning. [Internet] [Thesis]. University of Illinois – Chicago; 2019. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/10027/23714.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bhoi A. Invariant Kernels for Few-shot Learning. [Thesis]. University of Illinois – Chicago; 2019. Available from: http://hdl.handle.net/10027/23714

Not specified: Masters Thesis or Doctoral Dissertation

University of Iowa

26.
Wang, Tianming.
Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion.

Degree: PhD, Applied Mathematical and Computational Sciences, 2018, University of Iowa

URL: https://ir.uiowa.edu/etd/6331

► Spectrally sparse signals arise in many applications of signal processing. A spectrally sparse signal is a mixture of a few undamped or damped complex…
(more)

Subjects/Keywords: low-rank Hankel matrix completion; NMR spectroscopy; projected gradient descent; Riemannian optimization; spectrally sparse signals; Applied Mathematics


APA (6th Edition):

Wang, T. (2018). Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion. (Doctoral Dissertation). University of Iowa. Retrieved from https://ir.uiowa.edu/etd/6331

Chicago Manual of Style (16th Edition):

Wang, Tianming. “Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion.” 2018. Doctoral Dissertation, University of Iowa. Accessed January 20, 2021. https://ir.uiowa.edu/etd/6331.

MLA Handbook (7th Edition):

Wang, Tianming. “Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion.” 2018. Web. 20 Jan 2021.

Vancouver:

Wang T. Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion. [Internet] [Doctoral dissertation]. University of Iowa; 2018. [cited 2021 Jan 20]. Available from: https://ir.uiowa.edu/etd/6331.

Council of Science Editors:

Wang T. Non-convex methods for spectrally sparse signal reconstruction via low-rank Hankel matrix completion. [Doctoral Dissertation]. University of Iowa; 2018. Available from: https://ir.uiowa.edu/etd/6331

Rice University

27. Darvish Rouhani, Bita. A Resource-Aware Streaming-based Framework for Big Data Analysis.

Degree: MS, Engineering, 2015, Rice University

URL: http://hdl.handle.net/1911/87764

► The ever growing body of digital data is challenging conventional analytical techniques in machine learning, computer vision, and signal processing. Traditional analytical methods have been…
(more)

Subjects/Keywords: Streaming model; Big data; Dense matrix; Low-rank approximation; HW/SW co-design; Deep Learning; Scalable machine learning


APA (6th Edition):

Darvish Rouhani, B. (2015). A Resource-Aware Streaming-based Framework for Big Data Analysis. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/87764

Chicago Manual of Style (16th Edition):

Darvish Rouhani, Bita. “A Resource-Aware Streaming-based Framework for Big Data Analysis.” 2015. Masters Thesis, Rice University. Accessed January 20, 2021. http://hdl.handle.net/1911/87764.

MLA Handbook (7th Edition):

Darvish Rouhani, Bita. “A Resource-Aware Streaming-based Framework for Big Data Analysis.” 2015. Web. 20 Jan 2021.

Vancouver:

Darvish Rouhani B. A Resource-Aware Streaming-based Framework for Big Data Analysis. [Internet] [Masters thesis]. Rice University; 2015. [cited 2021 Jan 20]. Available from: http://hdl.handle.net/1911/87764.

Council of Science Editors:

Darvish Rouhani B. A Resource-Aware Streaming-based Framework for Big Data Analysis. [Masters Thesis]. Rice University; 2015. Available from: http://hdl.handle.net/1911/87764

28.
Vinyes, Marina.
Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques.

Degree: Docteur es, Signal, Image, Automatique, 2018, Université Paris-Est

URL: http://www.theses.fr/2018PESC1130

► In machine learning, the aim is to learn from data a model that is capable of making predictions on new data… (more)

Subjects/Keywords: Normes atomiques; Optimisation convexe; Parcimonie matricielle; Rang faible; Parcimonie; Atomic norms; Convex optimisation; Matrix sparsity; Low rank; Sparsity


APA (6th Edition):

Vinyes, M. (2018). Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques. (Doctoral Dissertation). Université Paris-Est. Retrieved from http://www.theses.fr/2018PESC1130

Chicago Manual of Style (16th Edition):

Vinyes, Marina. “Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques.” 2018. Doctoral Dissertation, Université Paris-Est. Accessed January 20, 2021. http://www.theses.fr/2018PESC1130.

MLA Handbook (7th Edition):

Vinyes, Marina. “Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques.” 2018. Web. 20 Jan 2021.

Vancouver:

Vinyes M. Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques. [Internet] [Doctoral dissertation]. Université Paris-Est; 2018. [cited 2021 Jan 20]. Available from: http://www.theses.fr/2018PESC1130.

Council of Science Editors:

Vinyes M. Convex matrix sparsity for demixing with an application to graphical model structure estimation : Parcimonie matricielle convexe pour les problèmes de démixage avec une application à l'apprentissage de structure de modèles graphiques. [Doctoral Dissertation]. Université Paris-Est; 2018. Available from: http://www.theses.fr/2018PESC1130

University of Lund

29.
Grussler, Christian.
Rank Reduction with Convex Constraints.

Degree: 2017, University of Lund

URL: https://lup.lub.lu.se/record/54cb814f-59fe-4bc9-a7ef-773cbcf06889 ; https://portal.research.lu.se/ws/files/19595129/Thesis.pdf

► This thesis addresses problems which require low-rank solutions under convex constraints. In particular, the focus lies on model reduction of positive systems, as well as…
(more)

Subjects/Keywords: Control Engineering; low-rank approximation; model reduction; non-convex optimization; Douglas-Rachford; matrix completion; overlapping norm; k-support norm; atomic norm


APA (6th Edition):

Grussler, C. (2017). Rank Reduction with Convex Constraints. (Doctoral Dissertation). University of Lund. Retrieved from https://lup.lub.lu.se/record/54cb814f-59fe-4bc9-a7ef-773cbcf06889 ; https://portal.research.lu.se/ws/files/19595129/Thesis.pdf

Chicago Manual of Style (16th Edition):

Grussler, Christian. “Rank Reduction with Convex Constraints.” 2017. Doctoral Dissertation, University of Lund. Accessed January 20, 2021. https://lup.lub.lu.se/record/54cb814f-59fe-4bc9-a7ef-773cbcf06889 ; https://portal.research.lu.se/ws/files/19595129/Thesis.pdf.

MLA Handbook (7th Edition):

Grussler, Christian. “Rank Reduction with Convex Constraints.” 2017. Web. 20 Jan 2021.

Vancouver:

Grussler C. Rank Reduction with Convex Constraints. [Internet] [Doctoral dissertation]. University of Lund; 2017. [cited 2021 Jan 20]. Available from: https://lup.lub.lu.se/record/54cb814f-59fe-4bc9-a7ef-773cbcf06889 ; https://portal.research.lu.se/ws/files/19595129/Thesis.pdf.

Council of Science Editors:

Grussler C. Rank Reduction with Convex Constraints. [Doctoral Dissertation]. University of Lund; 2017. Available from: https://lup.lub.lu.se/record/54cb814f-59fe-4bc9-a7ef-773cbcf06889 ; https://portal.research.lu.se/ws/files/19595129/Thesis.pdf

University of Lund

30.
Jiang, Fangyuan.
Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision.

Degree: 2015, University of Lund

URL: https://lup.lub.lu.se/record/5368358 ; https://portal.research.lu.se/ws/files/5379996/5368400.pdf

► This thesis is focused on geometric computer vision problems. The first part of the thesis aims at solving one fundamental problem, namely low-rank matrix factorization…
(more)

Subjects/Keywords: Mathematics; Computer Vision and Robotics (Autonomous Systems); Geometric Computer Vision; Low-rank Matrix Factorization; Relative Pose


APA (6th Edition):

Jiang, F. (2015). Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision. (Doctoral Dissertation). University of Lund. Retrieved from https://lup.lub.lu.se/record/5368358 ; https://portal.research.lu.se/ws/files/5379996/5368400.pdf

Chicago Manual of Style (16th Edition):

Jiang, Fangyuan. “Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision.” 2015. Doctoral Dissertation, University of Lund. Accessed January 20, 2021. https://lup.lub.lu.se/record/5368358 ; https://portal.research.lu.se/ws/files/5379996/5368400.pdf.

MLA Handbook (7th Edition):

Jiang, Fangyuan. “Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision.” 2015. Web. 20 Jan 2021.

Vancouver:

Jiang F. Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision. [Internet] [Doctoral dissertation]. University of Lund; 2015. [cited 2021 Jan 20]. Available from: https://lup.lub.lu.se/record/5368358 ; https://portal.research.lu.se/ws/files/5379996/5368400.pdf.

Council of Science Editors:

Jiang F. Low Rank Matrix Factorization and Relative Pose Problems in Computer Vision. [Doctoral Dissertation]. University of Lund; 2015. Available from: https://lup.lub.lu.se/record/5368358 ; https://portal.research.lu.se/ws/files/5379996/5368400.pdf