
You searched for `subject:(dimension reduction)`. Showing records 1 – 30 of 162 total matches.

Search Limiters

Dates

- 2013 – 2017 (98)
- 2008 – 2012 (56)
- 2003 – 2007 (11)

Degrees

- PhD (61)
- Docteur es (25)
- MS (12)


Oregon State University

1. Thangavelu, Madan Kumar. On error bounds for linear feature extraction.

Degree: MS, Computer Science, 2010, Oregon State University

URL: http://hdl.handle.net/1957/13886

Linear transformation for dimension reduction is a well established problem in the field of machine learning. Due to the numerous observability of parameters and data,…

Subjects/Keywords: Dimension reduction; Dimension reduction (Statistics)

APA (6th Edition):

Thangavelu, M. K. (2010). On error bounds for linear feature extraction. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/13886

Chicago Manual of Style (16th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Masters Thesis, Oregon State University. Accessed December 12, 2017. http://hdl.handle.net/1957/13886.

MLA Handbook (7th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Web. 12 Dec 2017.

Vancouver:

Thangavelu MK. On error bounds for linear feature extraction. [Internet] [Masters thesis]. Oregon State University; 2010. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/1957/13886.

Council of Science Editors:

Thangavelu MK. On error bounds for linear feature extraction. [Masters Thesis]. Oregon State University; 2010. Available from: http://hdl.handle.net/1957/13886

University of Georgia

2. Wang, Qin. Sufficient dimension reduction and sufficient variable selection.

Degree: PhD, Statistics, 2009, University of Georgia

URL: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

The development in theory and methodology for sufficient dimension reduction has provided a powerful tool to tackle the challenging problem of high dimensional data analysis.…

Subjects/Keywords: sufficient dimension reduction

APA (6th Edition):

Wang, Q. (2009). Sufficient dimension reduction and sufficient variable selection. (Doctoral Dissertation). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

Chicago Manual of Style (16th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Doctoral Dissertation, University of Georgia. Accessed December 12, 2017. http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

MLA Handbook (7th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Web. 12 Dec 2017.

Vancouver:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Internet] [Doctoral dissertation]. University of Georgia; 2009. [cited 2017 Dec 12]. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

Council of Science Editors:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Doctoral Dissertation]. University of Georgia; 2009. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

University of Georgia

3. Ling, Yangrong. Statistical dimension reduction methods for appearance-based face recognition.

Degree: MS, Computer Science, 2003, University of Georgia

URL: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Two novel moment-based methods which are insensitive to large variation in lighting direction and facial expression are developed for appearance-based face recognition using dimension reduction…

Subjects/Keywords: Dimension-reduction

APA (6th Edition):

Ling, Y. (2003). Statistical dimension reduction methods for appearance-based face recognition. (Masters Thesis). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Chicago Manual of Style (16th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Masters Thesis, University of Georgia. Accessed December 12, 2017. http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

MLA Handbook (7th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Web. 12 Dec 2017.

Vancouver:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Internet] [Masters thesis]. University of Georgia; 2003. [cited 2017 Dec 12]. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

Council of Science Editors:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Masters Thesis]. University of Georgia; 2003. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Baylor University

4. Young, Phil D. Topics in dimension reduction and missing data in statistical discrimination.

Degree: Statistical Sciences, 2010, Baylor University

URL: http://hdl.handle.net/2104/5543

This dissertation is comprised of four chapters. In the first chapter, we define the concept of linear dimension reduction, review some popular linear dimension reduction…

Subjects/Keywords: Dimension reduction.; Statistical discrimination.

APA (6th Edition):

Young, P. D. (2010). Topics in dimension reduction and missing data in statistical discrimination. (Thesis). Baylor University. Retrieved from http://hdl.handle.net/2104/5543

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Thesis, Baylor University. Accessed December 12, 2017. http://hdl.handle.net/2104/5543.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Web. 12 Dec 2017.

Vancouver:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Internet] [Thesis]. Baylor University; 2010. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/2104/5543.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Thesis]. Baylor University; 2010. Available from: http://hdl.handle.net/2104/5543

Not specified: Masters Thesis or Doctoral Dissertation

Clemson University

5. Knoll, Fiona. Johnson-Lindenstrauss Transformations.

Degree: PhD, Mathematical Sciences, 2017, Clemson University

URL: http://tigerprints.clemson.edu/all_dissertations/1977

With the quick progression of technology and the increasing need to process large data, there has been an increased interest in data-dependent and data-independent dimension…

Subjects/Keywords: Data; Dimension Reduction; Johnson-Lindenstrauss

APA (6th Edition):

Knoll, F. (2017). Johnson-Lindenstrauss Transformations. (Doctoral Dissertation). Clemson University. Retrieved from http://tigerprints.clemson.edu/all_dissertations/1977

Chicago Manual of Style (16th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Doctoral Dissertation, Clemson University. Accessed December 12, 2017. http://tigerprints.clemson.edu/all_dissertations/1977.

MLA Handbook (7th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Web. 12 Dec 2017.

Vancouver:

Knoll F. Johnson-Lindenstrauss Transformations. [Internet] [Doctoral dissertation]. Clemson University; 2017. [cited 2017 Dec 12]. Available from: http://tigerprints.clemson.edu/all_dissertations/1977.

Council of Science Editors:

Knoll F. Johnson-Lindenstrauss Transformations. [Doctoral Dissertation]. Clemson University; 2017. Available from: http://tigerprints.clemson.edu/all_dissertations/1977
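Record 5 above only names its topic. As a quick, hypothetical illustration of the Johnson-Lindenstrauss idea (a generic sketch, not code from the dissertation), a dense Gaussian random projection approximately preserves pairwise distances:

```python
import numpy as np

# Data-independent random projection in the Johnson-Lindenstrauss spirit.
# Entries of P are drawn N(0, 1/k) so that E||xP||^2 = ||x||^2; with
# k = O(log n / eps^2), all pairwise distances survive up to a 1 ± eps factor.
rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300
X = rng.normal(size=(n, d))            # n points in dimension d

P = rng.normal(scale=1.0 / np.sqrt(k), size=(d, k))
Y = X @ P                              # the same points in dimension k

# Compare one pairwise distance before and after projection.
before = np.linalg.norm(X[0] - X[1])
after = np.linalg.norm(Y[0] - Y[1])
ratio = after / before                 # concentrates near 1
```

The projection matrix never looks at the data, which is exactly the "data-independent" family the abstract contrasts with data-dependent methods such as PCA.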

University of Johannesburg

6. Coulter, Duncan Anthony. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.

Degree: 2014, University of Johannesburg

URL: http://hdl.handle.net/10210/12341

The development of software systems is a relatively recent field of human endeavour. Even so, it has followed a steady progression of dominant paradigms which…

Subjects/Keywords: Dimension reduction (Statistics); Multiagent systems

APA (6th Edition):

Coulter, D. A. (2014). Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/12341

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Thesis, University of Johannesburg. Accessed December 12, 2017. http://hdl.handle.net/10210/12341.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Web. 12 Dec 2017.

Vancouver:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Internet] [Thesis]. University of Johannesburg; 2014. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10210/12341.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Thesis]. University of Johannesburg; 2014. Available from: http://hdl.handle.net/10210/12341

Not specified: Masters Thesis or Doctoral Dissertation

Penn State University

7. Wang, Yu. Nonlinear Dimension Reduction in Feature Space.

Degree: PhD, Statistics, 2008, Penn State University

URL: https://etda.libraries.psu.edu/catalog/8637

In this thesis I introduce an idea for applying dimension reduction methods to feature spaces. Three main methods will be used to estimate dimension reduction…

Subjects/Keywords: Dimension Reduction; Feature Space

APA (6th Edition):

Wang, Y. (2008). Nonlinear Dimension Reduction in Feature Space. (Doctoral Dissertation). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/8637

Chicago Manual of Style (16th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Doctoral Dissertation, Penn State University. Accessed December 12, 2017. https://etda.libraries.psu.edu/catalog/8637.

MLA Handbook (7th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Web. 12 Dec 2017.

Vancouver:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Internet] [Doctoral dissertation]. Penn State University; 2008. [cited 2017 Dec 12]. Available from: https://etda.libraries.psu.edu/catalog/8637.

Council of Science Editors:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Doctoral Dissertation]. Penn State University; 2008. Available from: https://etda.libraries.psu.edu/catalog/8637

University of Waterloo

8. Liu, Kai. Effective Dimensionality Control in Quantitative Finance and Insurance.

Degree: 2017, University of Waterloo

URL: http://hdl.handle.net/10012/12324

It is well-known that dimension reduction techniques such as the Brownian bridge, principal component analysis, linear transformation could increase the efficiency of Quasi-Monte Carlo (QMC)…

Subjects/Keywords: QMC; Dimension Reduction; Effective Dimension; Effective Portfolio; Effective Portfolio Dimension

APA (6th Edition):

Liu, K. (2017). Effective Dimensionality Control in Quantitative Finance and Insurance. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12324

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Thesis, University of Waterloo. Accessed December 12, 2017. http://hdl.handle.net/10012/12324.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Web. 12 Dec 2017.

Vancouver:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10012/12324.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12324

Not specified: Masters Thesis or Doctoral Dissertation
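Record 8's snippet mentions the Brownian bridge and principal component constructions that make QMC effective. As a hypothetical sketch (generic, not the thesis's code), the PCA construction works because the covariance of a discretized Brownian path, C[i, j] = min(t_i, t_j), has one dominant eigenvalue, so most of the path's variance can be driven by the first few low-discrepancy coordinates:

```python
import numpy as np

# PCA path construction for quasi-Monte Carlo: eigen-decompose the
# covariance of Brownian motion sampled at m equally spaced times.
m = 64                                 # number of time steps
t = np.arange(1, m + 1) / m            # times t_i on (0, 1]
C = np.minimum.outer(t, t)             # Brownian covariance: min(t_i, t_j)

eigvals = np.linalg.eigvalsh(C)[::-1]  # eigenvalues, sorted descending
explained = eigvals[0] / eigvals.sum()
# The leading eigenvalue carries roughly 80% of the total variance,
# which is why the first QMC coordinates do most of the work.
```

The Brownian bridge construction exploits the same concentration without an eigen-decomposition, by generating the path's midpoints hierarchically.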

Temple University

9. Yang, Chaozheng. Sufficient Dimension Reduction in Complex Datasets.

Degree: PhD, 2016, Temple University

URL: http://digital.library.temple.edu/u?/p245801coll10,404627

This dissertation focuses on two problems in dimension reduction. One is using permutation approach to test predictor contribution. The permutation approach applies to marginal…

Subjects/Keywords: Statistics;

APA (6th Edition):

Yang, C. (2016). Sufficient Dimension Reduction in Complex Datasets. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,404627

Chicago Manual of Style (16th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Doctoral Dissertation, Temple University. Accessed December 12, 2017. http://digital.library.temple.edu/u?/p245801coll10,404627.

MLA Handbook (7th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Web. 12 Dec 2017.

Vancouver:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Internet] [Doctoral dissertation]. Temple University; 2016. [cited 2017 Dec 12]. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627.

Council of Science Editors:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Doctoral Dissertation]. Temple University; 2016. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627

University of Minnesota

10. Chen, Xin. Sufficient dimension reduction and variable selection.

Degree: PhD, Statistics, 2010, University of Minnesota

URL: http://purl.umn.edu/99484

Sufficient dimension reduction (SDR) in regression was first introduced by Cook (2004). It reduces the dimension of the predictor space without loss of information and…

Subjects/Keywords: Central subspace; Dimension reduction; Regression; Sufficient dimension reduction; Variable selection; Statistics

APA (6th Edition):

Chen, X. (2010). Sufficient dimension reduction and variable selection. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/99484

Chicago Manual of Style (16th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Doctoral Dissertation, University of Minnesota. Accessed December 12, 2017. http://purl.umn.edu/99484.

MLA Handbook (7th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Web. 12 Dec 2017.

Vancouver:

Chen X. Sufficient dimension reduction and variable selection. [Internet] [Doctoral dissertation]. University of Minnesota; 2010. [cited 2017 Dec 12]. Available from: http://purl.umn.edu/99484.

Council of Science Editors:

Chen X. Sufficient dimension reduction and variable selection. [Doctoral Dissertation]. University of Minnesota; 2010. Available from: http://purl.umn.edu/99484
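Records 2 and 10 both concern sufficient dimension reduction. A minimal NumPy sketch of one classical SDR estimator, sliced inverse regression (SIR), is shown below; it is a generic illustration under assumed synthetic data, not code from either thesis:

```python
import numpy as np

# Sliced inverse regression: slice the response y, average the
# standardized predictors within each slice, and take the leading
# eigenvectors of the covariance of those slice means as the estimated
# directions beta with y independent of X given beta^T X.
rng = np.random.default_rng(2)
n, p = 2000, 6
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[0] = 1.0                              # true 1-dimensional structure
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=n)

Z = (X - X.mean(0)) / X.std(0)             # standardize predictors
order = np.argsort(y)
slices = np.array_split(order, 10)         # 10 equal-count slices of y
means = np.stack([Z[idx].mean(axis=0) for idx in slices])

weights = np.array([len(idx) for idx in slices]) / n
M = (means * weights[:, None]).T @ means   # weighted covariance of means
_, vecs = np.linalg.eigh(M)
direction = vecs[:, -1]                    # top eigenvector estimates beta
```

Even though the link y = (beta^T x)^3 is nonlinear, the recovered direction aligns with the true beta (up to sign), which is the point of "sufficiency": no model for the link is needed.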

University of Technology, Sydney

11. Bian, Wei. Supervised linear dimension reduction.

Degree: 2012, University of Technology, Sydney

URL: http://hdl.handle.net/10453/20422

Supervised linear dimension reduction (SLDR) is one of the most effective methods for complexity reduction, which has been widely applied in pattern recognition, computer vision,…

Subjects/Keywords: Pattern recognition.; Dimension reduction.; Statistics.; Mathematics.

APA (6th Edition):

Bian, W. (2012). Supervised linear dimension reduction. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/20422

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Thesis, University of Technology, Sydney. Accessed December 12, 2017. http://hdl.handle.net/10453/20422.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Web. 12 Dec 2017.

Vancouver:

Bian W. Supervised linear dimension reduction. [Internet] [Thesis]. University of Technology, Sydney; 2012. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10453/20422.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bian W. Supervised linear dimension reduction. [Thesis]. University of Technology, Sydney; 2012. Available from: http://hdl.handle.net/10453/20422

Not specified: Masters Thesis or Doctoral Dissertation

Cornell University

12. Chen, Maximillian. Dimension Reduction And Inferential Procedures For Images.

Degree: 2014, Cornell University

URL: http://hdl.handle.net/1813/37105

High-dimensional data analysis has been a prominent topic of statistical research in recent years due to the growing presence of high-dimensional electronic data. Much of…

Subjects/Keywords: imaging data; dimension reduction; hypothesis testing

APA (6th Edition):

Chen, M. (2014). Dimension Reduction And Inferential Procedures For Images. (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/37105

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Thesis, Cornell University. Accessed December 12, 2017. http://hdl.handle.net/1813/37105.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Web. 12 Dec 2017.

Vancouver:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Internet] [Thesis]. Cornell University; 2014. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/1813/37105.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Thesis]. Cornell University; 2014. Available from: http://hdl.handle.net/1813/37105

Not specified: Masters Thesis or Doctoral Dissertation

Uppsala University

13. Li, Qiongzhu. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.

Degree: Statistics, 2016, Uppsala University

URL: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

In this paper, we try to compare the performance of two feature dimension reduction methods, the LASSO and PCA. Both simulation study and empirical…

Subjects/Keywords: Machine learning; Feature Dimension Reduction; NPL

APA (6th Edition):

Li, Q. (2016). Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. (Thesis). Uppsala University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Thesis, Uppsala University. Accessed December 12, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Web. 12 Dec 2017.

Vancouver:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Internet] [Thesis]. Uppsala University; 2016. [cited 2017 Dec 12]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Thesis]. Uppsala University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Not specified: Masters Thesis or Doctoral Dissertation
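Record 13's abstract compares the LASSO and PCA as feature-reduction methods. The sketch below contrasts the two routes on assumed synthetic data (a generic illustration, not the paper's code): PCA is unsupervised and keeps high-variance directions regardless of the response, while the LASSO is supervised and zeroes out coefficients of irrelevant predictors.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]            # only 3 of 10 features matter
y = X @ beta_true + 0.1 * rng.normal(size=n)

# PCA route: project onto the top-3 right singular vectors of centered X.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T                           # data reduced to 3 components

# LASSO route: proximal gradient descent (ISTA) with soft-thresholding.
lam = 0.1
step = n / np.linalg.norm(X, 2) ** 2        # 1 / Lipschitz constant of grad
beta = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ beta - y) / n
    v = beta - step * grad
    beta = np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)

selected = np.flatnonzero(np.abs(beta) > 1e-3)  # features the LASSO keeps
```

On sparse ground truth like this, the LASSO tends to recover exactly the informative indices, whereas each PCA component mixes all ten coordinates; which route wins on real credit data is precisely the paper's question.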

Linnaeus University

14. Sun, Xuebo. An Application of Dimension Reduction for Intention Groups in Reddit.

Degree: Computer Science, 2016, Linnaeus University

URL: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

Reddit (www.reddit.com) is a social news platform for information sharing and exchanging. The amount of data, in terms of both observations and dimensions is…

Subjects/Keywords: Reddit; communication model; dimension reduction; similarity metric

APA (6th Edition):

Sun, X. (2016). An Application of Dimension Reduction for Intention Groups in Reddit. (Thesis). Linnaeus University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sun, Xuebo. “An Application of Dimension Reduction for Intention Groups in Reddit.” 2016. Thesis, Linnaeus University. Accessed December 12, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sun, Xuebo. “An Application of Dimension Reduction for Intention Groups in Reddit.” 2016. Web. 12 Dec 2017.

Vancouver:

Sun X. An Application of Dimension Reduction for Intention Groups in Reddit. [Internet] [Thesis]. Linnaeus University; 2016. [cited 2017 Dec 12]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sun X. An Application of Dimension Reduction for Intention Groups in Reddit. [Thesis]. Linnaeus University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

Not specified: Masters Thesis or Doctoral Dissertation

University of Minnesota

15. Adragni, Kofi Placid. Dimension reduction and prediction in large p regressions.

Degree: PhD, Statistics, 2009, University of Minnesota

URL: http://purl.umn.edu/51904

A high dimensional regression setting is considered with p predictors X = (X1, ..., Xp)^T and a response Y. The interest is with large p, possibly much larger than…

Subjects/Keywords: Dimension Reduction; Prediction; Principal Components; Principal Fitted Components; Regression; Sufficient Dimension Reduction; Statistics

APA (6th Edition):

Adragni, K. P. (2009). Dimension reduction and prediction in large p regressions. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/51904

Chicago Manual of Style (16th Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Doctoral Dissertation, University of Minnesota. Accessed December 12, 2017. http://purl.umn.edu/51904.

MLA Handbook (7th Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Web. 12 Dec 2017.

Vancouver:

Adragni KP. Dimension reduction and prediction in large p regressions. [Internet] [Doctoral dissertation]. University of Minnesota; 2009. [cited 2017 Dec 12]. Available from: http://purl.umn.edu/51904.

Council of Science Editors:

Adragni KP. Dimension reduction and prediction in large p regressions. [Doctoral Dissertation]. University of Minnesota; 2009. Available from: http://purl.umn.edu/51904

16. Hoyos-Idrobo, Andrés. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.

Degree: Docteur es, Informatique, 2017, Paris Saclay

URL: http://www.theses.fr/2017SACLS029

In medical imaging, international collaborations have launched the acquisition of hundreds of terabytes of data, and in particular of functional Magnetic Resonance Imaging (fMRI) data…

Subjects/Keywords: IRMf; Clustering; Reduction de dimension; Décodage; FMRI; Clustering; Dimensionality reduction; Decoding

APA (6th Edition):

Hoyos-Idrobo, A. (2017). Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2017SACLS029

Chicago Manual of Style (16th Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Doctoral Dissertation, Paris Saclay. Accessed December 12, 2017. http://www.theses.fr/2017SACLS029.

MLA Handbook (7th Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Web. 12 Dec 2017.

Vancouver:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Internet] [Doctoral dissertation]. Paris Saclay; 2017. [cited 2017 Dec 12]. Available from: http://www.theses.fr/2017SACLS029.

Council of Science Editors:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Doctoral Dissertation]. Paris Saclay; 2017. Available from: http://www.theses.fr/2017SACLS029

Clemson University

17. Wilson, Matthew Robert. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.

Degree: MS, Computer Engineering, 2016, Clemson University

URL: http://tigerprints.clemson.edu/all_theses/2357

Reducing the input dimensionality of large datasets for subsequent processing will allow the process to become less computationally complex and expensive. This thesis tests if…

Subjects/Keywords: Dimension reduction; feature reduction; feature selection; input reduction; Karnin Sensitivity; Principal Component Analysis

APA (6th Edition):

Wilson, M. R. (2016). Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. (Masters Thesis). Clemson University. Retrieved from http://tigerprints.clemson.edu/all_theses/2357

Chicago Manual of Style (16th Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Masters Thesis, Clemson University. Accessed December 12, 2017. http://tigerprints.clemson.edu/all_theses/2357.

MLA Handbook (7th Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Web. 12 Dec 2017.

Vancouver:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Internet] [Masters thesis]. Clemson University; 2016. [cited 2017 Dec 12]. Available from: http://tigerprints.clemson.edu/all_theses/2357.

Council of Science Editors:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Masters Thesis]. Clemson University; 2016. Available from: http://tigerprints.clemson.edu/all_theses/2357

University of Waterloo

18. Liu, Kai. Directional Control of Generating Brownian Path under Quasi Monte Carlo.

Degree: 2012, University of Waterloo

URL: http://hdl.handle.net/10012/6984

Quasi-Monte Carlo (QMC) methods are playing an increasingly important role in computational finance. This is attributed to the increased complexity of the derivative securities and…

Subjects/Keywords: QMC; Low Discrepancy Sequence; Effective Dimension; Dimension Reduction; PCA; BB; LT; OT; FOT; DC

APA (6th Edition):

Liu, K. (2012). Directional Control of Generating Brownian Path under Quasi Monte Carlo. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6984

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Thesis, University of Waterloo. Accessed December 12, 2017. http://hdl.handle.net/10012/6984.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Web. 12 Dec 2017.

Vancouver:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Internet] [Thesis]. University of Waterloo; 2012. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10012/6984.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Thesis]. University of Waterloo; 2012. Available from: http://hdl.handle.net/10012/6984

Not specified: Masters Thesis or Doctoral Dissertation

19.
Lu, Weizhi.
Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.

Degree: Docteur es, Traitement du signal et de l'image, 2014, Rennes, INSA

URL: http://www.theses.fr/2014ISAR0010

► This thesis studies and brings significant improvements to three widespread dimension reduction techniques: compressed sensing (or sparse sampling), random projection…
(more)

Subjects/Keywords: Réduction de dimension; Dimension reduction; Compressed sensing; Random projection; Sparse representation; 621.382

APA (6th Edition):

Lu, W. (2014). Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. (Doctoral Dissertation). Rennes, INSA. Retrieved from http://www.theses.fr/2014ISAR0010

Chicago Manual of Style (16th Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Doctoral Dissertation, Rennes, INSA. Accessed December 12, 2017. http://www.theses.fr/2014ISAR0010.

MLA Handbook (7th Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Web. 12 Dec 2017.

Vancouver:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Internet] [Doctoral dissertation]. Rennes, INSA; 2014. [cited 2017 Dec 12]. Available from: http://www.theses.fr/2014ISAR0010.

Council of Science Editors:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Doctoral Dissertation]. Rennes, INSA; 2014. Available from: http://www.theses.fr/2014ISAR0010

20.
Vu, Khac Ky.
Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.

Degree: Docteur es, Informatique, 2016, Paris Saclay

URL: http://www.theses.fr/2016SACLX031

► In the digital era, data has become cheap and easy to obtain. This gives rise to many new optimization problems with…
(more)

Subjects/Keywords: Réduction de dimension; Approximation; Optimisation; Algorithmes randomisés; Dimension reduction; Approximation; Optimization; Randomized algorithms

APA (6th Edition):

Vu, K. K. (2016). Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2016SACLX031

Chicago Manual of Style (16th Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Doctoral Dissertation, Paris Saclay. Accessed December 12, 2017. http://www.theses.fr/2016SACLX031.

MLA Handbook (7th Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Web. 12 Dec 2017.

Vancouver:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Internet] [Doctoral dissertation]. Paris Saclay; 2016. [cited 2017 Dec 12]. Available from: http://www.theses.fr/2016SACLX031.

Council of Science Editors:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Doctoral Dissertation]. Paris Saclay; 2016. Available from: http://www.theses.fr/2016SACLX031

Texas State University – San Marcos

21.
Reiss, Randolf H.
EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION.

Degree: 2013, Texas State University – San Marcos

URL: https://digital.library.txstate.edu/handle/10877/4696

► A basic theory of eigenvalues and eigenvectors as a means to reduce the dimension of data is presented. Iterative methods for finding eigenvalues and eigenvectors…
(more)

Subjects/Keywords: Eigenvector; Eigenvalue; Dimension Reduction; Power Method; Partial Least Squares; Eigenvalues; Eigenvectors; Data reduction

APA (6th Edition):

Reiss, R. H. (2013). EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION. (Thesis). Texas State University – San Marcos. Retrieved from https://digital.library.txstate.edu/handle/10877/4696

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Reiss, Randolf H. “EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION.” 2013. Thesis, Texas State University – San Marcos. Accessed December 12, 2017. https://digital.library.txstate.edu/handle/10877/4696.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Reiss, Randolf H. “EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION.” 2013. Web. 12 Dec 2017.

Vancouver:

Reiss RH. EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION. [Internet] [Thesis]. Texas State University – San Marcos; 2013. [cited 2017 Dec 12]. Available from: https://digital.library.txstate.edu/handle/10877/4696.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Reiss RH. EIGENVALUES AND EIGENVECTORS IN DATA DIMENSION REDUCTION FOR REGRESSION. [Thesis]. Texas State University – San Marcos; 2013. Available from: https://digital.library.txstate.edu/handle/10877/4696

Not specified: Masters Thesis or Doctoral Dissertation

University of Oregon

22. Huck, Kevin A., 1972-. Knowledge support for parallel performance data mining.

Degree: 2009, University of Oregon

URL: http://hdl.handle.net/1794/10087

► Parallel applications running on high-end computer systems manifest a complex combination of performance phenomena, such as communication patterns, work distributions, and computational inefficiencies. Current performance…
(more)

Subjects/Keywords: Parallel performance; Data mining; Dimension reduction; Clustering; Computer science

APA (6th Edition):

Huck, Kevin A., 1972-. (2009). Knowledge support for parallel performance data mining. (Thesis). University of Oregon. Retrieved from http://hdl.handle.net/1794/10087

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Huck, Kevin A., 1972-. “Knowledge support for parallel performance data mining.” 2009. Thesis, University of Oregon. Accessed December 12, 2017. http://hdl.handle.net/1794/10087.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Huck, Kevin A., 1972-. “Knowledge support for parallel performance data mining.” 2009. Web. 12 Dec 2017.

Vancouver:

Huck, Kevin A., 1972-. Knowledge support for parallel performance data mining. [Internet] [Thesis]. University of Oregon; 2009. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/1794/10087.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Huck, Kevin A., 1972-. Knowledge support for parallel performance data mining. [Thesis]. University of Oregon; 2009. Available from: http://hdl.handle.net/1794/10087

Not specified: Masters Thesis or Doctoral Dissertation

University of Arizona

23. Wauters, John. Independence Screening in High-Dimensional Data.

Degree: 2016, University of Arizona

URL: http://hdl.handle.net/10150/623083

► High-dimensional data, data in which the number of dimensions exceeds the number of observations, is increasingly common in statistics. The term "ultra-high dimensional" is defined…
(more)

Subjects/Keywords: feature screening; high-dimensional data; independence screening; modeling; dimension reduction

APA (6th Edition):

Wauters, J. (2016). Independence Screening in High-Dimensional Data. (Masters Thesis). University of Arizona. Retrieved from http://hdl.handle.net/10150/623083

Chicago Manual of Style (16th Edition):

Wauters, John. “Independence Screening in High-Dimensional Data.” 2016. Masters Thesis, University of Arizona. Accessed December 12, 2017. http://hdl.handle.net/10150/623083.

MLA Handbook (7th Edition):

Wauters, John. “Independence Screening in High-Dimensional Data.” 2016. Web. 12 Dec 2017.

Vancouver:

Wauters J. Independence Screening in High-Dimensional Data. [Internet] [Masters thesis]. University of Arizona; 2016. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10150/623083.

Council of Science Editors:

Wauters J. Independence Screening in High-Dimensional Data. [Masters Thesis]. University of Arizona; 2016. Available from: http://hdl.handle.net/10150/623083

Rochester Institute of Technology

24. Johnson, Juan Emmanuel. Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images.

Degree: MS, School of Mathematical Sciences (COS), 2016, Rochester Institute of Technology

URL: http://scholarworks.rit.edu/theses/9324

► Multimodal remote sensing is an upcoming field as it allows for many views of the same region of interest. Domain adaptation attempts to fuse…
(more)

Subjects/Keywords: Computer vision; Data fusion; Dimension reduction; Image science; Remote sensing

APA (6th Edition):

Johnson, J. E. (2016). Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images. (Masters Thesis). Rochester Institute of Technology. Retrieved from http://scholarworks.rit.edu/theses/9324

Chicago Manual of Style (16th Edition):

Johnson, Juan Emmanuel. “Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images.” 2016. Masters Thesis, Rochester Institute of Technology. Accessed December 12, 2017. http://scholarworks.rit.edu/theses/9324.

MLA Handbook (7th Edition):

Johnson, Juan Emmanuel. “Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images.” 2016. Web. 12 Dec 2017.

Vancouver:

Johnson JE. Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images. [Internet] [Masters thesis]. Rochester Institute of Technology; 2016. [cited 2017 Dec 12]. Available from: http://scholarworks.rit.edu/theses/9324.

Council of Science Editors:

Johnson JE. Schroedinger Eigenmaps for Manifold Alignment of Multimodal Hyperspectral Images. [Masters Thesis]. Rochester Institute of Technology; 2016. Available from: http://scholarworks.rit.edu/theses/9324

University of Guelph

25.
Morris, Katherine.
Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions.

Degree: 2012, University of Guelph

URL: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863

► We introduce a dimension reduction method for model-based clustering obtained from a finite mixture of t-distributions. This approach is based on existing work on reducing…
(more)

Subjects/Keywords: mclust; tEIGEN; model-based clustering; dimension reduction; multivariate t-mixtures

APA (6th Edition):

Morris, K. (2012). Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions. (Thesis). University of Guelph. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Morris, Katherine. “Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions.” 2012. Thesis, University of Guelph. Accessed December 12, 2017. https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Morris, Katherine. “Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions.” 2012. Web. 12 Dec 2017.

Vancouver:

Morris K. Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions. [Internet] [Thesis]. University of Guelph; 2012. [cited 2017 Dec 12]. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Morris K. Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions. [Thesis]. University of Guelph; 2012. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863

Not specified: Masters Thesis or Doctoral Dissertation

Colorado State University

26.
Emerson, Tegan Halley.
A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing.

Degree: PhD, Mathematics, 2017, Colorado State University

URL: http://hdl.handle.net/10217/183941

► Geometric data analysis seeks to uncover and leverage structure in data for tasks in machine learning when data is visualized as points in some dimensional,…
(more)

Subjects/Keywords: dimension reduction; Grassmannian manifold; data mining; machine learning; geometric data analysis

APA (6th Edition):

Emerson, T. H. (2017). A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing. (Doctoral Dissertation). Colorado State University. Retrieved from http://hdl.handle.net/10217/183941

Chicago Manual of Style (16th Edition):

Emerson, Tegan Halley. “A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing.” 2017. Doctoral Dissertation, Colorado State University. Accessed December 12, 2017. http://hdl.handle.net/10217/183941.

MLA Handbook (7th Edition):

Emerson, Tegan Halley. “A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing.” 2017. Web. 12 Dec 2017.

Vancouver:

Emerson TH. A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing. [Internet] [Doctoral dissertation]. Colorado State University; 2017. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10217/183941.

Council of Science Editors:

Emerson TH. A geometric data analysis approach to dimension reduction in machine learning and data mining in medical and biological sensing. [Doctoral Dissertation]. Colorado State University; 2017. Available from: http://hdl.handle.net/10217/183941

University of Cincinnati

27. Sun, Yan. Regularization for High-dimensional Time Series Models.

Degree: PhD, Arts and Sciences: Mathematical Sciences, 2011, University of Cincinnati

URL: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387

► Analyzing multivariate time series has been a very important topic in economics, finance, engineering, social and natural sciences. Compared to univariate models, the multivariate…
(more)

Subjects/Keywords: Statistics; conditional likelihood; dimension reduction; oracle property; sparse; stationary; time series

APA (6th Edition):

Sun, Y. (2011). Regularization for High-dimensional Time Series Models. (Doctoral Dissertation). University of Cincinnati. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387

Chicago Manual of Style (16th Edition):

Sun, Yan. “Regularization for High-dimensional Time Series Models.” 2011. Doctoral Dissertation, University of Cincinnati. Accessed December 12, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387.

MLA Handbook (7th Edition):

Sun, Yan. “Regularization for High-dimensional Time Series Models.” 2011. Web. 12 Dec 2017.

Vancouver:

Sun Y. Regularization for High-dimensional Time Series Models. [Internet] [Doctoral dissertation]. University of Cincinnati; 2011. [cited 2017 Dec 12]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387.

Council of Science Editors:

Sun Y. Regularization for High-dimensional Time Series Models. [Doctoral Dissertation]. University of Cincinnati; 2011. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387

University of Cincinnati

28. Zhou, Xuan. An Efficient Algorithm for Clustering Genomic Data.

Degree: MS, Engineering and Applied Science: Computer Science, 2014, University of Cincinnati

URL: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1418910389

► In this thesis, we investigated an efficient framework for clustering analysis of gene expression profiles by discretizing continuous genomic data and adopting the 1D-jury approach…
(more)

Subjects/Keywords: Computer Science; genomic data; clustering; discretization; 1D-Jury; dimension reduction

APA (6th Edition):

Zhou, X. (2014). An Efficient Algorithm for Clustering Genomic Data. (Masters Thesis). University of Cincinnati. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=ucin1418910389

Chicago Manual of Style (16th Edition):

Zhou, Xuan. “An Efficient Algorithm for Clustering Genomic Data.” 2014. Masters Thesis, University of Cincinnati. Accessed December 12, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1418910389.

MLA Handbook (7th Edition):

Zhou, Xuan. “An Efficient Algorithm for Clustering Genomic Data.” 2014. Web. 12 Dec 2017.

Vancouver:

Zhou X. An Efficient Algorithm for Clustering Genomic Data. [Internet] [Masters thesis]. University of Cincinnati; 2014. [cited 2017 Dec 12]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1418910389.

Council of Science Editors:

Zhou X. An Efficient Algorithm for Clustering Genomic Data. [Masters Thesis]. University of Cincinnati; 2014. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1418910389

Duke University

29.
Mao, Kai.
Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression.

Degree: 2009, Duke University

URL: http://hdl.handle.net/10161/1581

► We propose nonparametric Bayesian models for supervised dimension reduction and regression problems. Supervised dimension reduction is a setting where one needs to reduce the…
(more)

Subjects/Keywords: Statistics; Dirichlet process; Kernel models; Nonparametric Bayesian; Supervised dimension reduction

APA (6th Edition):

Mao, K. (2009). Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression. (Thesis). Duke University. Retrieved from http://hdl.handle.net/10161/1581

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Mao, Kai. “Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression.” 2009. Thesis, Duke University. Accessed December 12, 2017. http://hdl.handle.net/10161/1581.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Mao, Kai. “Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression.” 2009. Web. 12 Dec 2017.

Vancouver:

Mao K. Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression. [Internet] [Thesis]. Duke University; 2009. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10161/1581.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Mao K. Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression. [Thesis]. Duke University; 2009. Available from: http://hdl.handle.net/10161/1581

Not specified: Masters Thesis or Doctoral Dissertation

University of Ottawa

30. He, Qiangsen. Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning.

Degree: 2017, University of Ottawa

URL: http://hdl.handle.net/10393/36044

► Person re-identification (Re-ID) has become an intense research area in recent years. The main goal of this topic is to recognize and match individuals over…
(more)

Subjects/Keywords: Re-ID; Mahalanobis distance metric; KLFDA dimension reduction

APA (6th Edition):

He, Q. (2017). Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning. (Thesis). University of Ottawa. Retrieved from http://hdl.handle.net/10393/36044

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

He, Qiangsen. “Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning.” 2017. Thesis, University of Ottawa. Accessed December 12, 2017. http://hdl.handle.net/10393/36044.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

He, Qiangsen. “Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning.” 2017. Web. 12 Dec 2017.

Vancouver:

He Q. Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning. [Internet] [Thesis]. University of Ottawa; 2017. [cited 2017 Dec 12]. Available from: http://hdl.handle.net/10393/36044.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

He Q. Person Re-identification Based on Kernel Local Fisher Discriminant Analysis and Mahalanobis Distance Learning. [Thesis]. University of Ottawa; 2017. Available from: http://hdl.handle.net/10393/36044

Not specified: Masters Thesis or Doctoral Dissertation