Advanced search options


You searched for `subject:(Dimension reduction)`

Showing records 1 – 30 of 215 total matches.


Search Limiters

Dates

- 2015 – 2019 (101)
- 2010 – 2014 (89)
- 2005 – 2009 (29)

Department

- Statistics (24)
- Computer Science (11)

Degrees

- PhD (73)
- Docteur ès (34)
- MS (16)


Oregon State University

1. Thangavelu, Madan Kumar. On error bounds for linear feature extraction.

Degree: MS, Computer Science, 2010, Oregon State University

URL: http://hdl.handle.net/1957/13886

Linear transformation for *dimension* *reduction* is a well established problem in the field of machine learning. Due to the numerous observability of parameters and data,…

Subjects/Keywords: Dimension reduction; Dimension reduction (Statistics)

APA (6th Edition):

Thangavelu, M. K. (2010). On error bounds for linear feature extraction. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/13886

Chicago Manual of Style (16th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Masters Thesis, Oregon State University. Accessed October 21, 2019. http://hdl.handle.net/1957/13886.

MLA Handbook (7th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Web. 21 Oct 2019.

Vancouver:

Thangavelu MK. On error bounds for linear feature extraction. [Internet] [Masters thesis]. Oregon State University; 2010. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/1957/13886.

Council of Science Editors:

Thangavelu MK. On error bounds for linear feature extraction. [Masters Thesis]. Oregon State University; 2010. Available from: http://hdl.handle.net/1957/13886

University of Georgia

2. Wang, Qin. Sufficient *dimension* *reduction* and sufficient variable selection.

Degree: PhD, Statistics, 2009, University of Georgia

URL: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

The development in theory and methodology for sufficient *dimension* *reduction* has provided a powerful tool to tackle the challenging problem of high dimensional data analysis.…

Subjects/Keywords: sufficient dimension reduction

APA (6th Edition):

Wang, Q. (2009). Sufficient dimension reduction and sufficient variable selection. (Doctoral Dissertation). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

Chicago Manual of Style (16th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Doctoral Dissertation, University of Georgia. Accessed October 21, 2019. http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

MLA Handbook (7th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Web. 21 Oct 2019.

Vancouver:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Internet] [Doctoral dissertation]. University of Georgia; 2009. [cited 2019 Oct 21]. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

Council of Science Editors:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Doctoral Dissertation]. University of Georgia; 2009. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

Baylor University

3. [No author]. Three applications of linear *dimension* *reduction*.

Degree: 2017, Baylor University

URL: http://hdl.handle.net/2104/10182

Linear *Dimension* *Reduction* (LDR) has many uses in engineering, business, medicine, economics, data science and others. LDR can be employed when observations are recorded with…

Subjects/Keywords: Linear dimension reduction.

APA (6th Edition):

[No author]. (2017). Three applications of linear dimension reduction. (Thesis). Baylor University. Retrieved from http://hdl.handle.net/2104/10182

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

[No author]. “Three applications of linear dimension reduction.” 2017. Thesis, Baylor University. Accessed October 21, 2019. http://hdl.handle.net/2104/10182.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

[No author]. “Three applications of linear dimension reduction.” 2017. Web. 21 Oct 2019.

Vancouver:

[No author]. Three applications of linear dimension reduction. [Internet] [Thesis]. Baylor University; 2017. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/2104/10182.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

[No author]. Three applications of linear dimension reduction. [Thesis]. Baylor University; 2017. Available from: http://hdl.handle.net/2104/10182

Not specified: Masters Thesis or Doctoral Dissertation

University of Georgia

4. Ling, Yangrong. Statistical *dimension* *reduction* methods for appearance-based face recognition.

Degree: MS, Computer Science, 2003, University of Georgia

URL: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Two novel moment-based methods which are insensitive to large variation in lighting direction and facial expression are developed for appearance-based face recognition using *dimension* *reduction*…

Subjects/Keywords: Dimension-reduction

APA (6th Edition):

Ling, Y. (2003). Statistical dimension reduction methods for appearance-based face recognition. (Masters Thesis). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Chicago Manual of Style (16th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Masters Thesis, University of Georgia. Accessed October 21, 2019. http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

MLA Handbook (7th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Web. 21 Oct 2019.

Vancouver:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Internet] [Masters thesis]. University of Georgia; 2003. [cited 2019 Oct 21]. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

Council of Science Editors:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Masters Thesis]. University of Georgia; 2003. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Baylor University

5. Young, Phil D. Topics in *dimension* *reduction* and missing data in statistical discrimination.

Degree: Statistical Sciences, 2010, Baylor University

URL: http://hdl.handle.net/2104/5543

This dissertation is comprised of four chapters. In the first chapter, we define the concept of linear *dimension* *reduction*, review some popular linear *dimension* *reduction*…

Subjects/Keywords: Dimension reduction.; Statistical discrimination.

APA (6th Edition):

Young, P. D. (2010). Topics in dimension reduction and missing data in statistical discrimination. (Thesis). Baylor University. Retrieved from http://hdl.handle.net/2104/5543

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Thesis, Baylor University. Accessed October 21, 2019. http://hdl.handle.net/2104/5543.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Web. 21 Oct 2019.

Vancouver:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Internet] [Thesis]. Baylor University; 2010. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/2104/5543.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Thesis]. Baylor University; 2010. Available from: http://hdl.handle.net/2104/5543

Not specified: Masters Thesis or Doctoral Dissertation

Penn State University

6. Wang, Yu. Nonlinear *Dimension* *Reduction* in Feature Space.

Degree: PhD, Statistics, 2008, Penn State University

URL: https://etda.libraries.psu.edu/catalog/8637

In this thesis I introduce an idea for applying *dimension* *reduction* methods to feature spaces. Three main methods will be used to estimate *dimension* *reduction*…

Subjects/Keywords: Dimension Reduction; Feature Space

APA (6th Edition):

Wang, Y. (2008). Nonlinear Dimension Reduction in Feature Space. (Doctoral Dissertation). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/8637

Chicago Manual of Style (16th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Doctoral Dissertation, Penn State University. Accessed October 21, 2019. https://etda.libraries.psu.edu/catalog/8637.

MLA Handbook (7th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Web. 21 Oct 2019.

Vancouver:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Internet] [Doctoral dissertation]. Penn State University; 2008. [cited 2019 Oct 21]. Available from: https://etda.libraries.psu.edu/catalog/8637.

Council of Science Editors:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Doctoral Dissertation]. Penn State University; 2008. Available from: https://etda.libraries.psu.edu/catalog/8637

Clemson University

7. Knoll, Fiona. Johnson-Lindenstrauss Transformations.

Degree: PhD, Mathematical Sciences, 2017, Clemson University

URL: https://tigerprints.clemson.edu/all_dissertations/1977

With the quick progression of technology and the increasing need to process large data, there has been an increased interest in data-dependent and data-independent *dimension*…

Subjects/Keywords: Data; Dimension Reduction; Johnson-Lindenstrauss

APA (6th Edition):

Knoll, F. (2017). Johnson-Lindenstrauss Transformations. (Doctoral Dissertation). Clemson University. Retrieved from https://tigerprints.clemson.edu/all_dissertations/1977

Chicago Manual of Style (16th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Doctoral Dissertation, Clemson University. Accessed October 21, 2019. https://tigerprints.clemson.edu/all_dissertations/1977.

MLA Handbook (7th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Web. 21 Oct 2019.

Vancouver:

Knoll F. Johnson-Lindenstrauss Transformations. [Internet] [Doctoral dissertation]. Clemson University; 2017. [cited 2019 Oct 21]. Available from: https://tigerprints.clemson.edu/all_dissertations/1977.

Council of Science Editors:

Knoll F. Johnson-Lindenstrauss Transformations. [Doctoral Dissertation]. Clemson University; 2017. Available from: https://tigerprints.clemson.edu/all_dissertations/1977

University of Johannesburg

8. Coulter, Duncan Anthony. Immunologically amplified knowledge and intentions dimensionality *reduction* in cooperative multi-agent systems.

Degree: 2014, University of Johannesburg

URL: http://hdl.handle.net/10210/12341

Ph.D. (Computer Science)

The development of software systems is a relatively recent field of human endeavour. Even so, it has followed a steady progression of…

Subjects/Keywords: Dimension reduction (Statistics); Multiagent systems

APA (6th Edition):

Coulter, D. A. (2014). Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/12341

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Thesis, University of Johannesburg. Accessed October 21, 2019. http://hdl.handle.net/10210/12341.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Web. 21 Oct 2019.

Vancouver:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Internet] [Thesis]. University of Johannesburg; 2014. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/10210/12341.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Thesis]. University of Johannesburg; 2014. Available from: http://hdl.handle.net/10210/12341

Not specified: Masters Thesis or Doctoral Dissertation

University of Waterloo

9. Liu, Kai. Effective Dimensionality Control in Quantitative Finance and Insurance.

Degree: 2017, University of Waterloo

URL: http://hdl.handle.net/10012/12324

It is well-known that *dimension* *reduction* techniques such as the Brownian bridge, principal component analysis, linear transformation could increase the efficiency of Quasi-Monte Carlo (QMC)…

Subjects/Keywords: QMC; Dimension Reduction; Effective Dimension; Effective Portfolio; Effective Portfolio Dimension

APA (6th Edition):

Liu, K. (2017). Effective Dimensionality Control in Quantitative Finance and Insurance. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12324

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Thesis, University of Waterloo. Accessed October 21, 2019. http://hdl.handle.net/10012/12324.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Web. 21 Oct 2019.

Vancouver:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/10012/12324.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12324

Not specified: Masters Thesis or Doctoral Dissertation

Temple University

10. Yang, Chaozheng. Sufficient *Dimension* *Reduction* in Complex Datasets.

Degree: PhD, 2016, Temple University

URL: http://digital.library.temple.edu/u?/p245801coll10,404627

Statistics

This dissertation focuses on two problems in *dimension* *reduction*. One is using permutation approach to test predictor contribution. The permutation approach applies to marginal…

Subjects/Keywords: Statistics;

APA (6th Edition):

Yang, C. (2016). Sufficient Dimension Reduction in Complex Datasets. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,404627

Chicago Manual of Style (16th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Doctoral Dissertation, Temple University. Accessed October 21, 2019. http://digital.library.temple.edu/u?/p245801coll10,404627.

MLA Handbook (7th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Web. 21 Oct 2019.

Vancouver:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Internet] [Doctoral dissertation]. Temple University; 2016. [cited 2019 Oct 21]. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627.

Council of Science Editors:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Doctoral Dissertation]. Temple University; 2016. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627

University of Minnesota

11. Chen, Xin. Sufficient *dimension* *reduction* and variable selection.

Degree: PhD, Statistics, 2010, University of Minnesota

URL: http://purl.umn.edu/99484

Sufficient *dimension* *reduction* (SDR) in regression was first introduced by Cook (2004). It reduces the *dimension* of the predictor space without loss of information and…

Subjects/Keywords: Central subspace; Dimension reduction; Regression; Sufficient dimension reduction; Variable selection; Statistics

APA (6th Edition):

Chen, X. (2010). Sufficient dimension reduction and variable selection. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/99484

Chicago Manual of Style (16th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Doctoral Dissertation, University of Minnesota. Accessed October 21, 2019. http://purl.umn.edu/99484.

MLA Handbook (7th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Web. 21 Oct 2019.

Vancouver:

Chen X. Sufficient dimension reduction and variable selection. [Internet] [Doctoral dissertation]. University of Minnesota; 2010. [cited 2019 Oct 21]. Available from: http://purl.umn.edu/99484.

Council of Science Editors:

Chen X. Sufficient dimension reduction and variable selection. [Doctoral Dissertation]. University of Minnesota; 2010. Available from: http://purl.umn.edu/99484

University of Technology, Sydney

12. Bian, Wei. Supervised linear *dimension* *reduction*.

Degree: 2012, University of Technology, Sydney

URL: http://hdl.handle.net/10453/20422

Supervised linear *dimension* *reduction* (SLDR) is one of the most effective methods for complexity *reduction*, which has been widely applied in pattern recognition, computer vision,…

Subjects/Keywords: Pattern recognition.; Dimension reduction.; Statistics.; Mathematics.

APA (6th Edition):

Bian, W. (2012). Supervised linear dimension reduction. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/20422

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Thesis, University of Technology, Sydney. Accessed October 21, 2019. http://hdl.handle.net/10453/20422.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Web. 21 Oct 2019.

Vancouver:

Bian W. Supervised linear dimension reduction. [Internet] [Thesis]. University of Technology, Sydney; 2012. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/10453/20422.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bian W. Supervised linear dimension reduction. [Thesis]. University of Technology, Sydney; 2012. Available from: http://hdl.handle.net/10453/20422

Not specified: Masters Thesis or Doctoral Dissertation

McMaster University

13. Pathmanathan, Thinesh. *Dimension* *Reduction* and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.

Degree: MSc, 2018, McMaster University

URL: http://hdl.handle.net/11375/22758

Model-based clustering is a probabilistic approach that views each cluster as a component in an appropriate mixture model. The Gaussian mixture model is one of…

Subjects/Keywords: Model-based clustering; dimension reduction; statistical learning

APA (6th Edition):

Pathmanathan, T. (2018). Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/22758

Chicago Manual of Style (16th Edition):

Pathmanathan, Thinesh. “Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.” 2018. Masters Thesis, McMaster University. Accessed October 21, 2019. http://hdl.handle.net/11375/22758.

MLA Handbook (7th Edition):

Pathmanathan, Thinesh. “Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.” 2018. Web. 21 Oct 2019.

Vancouver:

Pathmanathan T. Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. [Internet] [Masters thesis]. McMaster University; 2018. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/11375/22758.

Council of Science Editors:

Pathmanathan T. Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. [Masters Thesis]. McMaster University; 2018. Available from: http://hdl.handle.net/11375/22758

Cornell University

14. Chen, Maximillian. *Dimension* *Reduction* And Inferential Procedures For Images.

Degree: 2014, Cornell University

URL: http://hdl.handle.net/1813/37105

High-dimensional data analysis has been a prominent topic of statistical research in recent years due to the growing presence of high-dimensional electronic data. Much of…

Subjects/Keywords: imaging data; dimension reduction; hypothesis testing

APA (6th Edition):

Chen, M. (2014). Dimension Reduction And Inferential Procedures For Images. (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/37105

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Thesis, Cornell University. Accessed October 21, 2019. http://hdl.handle.net/1813/37105.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Web. 21 Oct 2019.

Vancouver:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Internet] [Thesis]. Cornell University; 2014. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/1813/37105.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Thesis]. Cornell University; 2014. Available from: http://hdl.handle.net/1813/37105

Not specified: Masters Thesis or Doctoral Dissertation

Uppsala University

15. Li, Qiongzhu. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.

Degree: Statistics, 2016, Uppsala University

URL: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

In this paper, we try to compare the performance of two feature *dimension* *reduction* methods, the LASSO and PCA. Both simulation study and empirical…

Subjects/Keywords: Machine learning; Feature Dimension Reduction; NPL

APA (6th Edition):

Li, Q. (2016). Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. (Thesis). Uppsala University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Thesis, Uppsala University. Accessed October 21, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Web. 21 Oct 2019.

Vancouver:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Internet] [Thesis]. Uppsala University; 2016. [cited 2019 Oct 21]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Thesis]. Uppsala University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Not specified: Masters Thesis or Doctoral Dissertation

University of Toronto

16. Santiago, Anna Theresa. In Silico Comparative Evaluation of Classical and Robust *Dimension* *Reduction* for Psychological Assessment.

Degree: 2018, University of Toronto

URL: http://hdl.handle.net/1807/89519

The classic exploration of correlated multivariable psychological assessment data employs *dimension* *reduction* of the original p variables to a lower q-dimensional space through principal component…

Subjects/Keywords: dimension reduction; PCA; projection pursuit; robust; 0308

APA (6th Edition):

Santiago, A. T. (2018). In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/89519

Chicago Manual of Style (16th Edition):

Santiago, Anna Theresa. “In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment.” 2018. Masters Thesis, University of Toronto. Accessed October 21, 2019. http://hdl.handle.net/1807/89519.

MLA Handbook (7th Edition):

Santiago, Anna Theresa. “In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment.” 2018. Web. 21 Oct 2019.

Vancouver:

Santiago AT. In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. [Internet] [Masters thesis]. University of Toronto; 2018. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/1807/89519.

Council of Science Editors:

Santiago AT. In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. [Masters Thesis]. University of Toronto; 2018. Available from: http://hdl.handle.net/1807/89519

Princeton University

17. BERTALAN, THOMAS. *Dimension* *Reduction* for Heterogeneous Populations of Oscillators.

Degree: PhD, 2018, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp017m01bp385

This dissertation discusses coarse-graining methods and applications for simulations of large heterogeneous populations of neurons. These simulations are structured as large coupled sets of ordinary…

Subjects/Keywords: dimension reduction; heterogeneity; machine learning; polynomial chaos

APA (6th Edition):

BERTALAN, T. (2018). Dimension Reduction for Heterogeneous Populations of Oscillators. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp017m01bp385

Chicago Manual of Style (16th Edition):

BERTALAN, THOMAS. “Dimension Reduction for Heterogeneous Populations of Oscillators.” 2018. Doctoral Dissertation, Princeton University. Accessed October 21, 2019. http://arks.princeton.edu/ark:/88435/dsp017m01bp385.

MLA Handbook (7th Edition):

BERTALAN, THOMAS. “Dimension Reduction for Heterogeneous Populations of Oscillators.” 2018. Web. 21 Oct 2019.

Vancouver:

BERTALAN T. Dimension Reduction for Heterogeneous Populations of Oscillators. [Internet] [Doctoral dissertation]. Princeton University; 2018. [cited 2019 Oct 21]. Available from: http://arks.princeton.edu/ark:/88435/dsp017m01bp385.

Council of Science Editors:

BERTALAN T. Dimension Reduction for Heterogeneous Populations of Oscillators. [Doctoral Dissertation]. Princeton University; 2018. Available from: http://arks.princeton.edu/ark:/88435/dsp017m01bp385

University of Colorado

18. Glaws, Andrew Taylor. Parameter *Dimension* *Reduction* for Scientific Computing.

Degree: PhD, 2018, University of Colorado

URL: https://scholar.colorado.edu/csci_gradetds/195

Advances in computational power have enabled the simulation of increasingly complex physical systems. Mathematically, we represent these simulations as a mapping from inputs to…

Subjects/Keywords: active subspaces; dimension reduction; magnetohydrodynamics; ridge function; ridge recovery; sufficient dimension reduction; Computer Sciences; Mathematics

APA (6th Edition):

Glaws, A. T. (2018). Parameter Dimension Reduction for Scientific Computing. (Doctoral Dissertation). University of Colorado. Retrieved from https://scholar.colorado.edu/csci_gradetds/195

Chicago Manual of Style (16th Edition):

Glaws, Andrew Taylor. “Parameter Dimension Reduction for Scientific Computing.” 2018. Doctoral Dissertation, University of Colorado. Accessed October 21, 2019. https://scholar.colorado.edu/csci_gradetds/195.

MLA Handbook (7th Edition):

Glaws, Andrew Taylor. “Parameter Dimension Reduction for Scientific Computing.” 2018. Web. 21 Oct 2019.

Vancouver:

Glaws AT. Parameter Dimension Reduction for Scientific Computing. [Internet] [Doctoral dissertation]. University of Colorado; 2018. [cited 2019 Oct 21]. Available from: https://scholar.colorado.edu/csci_gradetds/195.

Council of Science Editors:

Glaws AT. Parameter Dimension Reduction for Scientific Computing. [Doctoral Dissertation]. University of Colorado; 2018. Available from: https://scholar.colorado.edu/csci_gradetds/195

University of Minnesota

19.
Adragni, Kofi Placid.
*Dimension* *reduction* and prediction in large p regressions.

Degree: PhD, Statistics, 2009, University of Minnesota

URL: http://purl.umn.edu/51904

► A high dimensional regression setting is considered with p predictors X=(X1,...,Xp)T and a response Y. The interest is with large p, possibly much larger than…
(more)

Subjects/Keywords: Dimension Reduction; Prediction; Principal Components; Principal Fitted Components; Regression; Sufficient Dimension Reduction; Statistics


APA (6^{th} Edition):

Adragni, K. P. (2009). Dimension reduction and prediction in large p regressions. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/51904

Chicago Manual of Style (16^{th} Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Doctoral Dissertation, University of Minnesota. Accessed October 21, 2019. http://purl.umn.edu/51904.

MLA Handbook (7^{th} Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Web. 21 Oct 2019.

Vancouver:

Adragni KP. Dimension reduction and prediction in large p regressions. [Internet] [Doctoral dissertation]. University of Minnesota; 2009. [cited 2019 Oct 21]. Available from: http://purl.umn.edu/51904.

Council of Science Editors:

Adragni KP. Dimension reduction and prediction in large p regressions. [Doctoral Dissertation]. University of Minnesota; 2009. Available from: http://purl.umn.edu/51904

Georgia Tech

20.
Li, Qingbin.
Online sufficient dimensionality *reduction* for sequential high-dimensional time-series.

Degree: MS, Industrial and Systems Engineering, 2015, Georgia Tech

URL: http://hdl.handle.net/1853/60385

In this thesis, we present the Online Sufficient Dimensionality Reduction (OSDR) algorithm for real-time high-dimensional sequential data analysis.
*Advisors/Committee Members: Xie, Yao (advisor), Song, Le (committee member), Zhou, Enlu (committee member).*

Subjects/Keywords: Online learning; Dimension reduction; Sufficient dimensionality reduction; Stochastic gradient descent


APA (6^{th} Edition):

Li, Q. (2015). Online sufficient dimensionality reduction for sequential high-dimensional time-series. (Masters Thesis). Georgia Tech. Retrieved from http://hdl.handle.net/1853/60385

Chicago Manual of Style (16^{th} Edition):

Li, Qingbin. “Online sufficient dimensionality reduction for sequential high-dimensional time-series.” 2015. Masters Thesis, Georgia Tech. Accessed October 21, 2019. http://hdl.handle.net/1853/60385.

MLA Handbook (7^{th} Edition):

Li, Qingbin. “Online sufficient dimensionality reduction for sequential high-dimensional time-series.” 2015. Web. 21 Oct 2019.

Vancouver:

Li Q. Online sufficient dimensionality reduction for sequential high-dimensional time-series. [Internet] [Masters thesis]. Georgia Tech; 2015. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/1853/60385.

Council of Science Editors:

Li Q. Online sufficient dimensionality reduction for sequential high-dimensional time-series. [Masters Thesis]. Georgia Tech; 2015. Available from: http://hdl.handle.net/1853/60385

21. Hoyos-Idrobo, Andrés. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.

Degree: Docteur es, Informatique, 2017, Paris Saclay

URL: http://www.theses.fr/2017SACLS029

► In medical imaging, international collaborations have launched the acquisition of hundreds of terabytes of data, in particular functional Magnetic Resonance Imaging (fMRI) data… (more)

Subjects/Keywords: IRMf; Clustering; Reduction de dimension; Décodage; FMRI; Clustering; Dimensionality reduction; Decoding


APA (6^{th} Edition):

Hoyos-Idrobo, A. (2017). Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2017SACLS029

Chicago Manual of Style (16^{th} Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Doctoral Dissertation, Paris Saclay. Accessed October 21, 2019. http://www.theses.fr/2017SACLS029.

MLA Handbook (7^{th} Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Web. 21 Oct 2019.

Vancouver:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Internet] [Doctoral dissertation]. Paris Saclay; 2017. [cited 2019 Oct 21]. Available from: http://www.theses.fr/2017SACLS029.

Council of Science Editors:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Doctoral Dissertation]. Paris Saclay; 2017. Available from: http://www.theses.fr/2017SACLS029

Clemson University

22. Wilson, Matthew Robert. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.

Degree: MS, Computer Engineering, 2016, Clemson University

URL: https://tigerprints.clemson.edu/all_theses/2357

► Reducing the input dimensionality of large datasets for subsequent processing will allow the process to become less computationally complex and expensive. This thesis tests if…
(more)

Subjects/Keywords: Dimension reduction; feature reduction; feature selection; input reduction; Karnin Sensitivity; Principal Component Analysis


APA (6^{th} Edition):

Wilson, M. R. (2016). Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. (Masters Thesis). Clemson University. Retrieved from https://tigerprints.clemson.edu/all_theses/2357

Chicago Manual of Style (16^{th} Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Masters Thesis, Clemson University. Accessed October 21, 2019. https://tigerprints.clemson.edu/all_theses/2357.

MLA Handbook (7^{th} Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Web. 21 Oct 2019.

Vancouver:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Internet] [Masters thesis]. Clemson University; 2016. [cited 2019 Oct 21]. Available from: https://tigerprints.clemson.edu/all_theses/2357.

Council of Science Editors:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Masters Thesis]. Clemson University; 2016. Available from: https://tigerprints.clemson.edu/all_theses/2357

23.
Lu, Weizhi.
Contribution to *dimension* *reduction* techniques : application to object tracking : Contribution aux techniques de la réduction de *dimension* : application au suivi d'objet.

Degree: Docteur es, Traitement du signal et de l'image, 2014, Rennes, INSA

URL: http://www.theses.fr/2014ISAR0010

► This thesis studies and makes significant improvements to three widespread *dimension* *reduction* techniques: compressed sensing (or sparse sampling), random projection…
(more)

Subjects/Keywords: Réduction de dimension; Dimension reduction; Compressed sensing; Random projection; Sparse representation; 621.382


APA (6^{th} Edition):

Lu, W. (2014). Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. (Doctoral Dissertation). Rennes, INSA. Retrieved from http://www.theses.fr/2014ISAR0010

Chicago Manual of Style (16^{th} Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Doctoral Dissertation, Rennes, INSA. Accessed October 21, 2019. http://www.theses.fr/2014ISAR0010.

MLA Handbook (7^{th} Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Web. 21 Oct 2019.

Vancouver:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Internet] [Doctoral dissertation]. Rennes, INSA; 2014. [cited 2019 Oct 21]. Available from: http://www.theses.fr/2014ISAR0010.

Council of Science Editors:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Doctoral Dissertation]. Rennes, INSA; 2014. Available from: http://www.theses.fr/2014ISAR0010

University of Waterloo

24. Liu, Kai. Directional Control of Generating Brownian Path under Quasi Monte Carlo.

Degree: 2012, University of Waterloo

URL: http://hdl.handle.net/10012/6984

► Quasi-Monte Carlo (QMC) methods are playing an increasingly important role in computational finance. This is attributed to the increased complexity of the derivative securities and…
(more)

Subjects/Keywords: QMC; Low Discrepancy Sequence; Effective Dimension; Dimension Reduction; PCA; BB; LT; OT; FOT; DC


APA (6^{th} Edition):

Liu, K. (2012). Directional Control of Generating Brownian Path under Quasi Monte Carlo. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6984

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Thesis, University of Waterloo. Accessed October 21, 2019. http://hdl.handle.net/10012/6984.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Web. 21 Oct 2019.

Vancouver:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Internet] [Thesis]. University of Waterloo; 2012. [cited 2019 Oct 21]. Available from: http://hdl.handle.net/10012/6984.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Thesis]. University of Waterloo; 2012. Available from: http://hdl.handle.net/10012/6984

Not specified: Masters Thesis or Doctoral Dissertation

25.
Vu, Khac Ky.
Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande *dimension*.

Degree: Docteur es, Informatique, 2016, Paris Saclay

URL: http://www.theses.fr/2016SACLX031

► In the age of digitization, data have become cheap and easy to obtain. This translates into many new optimization problems with…
(more)

Subjects/Keywords: Réduction de dimension; Approximation; Optimisation; Algorithmes randomisés; Dimension reduction; Approximation; Optimization; Randomized algorithms


APA (6^{th} Edition):

Vu, K. K. (2016). Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2016SACLX031

Chicago Manual of Style (16^{th} Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Doctoral Dissertation, Paris Saclay. Accessed October 21, 2019. http://www.theses.fr/2016SACLX031.

MLA Handbook (7^{th} Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Web. 21 Oct 2019.

Vancouver:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Internet] [Doctoral dissertation]. Paris Saclay; 2016. [cited 2019 Oct 21]. Available from: http://www.theses.fr/2016SACLX031.

Council of Science Editors:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Doctoral Dissertation]. Paris Saclay; 2016. Available from: http://www.theses.fr/2016SACLX031

26.
Chiancone, Alessandro.
Réduction de *dimension* via Sliced Inverse Regression : Idées et nouvelles propositions : *Dimension* *reduction* via Sliced Inverse Regression : ideas and extensions.

Degree: Docteur es, Mathématiques Appliquées, 2016, Grenoble Alpes

URL: http://www.theses.fr/2016GREAM051

► This thesis proposes three extensions of Sliced Inverse Regression (SIR), namely Collaborative SIR, Student SIR, and Knockoff SIR. One of the weaknesses… (more)

Subjects/Keywords: Régression linéaire par tranches; Reduction de dimension; Selection de variables; Sliced Inverse Regression; Dimension reduction; Variable selection; 510


APA (6^{th} Edition):

Chiancone, A. (2016). Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reduction via Sliced Inverse Regression : ideas and extensions. (Doctoral Dissertation). Grenoble Alpes. Retrieved from http://www.theses.fr/2016GREAM051

Chicago Manual of Style (16^{th} Edition):

Chiancone, Alessandro. “Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reduction via Sliced Inverse Regression : ideas and extensions.” 2016. Doctoral Dissertation, Grenoble Alpes. Accessed October 21, 2019. http://www.theses.fr/2016GREAM051.

MLA Handbook (7^{th} Edition):

Chiancone, Alessandro. “Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reduction via Sliced Inverse Regression : ideas and extensions.” 2016. Web. 21 Oct 2019.

Vancouver:

Chiancone A. Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reduction via Sliced Inverse Regression : ideas and extensions. [Internet] [Doctoral dissertation]. Grenoble Alpes; 2016. [cited 2019 Oct 21]. Available from: http://www.theses.fr/2016GREAM051.

Council of Science Editors:

Chiancone A. Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reduction via Sliced Inverse Regression : ideas and extensions. [Doctoral Dissertation]. Grenoble Alpes; 2016. Available from: http://www.theses.fr/2016GREAM051

Texas State University – San Marcos

27.
Reiss, Randolf H.
Eigenvalues and Eigenvectors in Data *Dimension* *Reduction* for Regression.

Degree: MS, Mathematics, 2013, Texas State University – San Marcos

URL: https://digital.library.txstate.edu/handle/10877/4696

► A basic theory of eigenvalues and eigenvectors as a means to reduce the *dimension* of data is presented. Iterative methods for finding eigenvalues and eigenvectors…
(more)

Subjects/Keywords: Eigenvector; Eigenvalue; Dimension Reduction; Power Method; Partial Least Squares; Eigenvalues; Eigenvectors; Data reduction


APA (6^{th} Edition):

Reiss, R. H. (2013). Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. (Masters Thesis). Texas State University – San Marcos. Retrieved from https://digital.library.txstate.edu/handle/10877/4696

Chicago Manual of Style (16^{th} Edition):

Reiss, Randolf H. “Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression.” 2013. Masters Thesis, Texas State University – San Marcos. Accessed October 21, 2019. https://digital.library.txstate.edu/handle/10877/4696.

MLA Handbook (7^{th} Edition):

Reiss, Randolf H. “Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression.” 2013. Web. 21 Oct 2019.

Vancouver:

Reiss RH. Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. [Internet] [Masters thesis]. Texas State University – San Marcos; 2013. [cited 2019 Oct 21]. Available from: https://digital.library.txstate.edu/handle/10877/4696.

Council of Science Editors:

Reiss RH. Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. [Masters Thesis]. Texas State University – San Marcos; 2013. Available from: https://digital.library.txstate.edu/handle/10877/4696

Temple University

28. Zhang, Yongxu. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.

Degree: PhD, 2017, Temple University

URL: http://digital.library.temple.edu/u?/p245801coll10,449833

► As a useful tool for multivariate analysis, sufficient *dimension* *reduction* (SDR) aims to reduce the predictor dimensionality while simultaneously keeping the full regression information,…
(more)

Subjects/Keywords: Statistics;


APA (6^{th} Edition):

Zhang, Y. (2017). ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,449833

Chicago Manual of Style (16^{th} Edition):

Zhang, Yongxu. “ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.” 2017. Doctoral Dissertation, Temple University. Accessed October 21, 2019. http://digital.library.temple.edu/u?/p245801coll10,449833.

MLA Handbook (7^{th} Edition):

Zhang, Yongxu. “ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.” 2017. Web. 21 Oct 2019.

Vancouver:

Zhang Y. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. [Internet] [Doctoral dissertation]. Temple University; 2017. [cited 2019 Oct 21]. Available from: http://digital.library.temple.edu/u?/p245801coll10,449833.

Council of Science Editors:

Zhang Y. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. [Doctoral Dissertation]. Temple University; 2017. Available from: http://digital.library.temple.edu/u?/p245801coll10,449833

Linnaeus University

29.
Sun, Xuebo.
An Application of *Dimension* *Reduction* for Intention Groups in Reddit.

Degree: Computer Science, 2016, Linnaeus University

URL: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

► Reddit (www.reddit.com) is a social news platform for information sharing and exchange. The amount of data, in terms of both observations and dimensions, is…
(more)

Subjects/Keywords: Reddit; communication model; dimension reduction; similarity metric; Computer Sciences; Datavetenskap (datalogi)


APA (6^{th} Edition):

Sun, X. (2016). An Application of Dimension Reduction for Intention Groups in Reddit. (Thesis). Linnaeus University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Sun, Xuebo. “An Application of Dimension Reduction for Intention Groups in Reddit.” 2016. Thesis, Linnaeus University. Accessed October 21, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Sun, Xuebo. “An Application of Dimension Reduction for Intention Groups in Reddit.” 2016. Web. 21 Oct 2019.

Vancouver:

Sun X. An Application of Dimension Reduction for Intention Groups in Reddit. [Internet] [Thesis]. Linnaeus University; 2016. [cited 2019 Oct 21]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sun X. An Application of Dimension Reduction for Intention Groups in Reddit. [Thesis]. Linnaeus University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-56500

Not specified: Masters Thesis or Doctoral Dissertation

University of Cincinnati

30. Sun, Yan. Regularization for High-dimensional Time Series Models.

Degree: PhD, Arts and Sciences: Mathematical Sciences, 2011, University of Cincinnati

URL: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387

► Analyzing multivariate time series has been a very important topic in economics, finance, engineering, social and natural sciences. Compared to univariate models, the multivariate…
(more)

Subjects/Keywords: Statistics; conditional likelihood; dimension reduction; oracle property; sparse; stationary; time series


APA (6^{th} Edition):

Sun, Y. (2011). Regularization for High-dimensional Time Series Models. (Doctoral Dissertation). University of Cincinnati. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387

Chicago Manual of Style (16^{th} Edition):

Sun, Yan. “Regularization for High-dimensional Time Series Models.” 2011. Doctoral Dissertation, University of Cincinnati. Accessed October 21, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387.

MLA Handbook (7^{th} Edition):

Sun, Yan. “Regularization for High-dimensional Time Series Models.” 2011. Web. 21 Oct 2019.

Vancouver:

Sun Y. Regularization for High-dimensional Time Series Models. [Internet] [Doctoral dissertation]. University of Cincinnati; 2011. [cited 2019 Oct 21]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387.

Council of Science Editors:

Sun Y. Regularization for High-dimensional Time Series Models. [Doctoral Dissertation]. University of Cincinnati; 2011. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1307321387