
You searched for subject:(dimension reduction). Showing records 1 – 30 of 190 total matches.



Oregon State University

1. Thangavelu, Madan Kumar. On error bounds for linear feature extraction.

Degree: MS, Computer Science, 2010, Oregon State University

 Linear transformation for dimension reduction is a well established problem in the field of machine learning. Due to the numerous observability of parameters and data,… (more)

Subjects/Keywords: Dimension reduction; Dimension reduction (Statistics)


APA (6th Edition):

Thangavelu, M. K. (2010). On error bounds for linear feature extraction. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/13886

Chicago Manual of Style (16th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Masters Thesis, Oregon State University. Accessed December 16, 2018. http://hdl.handle.net/1957/13886.

MLA Handbook (7th Edition):

Thangavelu, Madan Kumar. “On error bounds for linear feature extraction.” 2010. Web. 16 Dec 2018.

Vancouver:

Thangavelu MK. On error bounds for linear feature extraction. [Internet] [Masters thesis]. Oregon State University; 2010. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/1957/13886.

Council of Science Editors:

Thangavelu MK. On error bounds for linear feature extraction. [Masters Thesis]. Oregon State University; 2010. Available from: http://hdl.handle.net/1957/13886


University of Georgia

2. Wang, Qin. Sufficient dimension reduction and sufficient variable selection.

Degree: PhD, Statistics, 2009, University of Georgia

 The development in theory and methodology for sufficient dimension reduction has provided a powerful tool to tackle the challenging problem of high dimensional data analysis.… (more)

Subjects/Keywords: sufficient dimension reduction


APA (6th Edition):

Wang, Q. (2009). Sufficient dimension reduction and sufficient variable selection. (Doctoral Dissertation). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd

Chicago Manual of Style (16th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Doctoral Dissertation, University of Georgia. Accessed December 16, 2018. http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

MLA Handbook (7th Edition):

Wang, Qin. “Sufficient dimension reduction and sufficient variable selection.” 2009. Web. 16 Dec 2018.

Vancouver:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Internet] [Doctoral dissertation]. University of Georgia; 2009. [cited 2018 Dec 16]. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd.

Council of Science Editors:

Wang Q. Sufficient dimension reduction and sufficient variable selection. [Doctoral Dissertation]. University of Georgia; 2009. Available from: http://purl.galileo.usg.edu/uga_etd/wang_qin_200905_phd


Baylor University

3. [No author]. Three applications of linear dimension reduction.

Degree: 2017, Baylor University

 Linear Dimension Reduction (LDR) has many uses in engineering, business, medicine, economics, data science and others. LDR can be employed when observations are recorded with… (more)

Subjects/Keywords: Linear dimension reduction.


APA (6th Edition):

[No author]. (2017). Three applications of linear dimension reduction. (Thesis). Baylor University. Retrieved from http://hdl.handle.net/2104/10182

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

[No author]. “Three applications of linear dimension reduction.” 2017. Thesis, Baylor University. Accessed December 16, 2018. http://hdl.handle.net/2104/10182.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

[No author]. “Three applications of linear dimension reduction.” 2017. Web. 16 Dec 2018.

Vancouver:

[No author]. Three applications of linear dimension reduction. [Internet] [Thesis]. Baylor University; 2017. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/2104/10182.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

[No author]. Three applications of linear dimension reduction. [Thesis]. Baylor University; 2017. Available from: http://hdl.handle.net/2104/10182

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Georgia

4. Ling, Yangrong. Statistical dimension reduction methods for appearance-based face recognition.

Degree: MS, Computer Science, 2003, University of Georgia

 Two novel moment-based methods which are insensitive to large variation in lighting direction and facial expression are developed for appearance-based face recognition using dimension reduction(more)

Subjects/Keywords: Dimension-reduction


APA (6th Edition):

Ling, Y. (2003). Statistical dimension reduction methods for appearance-based face recognition. (Masters Thesis). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms

Chicago Manual of Style (16th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Masters Thesis, University of Georgia. Accessed December 16, 2018. http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

MLA Handbook (7th Edition):

Ling, Yangrong. “Statistical dimension reduction methods for appearance-based face recognition.” 2003. Web. 16 Dec 2018.

Vancouver:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Internet] [Masters thesis]. University of Georgia; 2003. [cited 2018 Dec 16]. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms.

Council of Science Editors:

Ling Y. Statistical dimension reduction methods for appearance-based face recognition. [Masters Thesis]. University of Georgia; 2003. Available from: http://purl.galileo.usg.edu/uga_etd/ling_yangrong_200305_ms


Baylor University

5. Young, Phil D. Topics in dimension reduction and missing data in statistical discrimination.

Degree: Statistical Sciences., 2010, Baylor University

 This dissertation is comprised of four chapters. In the first chapter, we define the concept of linear dimension reduction, review some popular linear dimension reduction(more)

Subjects/Keywords: Dimension reduction.; Statistical discrimination.


APA (6th Edition):

Young, P. D. (2010). Topics in dimension reduction and missing data in statistical discrimination. (Thesis). Baylor University. Retrieved from http://hdl.handle.net/2104/5543

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Thesis, Baylor University. Accessed December 16, 2018. http://hdl.handle.net/2104/5543.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Young, Phil D. “Topics in dimension reduction and missing data in statistical discrimination.” 2010. Web. 16 Dec 2018.

Vancouver:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Internet] [Thesis]. Baylor University; 2010. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/2104/5543.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Young PD. Topics in dimension reduction and missing data in statistical discrimination. [Thesis]. Baylor University; 2010. Available from: http://hdl.handle.net/2104/5543

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Penn State University

6. Wang, Yu. Nonlinear Dimension Reduction in Feature Space.

Degree: PhD, Statistics, 2008, Penn State University

 In this thesis I introduce an idea for applying dimension reduction methods to feature spaces. Three main methods will be used to estimate dimension reduction(more)

Subjects/Keywords: Dimension Reduction; Feature Space


APA (6th Edition):

Wang, Y. (2008). Nonlinear Dimension Reduction in Feature Space. (Doctoral Dissertation). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/8637

Chicago Manual of Style (16th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Doctoral Dissertation, Penn State University. Accessed December 16, 2018. https://etda.libraries.psu.edu/catalog/8637.

MLA Handbook (7th Edition):

Wang, Yu. “Nonlinear Dimension Reduction in Feature Space.” 2008. Web. 16 Dec 2018.

Vancouver:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Internet] [Doctoral dissertation]. Penn State University; 2008. [cited 2018 Dec 16]. Available from: https://etda.libraries.psu.edu/catalog/8637.

Council of Science Editors:

Wang Y. Nonlinear Dimension Reduction in Feature Space. [Doctoral Dissertation]. Penn State University; 2008. Available from: https://etda.libraries.psu.edu/catalog/8637


Clemson University

7. Knoll, Fiona. Johnson-Lindenstrauss Transformations.

Degree: PhD, Mathematical Sciences, 2017, Clemson University

 With the quick progression of technology and the increasing need to process large data, there has been an increased interest in data-dependent and data-independent dimension(more)
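[A Johnson–Lindenstrauss transformation is the canonical data-independent dimension reduction: a random linear map that approximately preserves all pairwise distances. A minimal NumPy sketch on synthetic data (the dimensions and tolerance below are illustrative, not taken from this dissertation):]

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300        # 50 points, ambient dim 1000, target dim 300

X = rng.normal(size=(n, d))
P = rng.normal(size=(d, k)) / np.sqrt(k)   # data-independent Gaussian projection
Y = X @ P

def pairwise_dists(A):
    # all pairwise Euclidean distances (upper triangle, no diagonal)
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(A), 1)]

ratio = pairwise_dists(Y) / pairwise_dists(X)   # per-pair distortion, close to 1
```

[Note that the projection matrix never looks at the data, which is what distinguishes this family from data-dependent methods such as PCA.]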

Subjects/Keywords: Data; Dimension Reduction; Johnson-Lindenstrauss


APA (6th Edition):

Knoll, F. (2017). Johnson-Lindenstrauss Transformations. (Doctoral Dissertation). Clemson University. Retrieved from https://tigerprints.clemson.edu/all_dissertations/1977

Chicago Manual of Style (16th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Doctoral Dissertation, Clemson University. Accessed December 16, 2018. https://tigerprints.clemson.edu/all_dissertations/1977.

MLA Handbook (7th Edition):

Knoll, Fiona. “Johnson-Lindenstrauss Transformations.” 2017. Web. 16 Dec 2018.

Vancouver:

Knoll F. Johnson-Lindenstrauss Transformations. [Internet] [Doctoral dissertation]. Clemson University; 2017. [cited 2018 Dec 16]. Available from: https://tigerprints.clemson.edu/all_dissertations/1977.

Council of Science Editors:

Knoll F. Johnson-Lindenstrauss Transformations. [Doctoral Dissertation]. Clemson University; 2017. Available from: https://tigerprints.clemson.edu/all_dissertations/1977


University of Johannesburg

8. Coulter, Duncan Anthony. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.

Degree: 2014, University of Johannesburg

Ph.D. (Computer Science)

The development of software systems is a relatively recent field of human endeavour. Even so, it has followed a steady progression of… (more)

Subjects/Keywords: Dimension reduction (Statistics); Multiagent systems


APA (6th Edition):

Coulter, D. A. (2014). Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/12341

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Thesis, University of Johannesburg. Accessed December 16, 2018. http://hdl.handle.net/10210/12341.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Coulter, Duncan Anthony. “Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems.” 2014. Web. 16 Dec 2018.

Vancouver:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Internet] [Thesis]. University of Johannesburg; 2014. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10210/12341.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Coulter DA. Immunologically amplified knowledge and intentions dimensionality reduction in cooperative multi-agent systems. [Thesis]. University of Johannesburg; 2014. Available from: http://hdl.handle.net/10210/12341

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Waterloo

9. Liu, Kai. Effective Dimensionality Control in Quantitative Finance and Insurance.

Degree: 2017, University of Waterloo

 It is well-known that dimension reduction techniques such as the Brownian bridge, principal component analysis, linear transformation could increase the efficiency of Quasi-Monte Carlo (QMC)… (more)
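[The Brownian bridge construction mentioned here reorders how a Brownian path is sampled: the terminal value is drawn first, then midpoints are filled in recursively, so the leading variates explain most of the path's variance and the effective dimension for QMC drops. A sketch of the standard construction (not code from this thesis):]

```python
import numpy as np

def brownian_bridge_path(z, T=1.0):
    """Turn i.i.d. standard normals z (len = power of two) into a Brownian
    path on a uniform grid: terminal value first, then recursive midpoints."""
    m = len(z)
    t = np.linspace(0.0, T, m + 1)
    W = np.zeros(m + 1)
    W[m] = np.sqrt(T) * z[0]              # terminal value uses the first variate
    h, j = m, 1
    while h > 1:                          # fill midpoints level by level
        for i in range(h // 2, m, h):
            a, b = i - h // 2, i + h // 2
            mean = W[a] + (t[i] - t[a]) / (t[b] - t[a]) * (W[b] - W[a])
            var = (t[i] - t[a]) * (t[b] - t[i]) / (t[b] - t[a])
            W[i] = mean + np.sqrt(var) * z[j]
            j += 1
        h //= 2
    return t, W

# With only the first variate nonzero, every midpoint falls on the straight
# line between the endpoints, so the whole path is linear.
t, W = brownian_bridge_path(np.array([1.0, 0, 0, 0, 0, 0, 0, 0]))
```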

Subjects/Keywords: QMC; Dimension Reduction; Effective Dimension; Effective Portfolio; Effective Portfolio Dimension


APA (6th Edition):

Liu, K. (2017). Effective Dimensionality Control in Quantitative Finance and Insurance. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12324

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Thesis, University of Waterloo. Accessed December 16, 2018. http://hdl.handle.net/10012/12324.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Kai. “Effective Dimensionality Control in Quantitative Finance and Insurance.” 2017. Web. 16 Dec 2018.

Vancouver:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10012/12324.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Effective Dimensionality Control in Quantitative Finance and Insurance. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12324

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Temple University

10. Yang, Chaozheng. Sufficient Dimension Reduction in Complex Datasets.

Degree: PhD, 2016, Temple University

Statistics

This dissertation focuses on two problems in dimension reduction. One is using permutation approach to test predictor contribution. The permutation approach applies to marginal… (more)

Subjects/Keywords: Statistics;


APA (6th Edition):

Yang, C. (2016). Sufficient Dimension Reduction in Complex Datasets. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,404627

Chicago Manual of Style (16th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Doctoral Dissertation, Temple University. Accessed December 16, 2018. http://digital.library.temple.edu/u?/p245801coll10,404627.

MLA Handbook (7th Edition):

Yang, Chaozheng. “Sufficient Dimension Reduction in Complex Datasets.” 2016. Web. 16 Dec 2018.

Vancouver:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Internet] [Doctoral dissertation]. Temple University; 2016. [cited 2018 Dec 16]. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627.

Council of Science Editors:

Yang C. Sufficient Dimension Reduction in Complex Datasets. [Doctoral Dissertation]. Temple University; 2016. Available from: http://digital.library.temple.edu/u?/p245801coll10,404627


University of Minnesota

11. Chen, Xin. Sufficient dimension reduction and variable selection.

Degree: PhD, Statistics, 2010, University of Minnesota

 Sufficient dimension reduction (SDR) in regression was first introduced by Cook (2004). It reduces the dimension of the predictor space without loss of information and… (more)
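[One standard SDR estimator (not necessarily the one developed in this dissertation) is sliced inverse regression: slice on the response, average the whitened predictors within each slice, and take the leading eigenvector of the between-slice covariance. A minimal sketch on synthetic data with a single true direction:]

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 2000, 6
beta = np.zeros(p); beta[0] = 1.0               # true sufficient direction
X = rng.normal(size=(n, p))
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=n)  # y depends on X only via beta'X

# Sliced Inverse Regression (Li, 1991), minimal version
L = np.linalg.cholesky(np.cov(X.T))
Z = (X - X.mean(0)) @ np.linalg.inv(L).T        # whitened predictors
H = 10                                          # number of slices
slices = np.array_split(np.argsort(y), H)
M = sum(len(s) / n * np.outer(Z[s].mean(0), Z[s].mean(0)) for s in slices)
eigvals, eigvecs = np.linalg.eigh(M)
direction = eigvecs[:, -1]                      # estimated reduction direction
```

[Here `direction` recovers the span of `beta` (up to sign and the whitening transform) without ever modeling the cubic link function.]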

Subjects/Keywords: Central subspace; Dimension reduction; Regression; Sufficient dimension reduction; Variable selection; Statistics


APA (6th Edition):

Chen, X. (2010). Sufficient dimension reduction and variable selection. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/99484

Chicago Manual of Style (16th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Doctoral Dissertation, University of Minnesota. Accessed December 16, 2018. http://purl.umn.edu/99484.

MLA Handbook (7th Edition):

Chen, Xin. “Sufficient dimension reduction and variable selection.” 2010. Web. 16 Dec 2018.

Vancouver:

Chen X. Sufficient dimension reduction and variable selection. [Internet] [Doctoral dissertation]. University of Minnesota; 2010. [cited 2018 Dec 16]. Available from: http://purl.umn.edu/99484.

Council of Science Editors:

Chen X. Sufficient dimension reduction and variable selection. [Doctoral Dissertation]. University of Minnesota; 2010. Available from: http://purl.umn.edu/99484


Princeton University

12. BERTALAN, THOMAS. Dimension Reduction for Heterogeneous Populations of Oscillators.

Degree: PhD, 2018, Princeton University

 This dissertation discusses coarse-graining methods and applications for simulations of large heterogeneous populations of neurons. These simulations are structured as large coupled sets of ordinary… (more)

Subjects/Keywords: dimension reduction; heterogeneity; machine learning; polynomial chaos


APA (6th Edition):

BERTALAN, T. (2018). Dimension Reduction for Heterogeneous Populations of Oscillators. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp017m01bp385

Chicago Manual of Style (16th Edition):

BERTALAN, THOMAS. “Dimension Reduction for Heterogeneous Populations of Oscillators.” 2018. Doctoral Dissertation, Princeton University. Accessed December 16, 2018. http://arks.princeton.edu/ark:/88435/dsp017m01bp385.

MLA Handbook (7th Edition):

BERTALAN, THOMAS. “Dimension Reduction for Heterogeneous Populations of Oscillators.” 2018. Web. 16 Dec 2018.

Vancouver:

BERTALAN T. Dimension Reduction for Heterogeneous Populations of Oscillators. [Internet] [Doctoral dissertation]. Princeton University; 2018. [cited 2018 Dec 16]. Available from: http://arks.princeton.edu/ark:/88435/dsp017m01bp385.

Council of Science Editors:

BERTALAN T. Dimension Reduction for Heterogeneous Populations of Oscillators. [Doctoral Dissertation]. Princeton University; 2018. Available from: http://arks.princeton.edu/ark:/88435/dsp017m01bp385


University of Technology, Sydney

13. Bian, Wei. Supervised linear dimension reduction.

Degree: 2012, University of Technology, Sydney

 Supervised linear dimension reduction (SLDR) is one of the most effective methods for complexity reduction, which has been widely applied in pattern recognition, computer vision,… (more)

Subjects/Keywords: Pattern recognition.; Dimension reduction.; Statistics.; Mathematics.


APA (6th Edition):

Bian, W. (2012). Supervised linear dimension reduction. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/20422

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Thesis, University of Technology, Sydney. Accessed December 16, 2018. http://hdl.handle.net/10453/20422.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bian, Wei. “Supervised linear dimension reduction.” 2012. Web. 16 Dec 2018.

Vancouver:

Bian W. Supervised linear dimension reduction. [Internet] [Thesis]. University of Technology, Sydney; 2012. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10453/20422.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bian W. Supervised linear dimension reduction. [Thesis]. University of Technology, Sydney; 2012. Available from: http://hdl.handle.net/10453/20422

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Cornell University

14. Chen, Maximillian. Dimension Reduction And Inferential Procedures For Images.

Degree: 2014, Cornell University

 High-dimensional data analysis has been a prominent topic of statistical research in recent years due to the growing presence of high-dimensional electronic data. Much of… (more)

Subjects/Keywords: imaging data; dimension reduction; hypothesis testing


APA (6th Edition):

Chen, M. (2014). Dimension Reduction And Inferential Procedures For Images. (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/37105

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Thesis, Cornell University. Accessed December 16, 2018. http://hdl.handle.net/1813/37105.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chen, Maximillian. “Dimension Reduction And Inferential Procedures For Images.” 2014. Web. 16 Dec 2018.

Vancouver:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Internet] [Thesis]. Cornell University; 2014. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/1813/37105.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chen M. Dimension Reduction And Inferential Procedures For Images. [Thesis]. Cornell University; 2014. Available from: http://hdl.handle.net/1813/37105

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


McMaster University

15. Pathmanathan, Thinesh. Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.

Degree: MSc, 2018, McMaster University

Model-based clustering is a probabilistic approach that views each cluster as a component in an appropriate mixture model. The Gaussian mixture model is one of… (more)
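[As a point of reference for the model-based clustering described here: a minimal EM fit of a two-component one-dimensional Gaussian mixture on synthetic data. This thesis uses generalized hyperbolic mixtures, which this Gaussian sketch does not implement; all data and parameters below are illustrative.]

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])  # two clusters

# EM for a two-component 1-D Gaussian mixture
w = np.array([0.5, 0.5])                  # mixing weights
mu = np.array([-1.0, 1.0])                # deliberately poor initial means
sigma = np.array([1.0, 1.0])
for _ in range(100):
    # E-step: responsibility of each component for each point
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
labels = r.argmax(axis=1)                 # hard cluster assignment
```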

Subjects/Keywords: Model-based clustering; dimension reduction; statistical learning


APA (6th Edition):

Pathmanathan, T. (2018). Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/22758

Chicago Manual of Style (16th Edition):

Pathmanathan, Thinesh. “Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.” 2018. Masters Thesis, McMaster University. Accessed December 16, 2018. http://hdl.handle.net/11375/22758.

MLA Handbook (7th Edition):

Pathmanathan, Thinesh. “Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions.” 2018. Web. 16 Dec 2018.

Vancouver:

Pathmanathan T. Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. [Internet] [Masters thesis]. McMaster University; 2018. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/11375/22758.

Council of Science Editors:

Pathmanathan T. Dimension Reduction and Clustering of High Dimensional Data using a Mixture of Generalized Hyperbolic Distributions. [Masters Thesis]. McMaster University; 2018. Available from: http://hdl.handle.net/11375/22758


Uppsala University

16. Li, Qiongzhu. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.

Degree: Statistics, 2016, Uppsala University

  In this paper, we try to compare the performance of two feature dimension reduction methods, the LASSO and PCA. Both simulation study and empirical… (more)
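[For the LASSO half of that comparison: unlike PCA, the LASSO reduces dimension by selecting original features, because its soft-threshold step sets irrelevant coefficients exactly to zero. A sketch via proximal gradient descent (ISTA) on synthetic data; the penalty and problem sizes are illustrative, not taken from the paper.]

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 20
beta = np.zeros(p); beta[:3] = [3.0, -2.0, 1.5]   # only 3 informative features
X = rng.normal(size=(n, p))
y = X @ beta + 0.5 * rng.normal(size=n)

# LASSO via proximal gradient descent (ISTA)
lam = 20.0                                         # l1 penalty (illustrative)
step = 1.0 / np.linalg.norm(X, 2) ** 2             # 1 / Lipschitz constant
b = np.zeros(p)
for _ in range(2000):
    g = b - step * (X.T @ (X @ b - y))             # gradient step on 0.5||y - Xb||^2
    b = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-threshold
selected = np.flatnonzero(b)                       # indices kept by the LASSO
```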

Subjects/Keywords: Machine learning; Feature Dimension Reduction; NPL


APA (6th Edition):

Li, Q. (2016). Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. (Thesis). Uppsala University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Thesis, Uppsala University. Accessed December 16, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Qiongzhu. “Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans.” 2016. Web. 16 Dec 2018.

Vancouver:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Internet] [Thesis]. Uppsala University; 2016. [cited 2018 Dec 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li Q. Study of Single and Ensemble Machine Learning Models on Credit Data to Detect Underlying Non-performing Loans. [Thesis]. Uppsala University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297080

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

17. Santiago, Anna Theresa. In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment.

Degree: 2018, University of Toronto

The classic exploration of correlated multivariable psychological assessment data employs dimension reduction of the original p variables to a lower q-dimensional space through principal component… (more)
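[The classical p-to-q reduction referred to here is PCA via the singular value decomposition of the centered data. A minimal sketch, with synthetic factor-driven data standing in for correlated assessment items (sizes are illustrative):]

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, q = 500, 10, 2
# correlated variables driven by q latent factors, plus noise
F = rng.normal(size=(n, q))
X = F @ rng.normal(size=(q, p)) + 0.3 * rng.normal(size=(n, p))

# classical PCA: center, SVD, keep the top-q right singular vectors
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:q].T                     # n x q reduced representation
explained = (s[:q] ** 2).sum() / (s ** 2).sum()  # fraction of variance kept
```

[The robust alternatives compared in the thesis replace these variance-maximizing directions with projection-pursuit estimates that are less sensitive to outliers; that variant is not implemented here.]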

Subjects/Keywords: dimension reduction; PCA; projection pursuit; robust; 0308


APA (6th Edition):

Santiago, A. T. (2018). In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/89519

Chicago Manual of Style (16th Edition):

Santiago, Anna Theresa. “In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment.” 2018. Masters Thesis, University of Toronto. Accessed December 16, 2018. http://hdl.handle.net/1807/89519.

MLA Handbook (7th Edition):

Santiago, Anna Theresa. “In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment.” 2018. Web. 16 Dec 2018.

Vancouver:

Santiago AT. In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. [Internet] [Masters thesis]. University of Toronto; 2018. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/1807/89519.

Council of Science Editors:

Santiago AT. In Silico Comparative Evaluation of Classical and Robust Dimension Reduction for Psychological Assessment. [Masters Thesis]. University of Toronto; 2018. Available from: http://hdl.handle.net/1807/89519


University of Minnesota

18. Adragni, Kofi Placid. Dimension reduction and prediction in large p regressions.

Degree: PhD, Statistics, 2009, University of Minnesota

A high-dimensional regression setting is considered with p predictors X = (X1, ..., Xp)^T and a response Y. The interest is with large p, possibly much larger than… (more)

Subjects/Keywords: Dimension Reduction; Prediction; Principal Components; Principal Fitted Components; Regression; Sufficient Dimension Reduction; Statistics


APA (6th Edition):

Adragni, K. P. (2009). Dimension reduction and prediction in large p regressions. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/51904

Chicago Manual of Style (16th Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Doctoral Dissertation, University of Minnesota. Accessed December 16, 2018. http://purl.umn.edu/51904.

MLA Handbook (7th Edition):

Adragni, Kofi Placid. “Dimension reduction and prediction in large p regressions.” 2009. Web. 16 Dec 2018.

Vancouver:

Adragni KP. Dimension reduction and prediction in large p regressions. [Internet] [Doctoral dissertation]. University of Minnesota; 2009. [cited 2018 Dec 16]. Available from: http://purl.umn.edu/51904.

Council of Science Editors:

Adragni KP. Dimension reduction and prediction in large p regressions. [Doctoral Dissertation]. University of Minnesota; 2009. Available from: http://purl.umn.edu/51904

19. Hoyos-Idrobo, Andrés. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.

Degree: Docteur es, Informatique, 2017, Paris Saclay

In medical imaging, international collaborations have launched the acquisition of hundreds of terabytes of data, and in particular functional Magnetic Resonance Imaging (fMRI) data… (more)

Subjects/Keywords: IRMf; Clustering; Reduction de dimension; Décodage; FMRI; Clustering; Dimensionality reduction; Decoding


APA (6th Edition):

Hoyos-Idrobo, A. (2017). Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2017SACLS029

Chicago Manual of Style (16th Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Doctoral Dissertation, Paris Saclay. Accessed December 16, 2018. http://www.theses.fr/2017SACLS029.

MLA Handbook (7th Edition):

Hoyos-Idrobo, Andrés. “Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings.” 2017. Web. 16 Dec 2018.

Vancouver:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Internet] [Doctoral dissertation]. Paris Saclay; 2017. [cited 2018 Dec 16]. Available from: http://www.theses.fr/2017SACLS029.

Council of Science Editors:

Hoyos-Idrobo A. Ensembles des modeles en fMRI : l'apprentissage stable à grande échelle : Ensembles of models in fMRI : stable learning in large-scale settings. [Doctoral Dissertation]. Paris Saclay; 2017. Available from: http://www.theses.fr/2017SACLS029


Georgia Tech

20. Li, Qingbin. Online sufficient dimensionality reduction for sequential high-dimensional time-series.

Degree: MS, Industrial and Systems Engineering, 2015, Georgia Tech

In this thesis, we present the Online Sufficient Dimensionality Reduction (OSDR) algorithm for real-time high-dimensional sequential data analysis. Advisors/Committee Members: Xie, Yao (advisor), Song, Le (committee member), Zhou, Enlu (committee member).

Subjects/Keywords: Online learning; Dimension reduction; Sufficient dimensionality reduction; Stochastic gradient descent


APA (6th Edition):

Li, Q. (2015). Online sufficient dimensionality reduction for sequential high-dimensional time-series. (Masters Thesis). Georgia Tech. Retrieved from http://hdl.handle.net/1853/60385

Chicago Manual of Style (16th Edition):

Li, Qingbin. “Online sufficient dimensionality reduction for sequential high-dimensional time-series.” 2015. Masters Thesis, Georgia Tech. Accessed December 16, 2018. http://hdl.handle.net/1853/60385.

MLA Handbook (7th Edition):

Li, Qingbin. “Online sufficient dimensionality reduction for sequential high-dimensional time-series.” 2015. Web. 16 Dec 2018.

Vancouver:

Li Q. Online sufficient dimensionality reduction for sequential high-dimensional time-series. [Internet] [Masters thesis]. Georgia Tech; 2015. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/1853/60385.

Council of Science Editors:

Li Q. Online sufficient dimensionality reduction for sequential high-dimensional time-series. [Masters Thesis]. Georgia Tech; 2015. Available from: http://hdl.handle.net/1853/60385


Clemson University

21. Wilson, Matthew Robert. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.

Degree: MS, Computer Engineering, 2016, Clemson University

 Reducing the input dimensionality of large datasets for subsequent processing will allow the process to become less computationally complex and expensive. This thesis tests if… (more)

Subjects/Keywords: Dimension reduction; feature reduction; feature selection; input reduction; Karnin Sensitivity; Principal Component Analysis


APA (6th Edition):

Wilson, M. R. (2016). Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. (Masters Thesis). Clemson University. Retrieved from https://tigerprints.clemson.edu/all_theses/2357

Chicago Manual of Style (16th Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Masters Thesis, Clemson University. Accessed December 16, 2018. https://tigerprints.clemson.edu/all_theses/2357.

MLA Handbook (7th Edition):

Wilson, Matthew Robert. “Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality.” 2016. Web. 16 Dec 2018.

Vancouver:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Internet] [Masters thesis]. Clemson University; 2016. [cited 2018 Dec 16]. Available from: https://tigerprints.clemson.edu/all_theses/2357.

Council of Science Editors:

Wilson MR. Comparison of Karnin Sensitivity and Principal Component Analysis in Reducing Input Dimensionality. [Masters Thesis]. Clemson University; 2016. Available from: https://tigerprints.clemson.edu/all_theses/2357

22. Vu, Khac Ky. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.

Degree: Docteur es, Informatique, 2016, Paris Saclay

In the era of digitization, data have become cheap and easy to obtain. This gives rise to many new optimization problems with… (more)

Subjects/Keywords: Réduction de dimension; Approximation; Optimisation; Algorithmes randomisés; Dimension reduction; Approximation; Optimization; Randomized algorithms


APA (6th Edition):

Vu, K. K. (2016). Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. (Doctoral Dissertation). Paris Saclay. Retrieved from http://www.theses.fr/2016SACLX031

Chicago Manual of Style (16th Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Doctoral Dissertation, Paris Saclay. Accessed December 16, 2018. http://www.theses.fr/2016SACLX031.

MLA Handbook (7th Edition):

Vu, Khac Ky. “Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension.” 2016. Web. 16 Dec 2018.

Vancouver:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Internet] [Doctoral dissertation]. Paris Saclay; 2016. [cited 2018 Dec 16]. Available from: http://www.theses.fr/2016SACLX031.

Council of Science Editors:

Vu KK. Random projection for high-dimensional optimization : Projection aléatoire pour l'optimisation de grande dimension. [Doctoral Dissertation]. Paris Saclay; 2016. Available from: http://www.theses.fr/2016SACLX031

23. Lu, Weizhi. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.

Degree: Docteur es, Traitement du signal et de l'image, 2014, Rennes, INSA

This thesis studies and brings significant improvements to three widespread dimension reduction techniques: compressed sensing (or sparse sampling), random projection… (more)

Subjects/Keywords: Réduction de dimension; Dimension reduction; Compressed sensing; Random projection; Sparse representation; 621.382


APA (6th Edition):

Lu, W. (2014). Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. (Doctoral Dissertation). Rennes, INSA. Retrieved from http://www.theses.fr/2014ISAR0010

Chicago Manual of Style (16th Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Doctoral Dissertation, Rennes, INSA. Accessed December 16, 2018. http://www.theses.fr/2014ISAR0010.

MLA Handbook (7th Edition):

Lu, Weizhi. “Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet.” 2014. Web. 16 Dec 2018.

Vancouver:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Internet] [Doctoral dissertation]. Rennes, INSA; 2014. [cited 2018 Dec 16]. Available from: http://www.theses.fr/2014ISAR0010.

Council of Science Editors:

Lu W. Contribution to dimension reduction techniques : application to object tracking : Contribution aux techniques de la réduction de dimension : application au suivi d'objet. [Doctoral Dissertation]. Rennes, INSA; 2014. Available from: http://www.theses.fr/2014ISAR0010


University of Waterloo

24. Liu, Kai. Directional Control of Generating Brownian Path under Quasi Monte Carlo.

Degree: 2012, University of Waterloo

 Quasi-Monte Carlo (QMC) methods are playing an increasingly important role in computational finance. This is attributed to the increased complexity of the derivative securities and… (more)

Subjects/Keywords: QMC; Low Discrepancy Sequence; Effective Dimension; Dimension Reduction; PCA; BB; LT; OT; FOT; DC


APA (6th Edition):

Liu, K. (2012). Directional Control of Generating Brownian Path under Quasi Monte Carlo. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6984

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Thesis, University of Waterloo. Accessed December 16, 2018. http://hdl.handle.net/10012/6984.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Kai. “Directional Control of Generating Brownian Path under Quasi Monte Carlo.” 2012. Web. 16 Dec 2018.

Vancouver:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Internet] [Thesis]. University of Waterloo; 2012. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10012/6984.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu K. Directional Control of Generating Brownian Path under Quasi Monte Carlo. [Thesis]. University of Waterloo; 2012. Available from: http://hdl.handle.net/10012/6984

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

25. Chiancone, Alessandro. Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions.

Degree: Docteur es, Mathématiques Appliquées, 2016, Grenoble Alpes

This thesis proposes three extensions of Sliced Inverse Regression (SIR), namely Collaborative SIR, Student SIR and Knockoff SIR. One of the weaknesses… (more)

Subjects/Keywords: Régression linéaire par tranches; Reduction de dimension; Selection de variables; Sliced Inverse Regression; Dimension reduction; Variable selection; 510


APA (6th Edition):

Chiancone, A. (2016). Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions. (Doctoral Dissertation). Grenoble Alpes. Retrieved from http://www.theses.fr/2016GREAM051

Chicago Manual of Style (16th Edition):

Chiancone, Alessandro. “Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions.” 2016. Doctoral Dissertation, Grenoble Alpes. Accessed December 16, 2018. http://www.theses.fr/2016GREAM051.

MLA Handbook (7th Edition):

Chiancone, Alessandro. “Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions.” 2016. Web. 16 Dec 2018.

Vancouver:

Chiancone A. Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions. [Internet] [Doctoral dissertation]. Grenoble Alpes; 2016. [cited 2018 Dec 16]. Available from: http://www.theses.fr/2016GREAM051.

Council of Science Editors:

Chiancone A. Réduction de dimension via Sliced Inverse Regression : Idées et nouvelles propositions : Dimension reductio via Sliced Inverse Regression : ideas and extensions. [Doctoral Dissertation]. Grenoble Alpes; 2016. Available from: http://www.theses.fr/2016GREAM051


Texas State University – San Marcos

26. Reiss, Randolf H. Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression.

Degree: MS, Mathematics, 2013, Texas State University – San Marcos

A basic theory of eigenvalues and eigenvectors as a means to reduce the dimension of data is presented. Iterative methods for finding eigenvalues and eigenvectors… (more)

Subjects/Keywords: Eigenvector; Eigenvalue; Dimension Reduction; Power Method; Partial Least Squares; Eigenvalues; Eigenvectors; Data reduction


APA (6th Edition):

Reiss, R. H. (2013). Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. (Masters Thesis). Texas State University – San Marcos. Retrieved from https://digital.library.txstate.edu/handle/10877/4696

Chicago Manual of Style (16th Edition):

Reiss, Randolf H. “Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression.” 2013. Masters Thesis, Texas State University – San Marcos. Accessed December 16, 2018. https://digital.library.txstate.edu/handle/10877/4696.

MLA Handbook (7th Edition):

Reiss, Randolf H. “Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression.” 2013. Web. 16 Dec 2018.

Vancouver:

Reiss RH. Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. [Internet] [Masters thesis]. Texas State University – San Marcos; 2013. [cited 2018 Dec 16]. Available from: https://digital.library.txstate.edu/handle/10877/4696.

Council of Science Editors:

Reiss RH. Eigenvalues and Eigenvectors in Data Dimension Reduction for Regression. [Masters Thesis]. Texas State University – San Marcos; 2013. Available from: https://digital.library.txstate.edu/handle/10877/4696


Duke University

27. Wang, Ye. Bayesian Computation for High-Dimensional Continuous & Sparse Count Data .

Degree: 2018, Duke University

  Probabilistic modeling of multidimensional data is a common problem in practice. When the data is continuous, one common approach is to suppose that the… (more)

Subjects/Keywords: Statistics; dimension reduction; high dimensional; manifold learning; MCMC; scalable inference


APA (6th Edition):

Wang, Y. (2018). Bayesian Computation for High-Dimensional Continuous & Sparse Count Data . (Thesis). Duke University. Retrieved from http://hdl.handle.net/10161/16853

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Ye. “Bayesian Computation for High-Dimensional Continuous & Sparse Count Data .” 2018. Thesis, Duke University. Accessed December 16, 2018. http://hdl.handle.net/10161/16853.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Ye. “Bayesian Computation for High-Dimensional Continuous & Sparse Count Data .” 2018. Web. 16 Dec 2018.

Vancouver:

Wang Y. Bayesian Computation for High-Dimensional Continuous & Sparse Count Data . [Internet] [Thesis]. Duke University; 2018. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10161/16853.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang Y. Bayesian Computation for High-Dimensional Continuous & Sparse Count Data . [Thesis]. Duke University; 2018. Available from: http://hdl.handle.net/10161/16853

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Duke University

28. Mao, Kai. Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression .

Degree: 2009, Duke University

  We propose nonparametric Bayesian models for supervised dimension reduction and regression problems. Supervised dimension reduction is a setting where one needs to reduce the… (more)

Subjects/Keywords: Statistics; Dirichlet process; Kernel models; Nonparametric Bayesian; Supervised dimension reduction


APA (6th Edition):

Mao, K. (2009). Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression . (Thesis). Duke University. Retrieved from http://hdl.handle.net/10161/1581

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Mao, Kai. “Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression .” 2009. Thesis, Duke University. Accessed December 16, 2018. http://hdl.handle.net/10161/1581.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Mao, Kai. “Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression .” 2009. Web. 16 Dec 2018.

Vancouver:

Mao K. Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression . [Internet] [Thesis]. Duke University; 2009. [cited 2018 Dec 16]. Available from: http://hdl.handle.net/10161/1581.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Mao K. Nonparametric Bayesian Models for Supervised Dimension Reduction and Regression . [Thesis]. Duke University; 2009. Available from: http://hdl.handle.net/10161/1581

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Temple University

29. Zhang, Yongxu. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.

Degree: PhD, 2017, Temple University

Statistics

As a useful tool for multivariate analysis, sufficient dimension reduction (SDR) aims to reduce the predictor dimensionality while simultaneously keeping the full regression information,… (more)

Subjects/Keywords: Statistics;


APA (6th Edition):

Zhang, Y. (2017). ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,449833

Chicago Manual of Style (16th Edition):

Zhang, Yongxu. “ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.” 2017. Doctoral Dissertation, Temple University. Accessed December 16, 2018. http://digital.library.temple.edu/u?/p245801coll10,449833.

MLA Handbook (7th Edition):

Zhang, Yongxu. “ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS.” 2017. Web. 16 Dec 2018.

Vancouver:

Zhang Y. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. [Internet] [Doctoral dissertation]. Temple University; 2017. [cited 2018 Dec 16]. Available from: http://digital.library.temple.edu/u?/p245801coll10,449833.

Council of Science Editors:

Zhang Y. ON TWO NEW ESTIMATORS FOR THE CMS THROUGH EXTENSIONS OF OLS. [Doctoral Dissertation]. Temple University; 2017. Available from: http://digital.library.temple.edu/u?/p245801coll10,449833


University of Guelph

30. Morris, Katherine. Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions .

Degree: 2012, University of Guelph

 We introduce a dimension reduction method for model-based clustering obtained from a finite mixture of t-distributions. This approach is based on existing work on reducing… (more)

Subjects/Keywords: mclust; tEIGEN; model-based clustering; dimension reduction; multivariate t-mixtures


APA (6th Edition):

Morris, K. (2012). Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions . (Thesis). University of Guelph. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Morris, Katherine. “Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions .” 2012. Thesis, University of Guelph. Accessed December 16, 2018. https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Morris, Katherine. “Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions .” 2012. Web. 16 Dec 2018.

Vancouver:

Morris K. Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions . [Internet] [Thesis]. University of Guelph; 2012. [cited 2018 Dec 16]. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Morris K. Dimension Reduction for Model-based Clustering via Mixtures of Multivariate t-Distributions . [Thesis]. University of Guelph; 2012. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3863

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
