You searched for subject:(Sparse data). Showing records 1 – 30 of 132 total matches.

Penn State University

1. Chatterjee, Anirban. Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery.

Degree: PhD, Computer Science and Engineering, 2011, Penn State University

Data-driven discovery seeks to obtain a computational model of the underlying process using observed data on a large number of variables. Observations can be viewed… (more)

Subjects/Keywords: sparse graph embedding; sparse graph partitioning; data mining; sparse linear solvers


APA (6th Edition):

Chatterjee, A. (2011). Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery. (Doctoral Dissertation). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/12026

Chicago Manual of Style (16th Edition):

Chatterjee, Anirban. “Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery.” 2011. Doctoral Dissertation, Penn State University. Accessed January 18, 2020. https://etda.libraries.psu.edu/catalog/12026.

MLA Handbook (7th Edition):

Chatterjee, Anirban. “Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery.” 2011. Web. 18 Jan 2020.

Vancouver:

Chatterjee A. Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery. [Internet] [Doctoral dissertation]. Penn State University; 2011. [cited 2020 Jan 18]. Available from: https://etda.libraries.psu.edu/catalog/12026.

Council of Science Editors:

Chatterjee A. Exploiting Sparsity, Structure, and Geometry for Knowledge Discovery. [Doctoral Dissertation]. Penn State University; 2011. Available from: https://etda.libraries.psu.edu/catalog/12026


University of Hong Kong

2. Li, Mingfei. Sparse representation and fast processing of massive data.

Degree: M. Phil., 2012, University of Hong Kong

Many computational problems involve massive data. A reasonable solution to those problems should be able to store and process the data in an effective manner.… (more)

Subjects/Keywords: Data mining.; Sparse matrices.


APA (6th Edition):

Li, M. [李明飞]. (2012). Sparse representation and fast processing of massive data. (Masters Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480

Chicago Manual of Style (16th Edition):

Li, Mingfei. “Sparse representation and fast processing of massive data.” 2012. Masters Thesis, University of Hong Kong. Accessed January 18, 2020. http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480.

MLA Handbook (7th Edition):

Li, Mingfei. “Sparse representation and fast processing of massive data.” 2012. Web. 18 Jan 2020.

Vancouver:

Li M. Sparse representation and fast processing of massive data. [Internet] [Masters thesis]. University of Hong Kong; 2012. [cited 2020 Jan 18]. Available from: http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480.

Council of Science Editors:

Li M. Sparse representation and fast processing of massive data. [Masters Thesis]. University of Hong Kong; 2012. Available from: http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480


University of Southern California

3. Lin, Yenting. Transmission tomography for high contrast media based on sparse data.

Degree: PhD, Electrical Engineering, 2013, University of Southern California

Transmission tomography is a powerful tool to image the interior structure based on measured data on the boundary. It provides "non-destructive" imaging and widely… (more)

Subjects/Keywords: tomography; high contrast; sparse data


APA (6th Edition):

Lin, Y. (2013). Transmission tomography for high contrast media based on sparse data. (Doctoral Dissertation). University of Southern California. Retrieved from http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll3/id/315223/rec/7584

Chicago Manual of Style (16th Edition):

Lin, Yenting. “Transmission tomography for high contrast media based on sparse data.” 2013. Doctoral Dissertation, University of Southern California. Accessed January 18, 2020. http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll3/id/315223/rec/7584.

MLA Handbook (7th Edition):

Lin, Yenting. “Transmission tomography for high contrast media based on sparse data.” 2013. Web. 18 Jan 2020.

Vancouver:

Lin Y. Transmission tomography for high contrast media based on sparse data. [Internet] [Doctoral dissertation]. University of Southern California; 2013. [cited 2020 Jan 18]. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll3/id/315223/rec/7584.

Council of Science Editors:

Lin Y. Transmission tomography for high contrast media based on sparse data. [Doctoral Dissertation]. University of Southern California; 2013. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll3/id/315223/rec/7584


Texas A&M University

4. Ren, Shaogang. SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA.

Degree: PhD, Computer Engineering, 2017, Texas A&M University

 Emerging technologies and digital devices provide us with increasingly large volume of data with respect to both the sample size and the number of features.… (more)

Subjects/Keywords: Sparse Learning; LASSO; Structured Sparse; Scalability; Big Data


APA (6th Edition):

Ren, S. (2017). SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA. (Doctoral Dissertation). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/173033

Chicago Manual of Style (16th Edition):

Ren, Shaogang. “SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA.” 2017. Doctoral Dissertation, Texas A&M University. Accessed January 18, 2020. http://hdl.handle.net/1969.1/173033.

MLA Handbook (7th Edition):

Ren, Shaogang. “SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA.” 2017. Web. 18 Jan 2020.

Vancouver:

Ren S. SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA. [Internet] [Doctoral dissertation]. Texas A&M University; 2017. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/1969.1/173033.

Council of Science Editors:

Ren S. SCALABLE ALGORITHMS FOR HIGH DIMENSIONAL STRUCTURED DATA. [Doctoral Dissertation]. Texas A&M University; 2017. Available from: http://hdl.handle.net/1969.1/173033


University of Lethbridge

5. Hasan, Mahmudul. DSJM : a software toolkit for direct determination of sparse Jacobian matrices .

Degree: 2011, University of Lethbridge

 DSJM is a software toolkit written in portable C++ that enables direct determination of sparse Jacobian matrices whose sparsity pattern is a priori known. Using… (more)
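The abstract describes direct determination of a sparse Jacobian whose sparsity pattern is known in advance. A common ingredient of such methods is grouping structurally orthogonal columns (columns whose nonzero rows never overlap) so the whole Jacobian can be recovered from a few compressed products. The sketch below illustrates only that grouping idea with a generic greedy pass; it is an assumption-laden illustration, not DSJM's actual algorithm or API.

```python
# Illustrative sketch only (not the DSJM API): group structurally orthogonal
# columns of a known sparsity pattern, so a sparse Jacobian can be recovered
# from a handful of compressed products J @ s_g (one per group).
import numpy as np

def greedy_column_groups(pattern):
    """pattern: boolean (m, n) array, True where J[i, j] may be nonzero.
    Returns column groups whose row patterns do not overlap."""
    groups, covered_rows = [], []
    for j in range(pattern.shape[1]):
        rows = set(np.flatnonzero(pattern[:, j]))
        for grp, used in zip(groups, covered_rows):
            if not rows & used:          # structurally orthogonal with the whole group
                grp.append(j)
                used |= rows
                break
        else:                            # no compatible group: start a new one
            groups.append([j])
            covered_rows.append(rows)
    return groups

# Toy pattern: columns {0, 1} and {2, 3} can share a compressed evaluation.
P = np.array([[1, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [0, 1, 0, 1]], dtype=bool)
print(greedy_column_groups(P))           # -> [[0, 1], [2, 3]]
```

Each group corresponds to one compressed Jacobian evaluation, so fewer groups mean fewer function-derivative sweeps.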

Subjects/Keywords: Sparse matrices; Sparse matrices  – Computer programs; Jacobians  – Data processing; Dissertations, Academic


APA (6th Edition):

Hasan, M. (2011). DSJM : a software toolkit for direct determination of sparse Jacobian matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/3216

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Thesis, University of Lethbridge. Accessed January 18, 2020. http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Web. 18 Jan 2020.

Vancouver:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Internet] [Thesis]. University of Lethbridge; 2011. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Thesis]. University of Lethbridge; 2011. Available from: http://hdl.handle.net/10133/3216

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

6. Whitaker, Bradley M. Modifying sparse coding to model imbalanced datasets.

Degree: PhD, Electrical and Computer Engineering, 2018, Georgia Tech

 The objective of this research is to explore the use of sparse coding as a tool for unsupervised feature learning to more effectively model imbalanced… (more)
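For readers unfamiliar with the baseline, sparse coding represents each sample as a sparse combination of learned dictionary atoms. The snippet below is a minimal sketch using scikit-learn's stock dictionary learning on synthetic data; it does not reproduce the thesis's modifications for imbalanced datasets, and all sizes and parameters are illustrative assumptions.

```python
# Minimal sparse-coding sketch with scikit-learn's stock dictionary learning.
# Synthetic data and parameters are illustrative; the thesis's modified
# objective for imbalanced data is not reproduced here.
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                     # 200 samples, 16 features

dico = DictionaryLearning(n_components=32, alpha=1.0, max_iter=200, random_state=0)
codes = dico.fit_transform(X)                      # sparse codes, shape (200, 32)
print("avg nonzeros per code:", np.count_nonzero(codes, axis=1).mean())

# New samples are encoded against the learned dictionary atoms.
new_codes = sparse_encode(rng.normal(size=(5, 16)), dico.components_, alpha=1.0)
print(new_codes.shape)                             # (5, 32)
```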

Subjects/Keywords: Sparse coding; Imbalanced data; Machine learning


APA (6th Edition):

Whitaker, B. M. (2018). Modifying sparse coding to model imbalanced datasets. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/59919

Chicago Manual of Style (16th Edition):

Whitaker, Bradley M. “Modifying sparse coding to model imbalanced datasets.” 2018. Doctoral Dissertation, Georgia Tech. Accessed January 18, 2020. http://hdl.handle.net/1853/59919.

MLA Handbook (7th Edition):

Whitaker, Bradley M. “Modifying sparse coding to model imbalanced datasets.” 2018. Web. 18 Jan 2020.

Vancouver:

Whitaker BM. Modifying sparse coding to model imbalanced datasets. [Internet] [Doctoral dissertation]. Georgia Tech; 2018. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/1853/59919.

Council of Science Editors:

Whitaker BM. Modifying sparse coding to model imbalanced datasets. [Doctoral Dissertation]. Georgia Tech; 2018. Available from: http://hdl.handle.net/1853/59919


Stellenbosch University

7. Stulumani, Agrippa. Classification in high dimensional data using sparse techniques.

Degree: MCom, Statistics and Actuarial Science, 2019, Stellenbosch University

ENGLISH SUMMARY : Traditional classification techniques fail in the analysis of high-dimensional data. In response, new classification techniques and accompanying theory have recently emerged. These… (more)

Subjects/Keywords: High dimensional data; Mathematical statistics; Sparse classification; Sparse grids; Dimension reduction (Statistics); UCTD


APA (6th Edition):

Stulumani, A. (2019). Classification in high dimensional data using sparse techniques. (Thesis). Stellenbosch University. Retrieved from http://hdl.handle.net/10019.1/105792

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Stulumani, Agrippa. “Classification in high dimensional data using sparse techniques.” 2019. Thesis, Stellenbosch University. Accessed January 18, 2020. http://hdl.handle.net/10019.1/105792.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Stulumani, Agrippa. “Classification in high dimensional data using sparse techniques.” 2019. Web. 18 Jan 2020.

Vancouver:

Stulumani A. Classification in high dimensional data using sparse techniques. [Internet] [Thesis]. Stellenbosch University; 2019. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10019.1/105792.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Stulumani A. Classification in high dimensional data using sparse techniques. [Thesis]. Stellenbosch University; 2019. Available from: http://hdl.handle.net/10019.1/105792

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Lethbridge

8. University of Lethbridge. Faculty of Arts and Science. An improved implementation of sparsity detection of sparse derivative matrices .

Degree: 2018, University of Lethbridge

 Optimization is a crucial branch of research with applications in numerous domains. Determination of sparsity is a vital stream of optimization research with potential for… (more)

Subjects/Keywords: Jacobians; Combinatorial optimization; Sparse matrices  – Data processing; Graph coloring; Parallel programs (Computer programs); Matrix derivatives; sparse data structure; CPR algorithm; sparse derivative matrices; Jacobian matrix; multilevel algorithm; parallel implementation


APA (6th Edition):

Science, U. o. L. F. o. A. a. (2018). An improved implementation of sparsity detection of sparse derivative matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/5266

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Thesis, University of Lethbridge. Accessed January 18, 2020. http://hdl.handle.net/10133/5266.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Web. 18 Jan 2020.

Vancouver:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Internet] [Thesis]. University of Lethbridge; 2018. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10133/5266.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Thesis]. University of Lethbridge; 2018. Available from: http://hdl.handle.net/10133/5266

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Minnesota

9. Ebtehaj, Mohammad. Leveraging sparsity in variational data assimilation.

Degree: MS, Mathematics, 2013, University of Minnesota

 Nowadays data assimilation is an essential component of any effective environmental prediction system. Environmental prediction models are, indeed, initial value problems and their forecast skills… (more)

Subjects/Keywords: Data assimilation; Discrete cosine domain; Sparse regularization; Wavelet


APA (6th Edition):

Ebtehaj, M. (2013). Leveraging sparsity in variational data assimilation. (Masters Thesis). University of Minnesota. Retrieved from http://hdl.handle.net/11299/162311

Chicago Manual of Style (16th Edition):

Ebtehaj, Mohammad. “Leveraging sparsity in variational data assimilation.” 2013. Masters Thesis, University of Minnesota. Accessed January 18, 2020. http://hdl.handle.net/11299/162311.

MLA Handbook (7th Edition):

Ebtehaj, Mohammad. “Leveraging sparsity in variational data assimilation.” 2013. Web. 18 Jan 2020.

Vancouver:

Ebtehaj M. Leveraging sparsity in variational data assimilation. [Internet] [Masters thesis]. University of Minnesota; 2013. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/11299/162311.

Council of Science Editors:

Ebtehaj M. Leveraging sparsity in variational data assimilation. [Masters Thesis]. University of Minnesota; 2013. Available from: http://hdl.handle.net/11299/162311


University of Technology, Sydney

10. Zhou, Tianyi. Compressed learning.

Degree: 2013, University of Technology, Sydney

 There has been an explosion of data derived from the internet and other digital sources. These data are usually multi-dimensional, massive in volume, frequently incomplete,… (more)

Subjects/Keywords: Compressed learning.; Sparse learning.; Machine learning.; Manifold learning.; Big data


APA (6th Edition):

Zhou, T. (2013). Compressed learning. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/24180

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhou, Tianyi. “Compressed learning.” 2013. Thesis, University of Technology, Sydney. Accessed January 18, 2020. http://hdl.handle.net/10453/24180.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhou, Tianyi. “Compressed learning.” 2013. Web. 18 Jan 2020.

Vancouver:

Zhou T. Compressed learning. [Internet] [Thesis]. University of Technology, Sydney; 2013. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10453/24180.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhou T. Compressed learning. [Thesis]. University of Technology, Sydney; 2013. Available from: http://hdl.handle.net/10453/24180

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

11. Bonner, Ashley. Contributions to Sparse Statistical Methods for Data Integration.

Degree: PhD, 2018, McMaster University

Background: Scientists are measuring multiple sources of massive, complex, and diverse data in hopes to better understand the principles underpinning complex phenomena. Sophisticated statistical and… (more)

Subjects/Keywords: biostatistics; statistics; genetics; genomics; sparse methods; data integration


APA (6th Edition):

Bonner, A. (2018). Contributions to Sparse Statistical Methods for Data Integration. (Doctoral Dissertation). McMaster University. Retrieved from http://hdl.handle.net/11375/24009

Chicago Manual of Style (16th Edition):

Bonner, Ashley. “Contributions to Sparse Statistical Methods for Data Integration.” 2018. Doctoral Dissertation, McMaster University. Accessed January 18, 2020. http://hdl.handle.net/11375/24009.

MLA Handbook (7th Edition):

Bonner, Ashley. “Contributions to Sparse Statistical Methods for Data Integration.” 2018. Web. 18 Jan 2020.

Vancouver:

Bonner A. Contributions to Sparse Statistical Methods for Data Integration. [Internet] [Doctoral dissertation]. McMaster University; 2018. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/11375/24009.

Council of Science Editors:

Bonner A. Contributions to Sparse Statistical Methods for Data Integration. [Doctoral Dissertation]. McMaster University; 2018. Available from: http://hdl.handle.net/11375/24009


Tulane University

12. Gossmann, Alexej. Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data.

Degree: 2018, Tulane University

The technological advances of past decades have led to the accumulation of large amounts of genomic and neuroimaging data, enabling novel strategies in precision medicine.… (more)

Subjects/Keywords: Sparse models; False discovery rate control; Data reuse


APA (6th Edition):

Gossmann, A. (2018). Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data. (Thesis). Tulane University. Retrieved from https://digitallibrary.tulane.edu/islandora/object/tulane:80099

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gossmann, Alexej. “Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data.” 2018. Thesis, Tulane University. Accessed January 18, 2020. https://digitallibrary.tulane.edu/islandora/object/tulane:80099.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gossmann, Alexej. “Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data.” 2018. Web. 18 Jan 2020.

Vancouver:

Gossmann A. Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data. [Internet] [Thesis]. Tulane University; 2018. [cited 2020 Jan 18]. Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:80099.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gossmann A. Regaining control of false findings in feature selection, classification, and prediction on neuroimaging and genomics data. [Thesis]. Tulane University; 2018. Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:80099

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Guelph

13. Bak, Stephen. Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data .

Degree: 2017, University of Guelph

 With ever increasing advancements in microbiome sequencing technologies, the need for efficient statistical modelling of these systems has become apparent. Most microbiome data is filled… (more)
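As context for the penalties named in the title, the sketch below fits a plain L1-penalized (LASSO-style) logistic regression to synthetic count data with scikit-learn. Group LASSO and sparse group LASSO require specialized solvers and are not shown; the data, names, and parameter choices are invented for illustration, not taken from the thesis.

```python
# Sketch of LASSO-style (L1-penalized) logistic regression on synthetic
# "microbiome-like" count data. Group LASSO and sparse group LASSO need
# dedicated solvers and are not shown; all names and values are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_samples, n_taxa = 120, 300
X = rng.poisson(lam=2.0, size=(n_samples, n_taxa)).astype(float)   # taxa counts
true_idx = rng.choice(n_taxa, size=5, replace=False)               # truly associated taxa
logits = X[:, true_idx] @ rng.normal(size=5) - 3.0
y = (rng.random(n_samples) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
print("taxa kept by the L1 penalty:", np.flatnonzero(model.coef_[0]))
```

The L1 penalty drives most coefficients exactly to zero, so the surviving indices act as the selected bacteria.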

Subjects/Keywords: LASSO; regression; Microbiome; data; cancer; colon; regularization; multinomial; binomial; sparse; group


APA (6th Edition):

Bak, S. (2017). Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data . (Thesis). University of Guelph. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/12096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bak, Stephen. “Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data .” 2017. Thesis, University of Guelph. Accessed January 18, 2020. https://atrium.lib.uoguelph.ca/xmlui/handle/10214/12096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bak, Stephen. “Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data .” 2017. Web. 18 Jan 2020.

Vancouver:

Bak S. Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data . [Internet] [Thesis]. University of Guelph; 2017. [cited 2020 Jan 18]. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/12096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bak S. Generalized linear regression model with LASSO, group LASSO, and sparse group LASSO regularization methods for finding bacteria associated with colorectal cancer using microbiome data . [Thesis]. University of Guelph; 2017. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/12096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Arizona State University

14. Thulasiram, Ramesh L. Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease.

Degree: MS, Computer Science, 2011, Arizona State University

Sparse learning is a technique in machine learning for feature selection and dimensionality reduction, to find a sparse set of the most relevant features. In… (more)
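Stability selection, named in the title, is commonly implemented by refitting a sparse learner on many random subsamples and keeping the features selected in a large fraction of the fits. The following is a small illustrative loop with a Lasso base learner on synthetic data; it is not the package developed in the thesis, and the threshold and penalty are assumptions.

```python
# Illustrative stability-selection loop (not the thesis's package):
# refit the Lasso on random half-samples and keep features whose
# selection frequency exceeds a threshold.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:4] = [3.0, -2.0, 1.5, 2.5]                          # 4 relevant features
y = X @ beta + rng.normal(scale=0.5, size=n)

n_rounds, freq = 100, np.zeros(p)
for _ in range(n_rounds):
    idx = rng.choice(n, size=n // 2, replace=False)       # random half-sample
    freq += Lasso(alpha=0.2).fit(X[idx], y[idx]).coef_ != 0
stable = np.flatnonzero(freq / n_rounds >= 0.8)           # selection probability >= 0.8
print("stable features:", stable)                          # expected: [0 1 2 3]
```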

Subjects/Keywords: Computer Science; Statistics; Mathematics; Data Mining; Lasso; Machine Learning; Sparse Learning


APA (6th Edition):

Thulasiram, R. L. (2011). Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease. (Masters Thesis). Arizona State University. Retrieved from http://repository.asu.edu/items/9486

Chicago Manual of Style (16th Edition):

Thulasiram, Ramesh L. “Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease.” 2011. Masters Thesis, Arizona State University. Accessed January 18, 2020. http://repository.asu.edu/items/9486.

MLA Handbook (7th Edition):

Thulasiram, Ramesh L. “Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease.” 2011. Web. 18 Jan 2020.

Vancouver:

Thulasiram RL. Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease. [Internet] [Masters thesis]. Arizona State University; 2011. [cited 2020 Jan 18]. Available from: http://repository.asu.edu/items/9486.

Council of Science Editors:

Thulasiram RL. Sparse Learning Package with Stability Selection and Application to Alzheimer's Disease. [Masters Thesis]. Arizona State University; 2011. Available from: http://repository.asu.edu/items/9486


University of Texas – Austin

15. Kim, Youngchun. Signal acquisition challenges in mobile systems.

Degree: PhD, Electrical and Computer Engineering, 2018, University of Texas – Austin

 In recent decades, the advent of mobile computing has changed human lives by providing information that was not available in the past. The mobile computing… (more)

Subjects/Keywords: Sparse signal processing; Compressed sensing; Random sampling; Data converter; Sequential detection


APA (6th Edition):

Kim, Y. (2018). Signal acquisition challenges in mobile systems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/68089

Chicago Manual of Style (16th Edition):

Kim, Youngchun. “Signal acquisition challenges in mobile systems.” 2018. Doctoral Dissertation, University of Texas – Austin. Accessed January 18, 2020. http://hdl.handle.net/2152/68089.

MLA Handbook (7th Edition):

Kim, Youngchun. “Signal acquisition challenges in mobile systems.” 2018. Web. 18 Jan 2020.

Vancouver:

Kim Y. Signal acquisition challenges in mobile systems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2018. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/2152/68089.

Council of Science Editors:

Kim Y. Signal acquisition challenges in mobile systems. [Doctoral Dissertation]. University of Texas – Austin; 2018. Available from: http://hdl.handle.net/2152/68089


Princeton University

16. Chung, Neo Christopher Honghoon. Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data .

Degree: PhD, 2014, Princeton University

 Modern genomic technologies collect an ever-increasing amount of information (e.g., gene expression and genotypes) about model organisms and humans. Systematic patterns of variation in such… (more)

Subjects/Keywords: data; jackstraw; latent variable model; principal component analysis; resampling; sparse pca


APA (6th Edition):

Chung, N. C. H. (2014). Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data . (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01rv042w30x

Chicago Manual of Style (16th Edition):

Chung, Neo Christopher Honghoon. “Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data .” 2014. Doctoral Dissertation, Princeton University. Accessed January 18, 2020. http://arks.princeton.edu/ark:/88435/dsp01rv042w30x.

MLA Handbook (7th Edition):

Chung, Neo Christopher Honghoon. “Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data .” 2014. Web. 18 Jan 2020.

Vancouver:

Chung NCH. Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data . [Internet] [Doctoral dissertation]. Princeton University; 2014. [cited 2020 Jan 18]. Available from: http://arks.princeton.edu/ark:/88435/dsp01rv042w30x.

Council of Science Editors:

Chung NCH. Statistical Inference of Variables Driving Systematic Variation in High-Dimensional Biological Data . [Doctoral Dissertation]. Princeton University; 2014. Available from: http://arks.princeton.edu/ark:/88435/dsp01rv042w30x

17. Tran, Loc. High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach.

Degree: PhD, Electrical/Computer Engineering, 2014, Old Dominion University

Because of technological advances, data sets are increasing in both size and dimensionality. Processing these large-scale data sets is challenging for… (more)

Subjects/Keywords: Manifold learning; Sparse learning; Manifolds; Big data; Computer Engineering; Computer Sciences


APA (6th Edition):

Tran, L. (2014). High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach. (Doctoral Dissertation). Old Dominion University. Retrieved from 9781321316513 ; https://digitalcommons.odu.edu/ece_etds/186

Chicago Manual of Style (16th Edition):

Tran, Loc. “High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach.” 2014. Doctoral Dissertation, Old Dominion University. Accessed January 18, 2020. 9781321316513 ; https://digitalcommons.odu.edu/ece_etds/186.

MLA Handbook (7th Edition):

Tran, Loc. “High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach.” 2014. Web. 18 Jan 2020.

Vancouver:

Tran L. High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach. [Internet] [Doctoral dissertation]. Old Dominion University; 2014. [cited 2020 Jan 18]. Available from: 9781321316513 ; https://digitalcommons.odu.edu/ece_etds/186.

Council of Science Editors:

Tran L. High Dimensional Data Set Analysis Using a Large-Scale Manifold Learning Approach. [Doctoral Dissertation]. Old Dominion University; 2014. Available from: 9781321316513 ; https://digitalcommons.odu.edu/ece_etds/186


Halmstad University

18. Vogetseder, Georg. Functional Analysis of Real World Truck Fuel Consumption Data.

Degree: Computer and Electrical Engineering (IDE), 2008, Halmstad University

This thesis covers the analysis of sparse and irregular fuel consumption data from long-distance haulage articulated trucks. It is shown that this kind… (more)

Subjects/Keywords: PCA; Clustering; Sparse data


APA (6th Edition):

Vogetseder, G. (2008). Functional Analysis of Real World Truck Fuel Consumption Data. (Thesis). Halmstad University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1148

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Vogetseder, Georg. “Functional Analysis of Real World Truck Fuel Consumption Data.” 2008. Thesis, Halmstad University. Accessed January 18, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1148.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Vogetseder, Georg. “Functional Analysis of Real World Truck Fuel Consumption Data.” 2008. Web. 18 Jan 2020.

Vancouver:

Vogetseder G. Functional Analysis of Real World Truck Fuel Consumption Data. [Internet] [Thesis]. Halmstad University; 2008. [cited 2020 Jan 18]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1148.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Vogetseder G. Functional Analysis of Real World Truck Fuel Consumption Data. [Thesis]. Halmstad University; 2008. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1148

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia State University

19. Wu, Xiaolong. Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform.

Degree: MS, Computer Science, 2015, Georgia State University

Sparse Matrix-Matrix multiplication (SpMM) is a fundamental operation over irregular data, which is widely used in graph algorithms, such as finding minimum spanning trees… (more)
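For reference, the operation being optimized, sparse matrix-matrix multiplication, can be exercised on a CPU with SciPy's CSR format as below. This is only a baseline illustration on random matrices; the heterogeneous CPU-GPU pipelining studied in the thesis is not reproduced.

```python
# Baseline CPU-only SpMM with SciPy's CSR format; the heterogeneous CPU-GPU
# pipelined implementation studied in the thesis is not reproduced here.
from scipy import sparse

A = sparse.random(1000, 1000, density=0.01, format="csr", random_state=3)
B = sparse.random(1000, 1000, density=0.01, format="csr", random_state=4)

C = A @ B                                 # SpMM: the result is also sparse (CSR)
print(type(C).__name__, "with", C.nnz, "nonzeros")
```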

Subjects/Keywords: Sparse matrix-matrix multiplication; Data locality; Pipelining; GPU


APA (6th Edition):

Wu, X. (2015). Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform. (Thesis). Georgia State University. Retrieved from https://scholarworks.gsu.edu/cs_theses/84

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wu, Xiaolong. “Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform.” 2015. Thesis, Georgia State University. Accessed January 18, 2020. https://scholarworks.gsu.edu/cs_theses/84.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wu, Xiaolong. “Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform.” 2015. Web. 18 Jan 2020.

Vancouver:

Wu X. Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform. [Internet] [Thesis]. Georgia State University; 2015. [cited 2020 Jan 18]. Available from: https://scholarworks.gsu.edu/cs_theses/84.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wu X. Optimizing Sparse Matrix-Matrix Multiplication on a Heterogeneous CPU-GPU Platform. [Thesis]. Georgia State University; 2015. Available from: https://scholarworks.gsu.edu/cs_theses/84

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Iowa State University

20. Zhu, Weicheng. Topics in sparse functional data analysis.

Degree: 2018, Iowa State University

 This dissertation consists of three research papers that address different problems in modeling sparse functional data. The first paper (Chapter 2) focuses on the statistical… (more)

Subjects/Keywords: HMRI; image imputation; R; sparse functional data; STFIT; Statistics and Probability


APA (6th Edition):

Zhu, W. (2018). Topics in sparse functional data analysis. (Thesis). Iowa State University. Retrieved from https://lib.dr.iastate.edu/etd/17378

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhu, Weicheng. “Topics in sparse functional data analysis.” 2018. Thesis, Iowa State University. Accessed January 18, 2020. https://lib.dr.iastate.edu/etd/17378.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhu, Weicheng. “Topics in sparse functional data analysis.” 2018. Web. 18 Jan 2020.

Vancouver:

Zhu W. Topics in sparse functional data analysis. [Internet] [Thesis]. Iowa State University; 2018. [cited 2020 Jan 18]. Available from: https://lib.dr.iastate.edu/etd/17378.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhu W. Topics in sparse functional data analysis. [Thesis]. Iowa State University; 2018. Available from: https://lib.dr.iastate.edu/etd/17378

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

21. Gullipalli, Deep Kumar. Data envelopment analysis with sparse data.

Degree: MS, Department of Industrial & Manufacturing Systems Engineering, 2011, Kansas State University

 The quest for continuous improvement among organizations and the issue of missing data in data analysis are never-ending. This thesis brings these two topics under… (more)

Subjects/Keywords: Data envelopment analysis; Sparse data; Missing values; Healthcare; Clustering; Fuzzy Set Theory; Industrial Engineering (0546)


APA (6th Edition):

Gullipalli, D. K. (2011). Data envelopment analysis with sparse data. (Masters Thesis). Kansas State University. Retrieved from http://hdl.handle.net/2097/13092

Chicago Manual of Style (16th Edition):

Gullipalli, Deep Kumar. “Data envelopment analysis with sparse data.” 2011. Masters Thesis, Kansas State University. Accessed January 18, 2020. http://hdl.handle.net/2097/13092.

MLA Handbook (7th Edition):

Gullipalli, Deep Kumar. “Data envelopment analysis with sparse data.” 2011. Web. 18 Jan 2020.

Vancouver:

Gullipalli DK. Data envelopment analysis with sparse data. [Internet] [Masters thesis]. Kansas State University; 2011. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/2097/13092.

Council of Science Editors:

Gullipalli DK. Data envelopment analysis with sparse data. [Masters Thesis]. Kansas State University; 2011. Available from: http://hdl.handle.net/2097/13092


University of Lethbridge

22. University of Lethbridge. Faculty of Arts and Science. A Computational study of sparse or structured matrix operations .

Degree: 2018, University of Lethbridge

 Matrix computation is an important area in high-performance scientific computing. Major computer manufacturers and vendors typically provide architecture-aware implementation libraries such as Basic Linear… (more)

Subjects/Keywords: Sparse matrices  – Data processing; Java (Computer program language); Algebras, linear; High performance computing; Mathematical optimization  – Data processing; Numerical calculations  – Data processing; sparse data structure; CRS; Compressed Row Storage; JSA; Java Sparse Array; diagonal; BLAS; Basic Linear Algebra Subroutines
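The keywords name Compressed Row Storage (CRS) and the Java Sparse Array. A tiny illustration of the CRS triplet (values, column indices, row pointers) is sketched below, written in Python for consistency with the other sketches in this listing; the matrix is made up.

```python
# Compressed Row Storage (CRS/CSR) by hand for a toy 4x4 matrix (shown in
# Python for consistency with the other sketches in this listing).
import numpy as np
from scipy.sparse import csr_matrix

A = np.array([[10, 0, 0, 2],
              [ 3, 9, 0, 0],
              [ 0, 7, 8, 7],
              [ 0, 0, 0, 5]])

values, col_idx, row_ptr = [], [], [0]
for row in A:
    for j, v in enumerate(row):
        if v != 0:
            values.append(int(v))
            col_idx.append(j)
    row_ptr.append(len(values))    # row i lives in values[row_ptr[i]:row_ptr[i+1]]

print(values)     # [10, 2, 3, 9, 7, 8, 7, 5]
print(col_idx)    # [0, 3, 0, 1, 1, 2, 3, 3]
print(row_ptr)    # [0, 2, 4, 7, 8]

# Cross-check against SciPy's CSR implementation of the same layout.
S = csr_matrix(A)
assert list(S.data) == values and list(S.indices) == col_idx and list(S.indptr) == row_ptr
```

Storing only the nonzeros plus row pointers is what makes row slicing and matrix-vector products cheap for sparse matrices.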


APA (6th Edition):

Science, U. o. L. F. o. A. a. (2018). A Computational study of sparse or structured matrix operations . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/5268

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Science, University of Lethbridge. Faculty of Arts and. “A Computational study of sparse or structured matrix operations .” 2018. Thesis, University of Lethbridge. Accessed January 18, 2020. http://hdl.handle.net/10133/5268.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Science, University of Lethbridge. Faculty of Arts and. “A Computational study of sparse or structured matrix operations .” 2018. Web. 18 Jan 2020.

Vancouver:

Science UoLFoAa. A Computational study of sparse or structured matrix operations . [Internet] [Thesis]. University of Lethbridge; 2018. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10133/5268.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. A Computational study of sparse or structured matrix operations . [Thesis]. University of Lethbridge; 2018. Available from: http://hdl.handle.net/10133/5268

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of South Florida

23. Quintero, Michael C. Constructing a Clinical Research Data Management System.

Degree: 2017, University of South Florida

 Clinical study data is usually collected without knowing what kind of data is going to be collected in advance. In addition, all of the possible… (more)

Subjects/Keywords: Sparse Data Storage; Entity Attribute Value Data Model; Database Modeling; Wide Tables; Clinical Study Data; Computer Sciences; Databases and Information Systems
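The keywords contrast wide tables with the Entity-Attribute-Value (EAV) model for sparse clinical data. Below is a small, hypothetical sketch of an EAV layout using Python's built-in sqlite3 module; the table and attribute names are invented for illustration and are not taken from the thesis.

```python
# Hypothetical Entity-Attribute-Value (EAV) layout for sparse clinical data,
# using Python's built-in sqlite3. Table and attribute names are invented for
# illustration and are not taken from the thesis.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE observation (
    patient_id INTEGER,
    attribute  TEXT,   -- e.g. 'systolic_bp', 'hba1c'
    value      TEXT    -- stored as text, cast on read
)""")

# Only measurements that were actually taken are stored; there is no
# NULL-heavy wide row with one column per possible attribute.
rows = [(1, "systolic_bp", "128"), (1, "hba1c", "6.1"), (2, "weight_kg", "81.4")]
con.executemany("INSERT INTO observation VALUES (?, ?, ?)", rows)

# Pivot one patient's sparse record back into attribute -> value form.
rec = dict(con.execute(
    "SELECT attribute, value FROM observation WHERE patient_id = ?", (1,)))
print(rec)   # {'systolic_bp': '128', 'hba1c': '6.1'}
```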


APA (6th Edition):

Quintero, M. C. (2017). Constructing a Clinical Research Data Management System. (Thesis). University of South Florida. Retrieved from https://scholarcommons.usf.edu/etd/7081

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Quintero, Michael C. “Constructing a Clinical Research Data Management System.” 2017. Thesis, University of South Florida. Accessed January 18, 2020. https://scholarcommons.usf.edu/etd/7081.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Quintero, Michael C. “Constructing a Clinical Research Data Management System.” 2017. Web. 18 Jan 2020.

Vancouver:

Quintero MC. Constructing a Clinical Research Data Management System. [Internet] [Thesis]. University of South Florida; 2017. [cited 2020 Jan 18]. Available from: https://scholarcommons.usf.edu/etd/7081.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Quintero MC. Constructing a Clinical Research Data Management System. [Thesis]. University of South Florida; 2017. Available from: https://scholarcommons.usf.edu/etd/7081

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Minnesota

24. Yi, Feng. Selected topics of high-dimensional sparse modeling.

Degree: PhD, Statistics, 2013, University of Minnesota

 In this thesis we study three problems over high-dimensional sparse modeling. We first discuss the problem of high-dimensional covariance matrix estimation. Nowadays, massive high-dimensional data(more)

Subjects/Keywords: Covariance matrix; Factor analysis; High-dimensional data analysis; Non-parametric method; Sparse modeling


APA (6th Edition):

Yi, F. (2013). Selected topics of high-dimensional sparse modeling. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/161965

Chicago Manual of Style (16th Edition):

Yi, Feng. “Selected topics of high-dimensional sparse modeling.” 2013. Doctoral Dissertation, University of Minnesota. Accessed January 18, 2020. http://hdl.handle.net/11299/161965.

MLA Handbook (7th Edition):

Yi, Feng. “Selected topics of high-dimensional sparse modeling.” 2013. Web. 18 Jan 2020.

Vancouver:

Yi F. Selected topics of high-dimensional sparse modeling. [Internet] [Doctoral dissertation]. University of Minnesota; 2013. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/11299/161965.

Council of Science Editors:

Yi F. Selected topics of high-dimensional sparse modeling. [Doctoral Dissertation]. University of Minnesota; 2013. Available from: http://hdl.handle.net/11299/161965


Rice University

25. Yang, Yongchao. Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure.

Degree: PhD, Engineering, 2014, Rice University

 Civil structures are subjected to ambient loads, natural hazards, and man-made extreme events, which can cause deterioration, damage, and even catastrophic failure of structures. Dense… (more)

Subjects/Keywords: Structural health monitoring; system identification; damage detection; data-driven methods; sparse representation; blind source separation


APA (6th Edition):

Yang, Y. (2014). Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure. (Doctoral Dissertation). Rice University. Retrieved from http://hdl.handle.net/1911/87779

Chicago Manual of Style (16th Edition):

Yang, Yongchao. “Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure.” 2014. Doctoral Dissertation, Rice University. Accessed January 18, 2020. http://hdl.handle.net/1911/87779.

MLA Handbook (7th Edition):

Yang, Yongchao. “Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure.” 2014. Web. 18 Jan 2020.

Vancouver:

Yang Y. Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure. [Internet] [Doctoral dissertation]. Rice University; 2014. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/1911/87779.

Council of Science Editors:

Yang Y. Harnessing data structure for health monitoring and assessment of civil structures: sparse representation and low-rank structure. [Doctoral Dissertation]. Rice University; 2014. Available from: http://hdl.handle.net/1911/87779


University of Rochester

26. Song, Yanwei. Energy efficient data movement with sparse representation.

Degree: PhD, 2016, University of Rochester

 Energy efficiency is one of the most significant requirements in the study of computer systems, from mobile devices to large-scale data centers. Data movement is… (more)

Subjects/Keywords: Data movement; Energy efficient; Memory interface; On-chip interconnect; Opportunistic coding; Sparse representation


APA (6th Edition):

Song, Y. (2016). Energy efficient data movement with sparse representation. (Doctoral Dissertation). University of Rochester. Retrieved from http://hdl.handle.net/1802/30637

Chicago Manual of Style (16th Edition):

Song, Yanwei. “Energy efficient data movement with sparse representation.” 2016. Doctoral Dissertation, University of Rochester. Accessed January 18, 2020. http://hdl.handle.net/1802/30637.

MLA Handbook (7th Edition):

Song, Yanwei. “Energy efficient data movement with sparse representation.” 2016. Web. 18 Jan 2020.

Vancouver:

Song Y. Energy efficient data movement with sparse representation. [Internet] [Doctoral dissertation]. University of Rochester; 2016. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/1802/30637.

Council of Science Editors:

Song Y. Energy efficient data movement with sparse representation. [Doctoral Dissertation]. University of Rochester; 2016. Available from: http://hdl.handle.net/1802/30637


Case Western Reserve University

27. ChangHyun, Lee. PSG Data Compression And Decompression Based On Compressed Sensing.

Degree: MSs (Engineering), EECS - System and Control Engineering, 2011, Case Western Reserve University

 In this thesis, a compression and decompression scheme based on Compressive Sensing (CS) is developed for multichannel polysomnography (PSG) data. The thesis is composed of three… (more)
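As a generic illustration of the compression/decompression idea (not the thesis's PSG pipeline), the sketch below takes random Gaussian measurements of a synthetic sparse signal and recovers it with an L1-regularized least-squares (Lasso) solve, one standard convex relaxation used in compressed sensing; sizes and the penalty weight are assumptions.

```python
# Toy compressed-sensing round trip (not the thesis's PSG pipeline): take
# random Gaussian measurements of a synthetic sparse signal, then recover it
# with an L1-regularized least-squares (Lasso) solve.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, m, k = 256, 80, 8                        # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)    # k-sparse signal

Phi = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix
y = Phi @ x                                 # compressed representation (m << n)

x_hat = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000).fit(Phi, y).coef_
print("relative recovery error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```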

Subjects/Keywords: Electrical Engineering; Health Care; PSG; Sparse Representation; Compressed Sensing; Data Compression; Convex Optimization


APA (6th Edition):

ChangHyun, L. (2011). PSG Data Compression And Decompression Based On Compressed Sensing. (Masters Thesis). Case Western Reserve University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=case1310065394

Chicago Manual of Style (16th Edition):

ChangHyun, Lee. “PSG Data Compression And Decompression Based On Compressed Sensing.” 2011. Masters Thesis, Case Western Reserve University. Accessed January 18, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=case1310065394.

MLA Handbook (7th Edition):

ChangHyun, Lee. “PSG Data Compression And Decompression Based On Compressed Sensing.” 2011. Web. 18 Jan 2020.

Vancouver:

ChangHyun L. PSG Data Compression And Decompression Based On Compressed Sensing. [Internet] [Masters thesis]. Case Western Reserve University; 2011. [cited 2020 Jan 18]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=case1310065394.

Council of Science Editors:

ChangHyun L. PSG Data Compression And Decompression Based On Compressed Sensing. [Masters Thesis]. Case Western Reserve University; 2011. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=case1310065394

28. Purdy, David Gregory. Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles.

Degree: Statistics, 2012, University of California – Berkeley

 Significant recent advances in many areas of data collection and processing have introduced many challenges for modeling such data. Data sets have exploded in the… (more)

Subjects/Keywords: Statistics; Computer science; Machine Learning; Model Diagnostics; Recommendation Systems; Sparse Data; Statistics; Visualization


APA (6th Edition):

Purdy, D. G. (2012). Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/9qb472v2

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Purdy, David Gregory. “Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles.” 2012. Thesis, University of California – Berkeley. Accessed January 18, 2020. http://www.escholarship.org/uc/item/9qb472v2.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Purdy, David Gregory. “Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles.” 2012. Web. 18 Jan 2020.

Vancouver:

Purdy DG. Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles. [Internet] [Thesis]. University of California – Berkeley; 2012. [cited 2020 Jan 18]. Available from: http://www.escholarship.org/uc/item/9qb472v2.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Purdy DG. Sparse Models for Sparse Data: Methods, Limitations, Visualizations, and Ensembles. [Thesis]. University of California – Berkeley; 2012. Available from: http://www.escholarship.org/uc/item/9qb472v2

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Virginia Tech

29. Cain, Christopher Hawthorn. Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle.

Degree: PhD, Mechanical Engineering, 2014, Virginia Tech

 The research presented in this dissertation pertains to the development of a real time SLAM solution that can be performed by a low cost autonomous… (more)

Subjects/Keywords: Autonomous Vehicles; SLAM; Occupancy Grids; Haar Wavelet Transform; Compressed Sensing; Sparse Signal Reconstruction; Data Compression


APA (6th Edition):

Cain, C. H. (2014). Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle. (Doctoral Dissertation). Virginia Tech. Retrieved from http://hdl.handle.net/10919/47920

Chicago Manual of Style (16th Edition):

Cain, Christopher Hawthorn. “Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle.” 2014. Doctoral Dissertation, Virginia Tech. Accessed January 18, 2020. http://hdl.handle.net/10919/47920.

MLA Handbook (7th Edition):

Cain, Christopher Hawthorn. “Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle.” 2014. Web. 18 Jan 2020.

Vancouver:

Cain CH. Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle. [Internet] [Doctoral dissertation]. Virginia Tech; 2014. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/10919/47920.

Council of Science Editors:

Cain CH. Real Time SLAM Using Compressed Occupancy Grids For a Low Cost Autonomous Underwater Vehicle. [Doctoral Dissertation]. Virginia Tech; 2014. Available from: http://hdl.handle.net/10919/47920

30. Seetharaman, Indu. Consistent bi-level variable selection via composite group bridge penalized regression.

Degree: MS, Department of Statistics, 2013, Kansas State University

 We study the composite group bridge penalized regression methods for conducting bilevel variable selection in high dimensional linear regression models with a diverging number of… (more)

Subjects/Keywords: Bi-level variable selection; High-dimensional data; Oracle property; Penalized regression; Sparse models; Statistics (0463)


APA (6th Edition):

Seetharaman, I. (2013). Consistent bi-level variable selection via composite group bridge penalized regression. (Masters Thesis). Kansas State University. Retrieved from http://hdl.handle.net/2097/15980

Chicago Manual of Style (16th Edition):

Seetharaman, Indu. “Consistent bi-level variable selection via composite group bridge penalized regression.” 2013. Masters Thesis, Kansas State University. Accessed January 18, 2020. http://hdl.handle.net/2097/15980.

MLA Handbook (7th Edition):

Seetharaman, Indu. “Consistent bi-level variable selection via composite group bridge penalized regression.” 2013. Web. 18 Jan 2020.

Vancouver:

Seetharaman I. Consistent bi-level variable selection via composite group bridge penalized regression. [Internet] [Masters thesis]. Kansas State University; 2013. [cited 2020 Jan 18]. Available from: http://hdl.handle.net/2097/15980.

Council of Science Editors:

Seetharaman I. Consistent bi-level variable selection via composite group bridge penalized regression. [Masters Thesis]. Kansas State University; 2013. Available from: http://hdl.handle.net/2097/15980
