
You searched for `subject:(sparse matrices)`

Showing records 1 – 30 of 98 total matches.

Search Limiters

Dates

- 2017 – 2021 (25)
- 2012 – 2016 (39)
- 2007 – 2011 (23)

Degrees

- PhD (20)
- Docteur es (17)
- MS (11)


University of Lethbridge

1. Hasan, Mahmudul. DSJM : a software toolkit for direct determination of sparse Jacobian matrices.

Degree: 2011, University of Lethbridge

URL: http://hdl.handle.net/10133/3216

► DSJM is a software toolkit written in portable C++ that enables direct determination of sparse Jacobian matrices whose sparsity pattern is a priori known. Using…

Subjects/Keywords: Sparse matrices; Sparse matrices – Computer programs; Jacobians – Data processing; Dissertations, Academic
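Direct determination of a sparse Jacobian, as in toolkits such as DSJM, rests on grouping structurally orthogonal columns: columns whose nonzero rows never overlap can share a single finite-difference evaluation. The greedy sketch below illustrates only the grouping idea; the function name and strategy are assumptions for illustration, not DSJM's actual API.

```python
# Greedy partition of a sparsity pattern into structurally orthogonal
# column groups; columns in one group share no row, so a single
# matrix-vector probe recovers all of their nonzeros at once.
def group_columns(pattern):
    """pattern[j] is the set of row indices where column j is nonzero."""
    groups = []  # list of (columns, rows covered by those columns)
    for j, rows in enumerate(pattern):
        for cols, covered in groups:
            if not (rows & covered):  # no overlap: column j fits this group
                cols.add(j)
                covered |= rows
                break
        else:
            groups.append(({j}, set(rows)))
    return [cols for cols, _ in groups]

# A 5x5 tridiagonal pattern compresses from 5 columns to 3 groups.
tridiag = [{0, 1}, {0, 1, 2}, {1, 2, 3}, {2, 3, 4}, {3, 4}]
print(group_columns(tridiag))  # [{0, 3}, {1, 4}, {2}]
```

Three groups means the whole 5-column Jacobian can be estimated with three function evaluations instead of five; production tools use graph-coloring heuristics to keep the number of groups small.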


APA (6th Edition):

Hasan, M. (2011). DSJM : a software toolkit for direct determination of sparse Jacobian matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/3216

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Thesis, University of Lethbridge. Accessed April 12, 2021. http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Web. 12 Apr 2021.

Vancouver:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Internet] [Thesis]. University of Lethbridge; 2011. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Thesis]. University of Lethbridge; 2011. Available from: http://hdl.handle.net/10133/3216

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

2. Zhang, Weibin. Regularized and sparse models for low resource speech recognition.

Degree: 2013, Hong Kong University of Science and Technology

URL: http://repository.ust.hk/ir/Record/1783.1-62242 ; https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

► The performance of modern speech recognition systems depends heavily on the availability of sufficient training data. Although the recognition accuracy of a speech recognition system…

Subjects/Keywords: Automatic speech recognition ; Mathematical models ; Sparse matrices


APA (6th Edition):

Zhang, W. (2013). Regularized and sparse models for low resource speech recognition. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-62242 ; https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Weibin. “Regularized and sparse models for low resource speech recognition.” 2013. Thesis, Hong Kong University of Science and Technology. Accessed April 12, 2021. http://repository.ust.hk/ir/Record/1783.1-62242 ; https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Weibin. “Regularized and sparse models for low resource speech recognition.” 2013. Web. 12 Apr 2021.

Vancouver:

Zhang W. Regularized and sparse models for low resource speech recognition. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2013. [cited 2021 Apr 12]. Available from: http://repository.ust.hk/ir/Record/1783.1-62242 ; https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang W. Regularized and sparse models for low resource speech recognition. [Thesis]. Hong Kong University of Science and Technology; 2013. Available from: http://repository.ust.hk/ir/Record/1783.1-62242 ; https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

3. De Lara, Nathan. Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes.

Degree: Docteur es, Informatique, données, IA, 2020, Institut polytechnique de Paris

URL: http://www.theses.fr/2020IPPAT029

► Since Google's invention of PageRank for Web queries at the end of the 1990s, graph algorithms have been part of our everyday lives.…

Subjects/Keywords: Graphes; Apprentissage automatique; Matrices creuses; Graphs; Machine learning; Sparse matrices


APA (6th Edition):

De Lara, N. (2020). Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes. (Doctoral Dissertation). Institut polytechnique de Paris. Retrieved from http://www.theses.fr/2020IPPAT029

Chicago Manual of Style (16th Edition):

De Lara, Nathan. “Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes.” 2020. Doctoral Dissertation, Institut polytechnique de Paris. Accessed April 12, 2021. http://www.theses.fr/2020IPPAT029.

MLA Handbook (7th Edition):

De Lara, Nathan. “Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes.” 2020. Web. 12 Apr 2021.

Vancouver:

De Lara N. Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes. [Internet] [Doctoral dissertation]. Institut polytechnique de Paris; 2020. [cited 2021 Apr 12]. Available from: http://www.theses.fr/2020IPPAT029.

Council of Science Editors:

De Lara N. Algorithmic and software contributions to graph mining : Contributions algorithmiques et logicielle à l'apprentissage automatique sur les graphes. [Doctoral Dissertation]. Institut polytechnique de Paris; 2020. Available from: http://www.theses.fr/2020IPPAT029

Colorado State University

4. Dinkins, Stephanie. Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.

Degree: MS (M.S.), Computer Science, 2012, Colorado State University

URL: http://hdl.handle.net/10217/65303

► Sparse matrix vector multiply (SpMV) is an important computation that is used in many scientific and structural engineering applications. Sparse computations like SpMV require the…

Subjects/Keywords: data locality; Manhattan distance; performance model; sparse matrices; sparse matrix vector multiply; SpMV
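SpMV is memory-bound: each nonzero triggers an indirect load of the input vector, so bandwidth and locality (the quantities this thesis models) dominate its run time. Below is a minimal reference SpMV over the compressed sparse row (CSR) format, offered as a generic sketch rather than the thesis's own model or code.

```python
# y = A @ x with A in CSR form: vals/col_idx list nonzeros row by row,
# and row_ptr[i]:row_ptr[i+1] delimits row i's slice of those arrays.
def spmv_csr(vals, col_idx, row_ptr, x):
    n = len(row_ptr) - 1
    y = [0.0] * n
    for i in range(n):
        acc = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            acc += vals[k] * x[col_idx[k]]  # indirect, cache-unfriendly read of x
        y[i] = acc
    return y

# A = [[2, 0, 1],
#      [0, 3, 0],
#      [4, 0, 5]]
vals = [2.0, 1.0, 3.0, 4.0, 5.0]
col_idx = [0, 2, 1, 0, 2]
row_ptr = [0, 2, 3, 5]
print(spmv_csr(vals, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 9.0]
```

The access pattern into x is fixed entirely by col_idx, which is why nonzero ordering and data locality can change SpMV performance without changing the result.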


APA (6th Edition):

Dinkins, S. (2012). Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. (Masters Thesis). Colorado State University. Retrieved from http://hdl.handle.net/10217/65303

Chicago Manual of Style (16th Edition):

Dinkins, Stephanie. “Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.” 2012. Masters Thesis, Colorado State University. Accessed April 12, 2021. http://hdl.handle.net/10217/65303.

MLA Handbook (7th Edition):

Dinkins, Stephanie. “Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.” 2012. Web. 12 Apr 2021.

Vancouver:

Dinkins S. Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. [Internet] [Masters thesis]. Colorado State University; 2012. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10217/65303.

Council of Science Editors:

Dinkins S. Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. [Masters Thesis]. Colorado State University; 2012. Available from: http://hdl.handle.net/10217/65303

5. Ajamian, Tzila. Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry.

Degree: Docteur es, Signal, Image, Vision, 2019, Ecole centrale de Nantes

URL: http://www.theses.fr/2019ECDN0040

► Reflectometry, a technique used in wire diagnosis, enables the detection and localization of cable faults. While analog-to-digital converters are indispensable…

Subjects/Keywords: Diagnostic filaire; Acquisition comprimée; Matrices parcimonieuses; Wire diagnosis; Compressive Sampling; Sparse matrices


APA (6th Edition):

Ajamian, T. (2019). Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry. (Doctoral Dissertation). Ecole centrale de Nantes. Retrieved from http://www.theses.fr/2019ECDN0040

Chicago Manual of Style (16th Edition):

Ajamian, Tzila. “Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry.” 2019. Doctoral Dissertation, Ecole centrale de Nantes. Accessed April 12, 2021. http://www.theses.fr/2019ECDN0040.

MLA Handbook (7th Edition):

Ajamian, Tzila. “Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry.” 2019. Web. 12 Apr 2021.

Vancouver:

Ajamian T. Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry. [Internet] [Doctoral dissertation]. Ecole centrale de Nantes; 2019. [cited 2021 Apr 12]. Available from: http://www.theses.fr/2019ECDN0040.

Council of Science Editors:

Ajamian T. Exploration de l’Acquisition Comprimée appliquée à la Réflectométrie : Exploration of Compressive Sampling for Wire Diagnosis Systems Based on Reflectometry. [Doctoral Dissertation]. Ecole centrale de Nantes; 2019. Available from: http://www.theses.fr/2019ECDN0040

University of Alberta

6. Rivasplata, Omar D. Smallest singular value of sparse random matrices.

Degree: PhD, Department of Mathematical and Statistical Sciences, 2012, University of Alberta

URL: https://era.library.ualberta.ca/files/nc580m941

► In this thesis probability estimates on the smallest singular value of random matrices with independent entries are extended to a class of sparse random matrices.…

Subjects/Keywords: incompressible vectors; deviation inequalities; sparse matrices; random matrices; singular values; compressible vectors; invertibility of random matrices; sub-Gaussian random variables


APA (6th Edition):

Rivasplata, O. D. (2012). Smallest singular value of sparse random matrices. (Doctoral Dissertation). University of Alberta. Retrieved from https://era.library.ualberta.ca/files/nc580m941

Chicago Manual of Style (16th Edition):

Rivasplata, Omar D. “Smallest singular value of sparse random matrices.” 2012. Doctoral Dissertation, University of Alberta. Accessed April 12, 2021. https://era.library.ualberta.ca/files/nc580m941.

MLA Handbook (7th Edition):

Rivasplata, Omar D. “Smallest singular value of sparse random matrices.” 2012. Web. 12 Apr 2021.

Vancouver:

Rivasplata OD. Smallest singular value of sparse random matrices. [Internet] [Doctoral dissertation]. University of Alberta; 2012. [cited 2021 Apr 12]. Available from: https://era.library.ualberta.ca/files/nc580m941.

Council of Science Editors:

Rivasplata OD. Smallest singular value of sparse random matrices. [Doctoral Dissertation]. University of Alberta; 2012. Available from: https://era.library.ualberta.ca/files/nc580m941

University of Johannesburg

7. Chifamba, Saymore. A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel.

Degree: 2015, University of Johannesburg

URL: http://hdl.handle.net/10210/15086

► M.Phil. (Energy Studies)

Nodal diffusion methods are often used to calculate the distribution of neutrons in a nuclear reactor core. They require few-group homogenized neutron…

Subjects/Keywords: Nuclear power plants; Nuclear reactor kinetics; Sparse matrices; Uranium - Isotopes


APA (6th Edition):

Chifamba, S. (2015). A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/15086

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chifamba, Saymore. “A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel.” 2015. Thesis, University of Johannesburg. Accessed April 12, 2021. http://hdl.handle.net/10210/15086.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chifamba, Saymore. “A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel.” 2015. Web. 12 Apr 2021.

Vancouver:

Chifamba S. A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. [Internet] [Thesis]. University of Johannesburg; 2015. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10210/15086.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chifamba S. A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. [Thesis]. University of Johannesburg; 2015. Available from: http://hdl.handle.net/10210/15086

Not specified: Masters Thesis or Doctoral Dissertation

Harvard University

8. Huang, Jiaoyang. Spectral Statistics of Random d-Regular Graphs.

Degree: PhD, 2019, Harvard University

URL: http://nrs.harvard.edu/urn-3:HUL.InstRepos:42029716

► In this thesis we study the uniform random d-regular graphs on N vertices from a random matrix theory point of view. In the first part…

Subjects/Keywords: sparse random graphs; random matrices; eigenvalue statistics; eigenvector statistics.


APA (6th Edition):

Huang, J. (2019). Spectral Statistics of Random d-Regular Graphs. (Doctoral Dissertation). Harvard University. Retrieved from http://nrs.harvard.edu/urn-3:HUL.InstRepos:42029716

Chicago Manual of Style (16th Edition):

Huang, Jiaoyang. “Spectral Statistics of Random d-Regular Graphs.” 2019. Doctoral Dissertation, Harvard University. Accessed April 12, 2021. http://nrs.harvard.edu/urn-3:HUL.InstRepos:42029716.

MLA Handbook (7th Edition):

Huang, Jiaoyang. “Spectral Statistics of Random d-Regular Graphs.” 2019. Web. 12 Apr 2021.

Vancouver:

Huang J. Spectral Statistics of Random d-Regular Graphs. [Internet] [Doctoral dissertation]. Harvard University; 2019. [cited 2021 Apr 12]. Available from: http://nrs.harvard.edu/urn-3:HUL.InstRepos:42029716.

Council of Science Editors:

Huang J. Spectral Statistics of Random d-Regular Graphs. [Doctoral Dissertation]. Harvard University; 2019. Available from: http://nrs.harvard.edu/urn-3:HUL.InstRepos:42029716

University of Minnesota

9. Kalantzis, Vasileios. Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems.

Degree: PhD, Computer Science, 2018, University of Minnesota

URL: http://hdl.handle.net/11299/201170

► This dissertation focuses on the design, implementation, and evaluation of domain decomposition techniques for the solution of large and sparse algebraic symmetric generalized eigenvalue problems.…

Subjects/Keywords: Domain decomposition; Eigenvalues; Schur complement; Sparse matrices; Symmetric generalized eigenvalue problem


APA (6th Edition):

Kalantzis, V. (2018). Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/201170

Chicago Manual of Style (16th Edition):

Kalantzis, Vasileios. “Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems.” 2018. Doctoral Dissertation, University of Minnesota. Accessed April 12, 2021. http://hdl.handle.net/11299/201170.

MLA Handbook (7th Edition):

Kalantzis, Vasileios. “Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems.” 2018. Web. 12 Apr 2021.

Vancouver:

Kalantzis V. Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. [Internet] [Doctoral dissertation]. University of Minnesota; 2018. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/11299/201170.

Council of Science Editors:

Kalantzis V. Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. [Doctoral Dissertation]. University of Minnesota; 2018. Available from: http://hdl.handle.net/11299/201170

Texas Tech University

10. Iyer, Ashok. Numerical implementation of a continuation algorithm for the eigenvalue problem.

Degree: Electrical and Computer Engineering, 1980, Texas Tech University

URL: http://hdl.handle.net/2346/20354

Subjects/Keywords: Sparse matrices; Algorithms; Eigenvalues


APA (6th Edition):

Iyer, A. (1980). Numerical implementation of a continuation algorithm for the eigenvalue problem. (Thesis). Texas Tech University. Retrieved from http://hdl.handle.net/2346/20354

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Iyer, Ashok. “Numerical implementation of a continuation algorithm for the eigenvalue problem.” 1980. Thesis, Texas Tech University. Accessed April 12, 2021. http://hdl.handle.net/2346/20354.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Iyer, Ashok. “Numerical implementation of a continuation algorithm for the eigenvalue problem.” 1980. Web. 12 Apr 2021.

Vancouver:

Iyer A. Numerical implementation of a continuation algorithm for the eigenvalue problem. [Internet] [Thesis]. Texas Tech University; 1980. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/2346/20354.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Iyer A. Numerical implementation of a continuation algorithm for the eigenvalue problem. [Thesis]. Texas Tech University; 1980. Available from: http://hdl.handle.net/2346/20354

Not specified: Masters Thesis or Doctoral Dissertation

University of Hong Kong

11. 李明飞. Sparse representation and fast processing of massive data.

Degree: 2012, University of Hong Kong

URL: http://hdl.handle.net/10722/181480

► Many computational problems involve massive data. A reasonable solution to those problems should be able to store and process the data in an effective manner.…

Subjects/Keywords: Data mining.; Sparse matrices.


APA (6th Edition):

李明飞. (2012). Sparse representation and fast processing of massive data. (Thesis). University of Hong Kong. Retrieved from http://hdl.handle.net/10722/181480

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

李明飞. “Sparse representation and fast processing of massive data.” 2012. Thesis, University of Hong Kong. Accessed April 12, 2021. http://hdl.handle.net/10722/181480.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

李明飞. “Sparse representation and fast processing of massive data.” 2012. Web. 12 Apr 2021.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Vancouver:

李明飞. Sparse representation and fast processing of massive data. [Internet] [Thesis]. University of Hong Kong; 2012. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10722/181480.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

李明飞. Sparse representation and fast processing of massive data. [Thesis]. University of Hong Kong; 2012. Available from: http://hdl.handle.net/10722/181480

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

University of Hong Kong

12. 宁立. Influence on information networks and sparse representation of metric spaces: by Li Ning.

Degree: 2013, University of Hong Kong

URL: http://hdl.handle.net/10722/191190

► As the social networking applications become popular and have attracted more and more attention, it becomes possible (or easier) to mine information networks of a…

Subjects/Keywords: Information networks.; Sparse matrices.


APA (6th Edition):

宁立. (2013). Influence on information networks and sparse representation of metric spaces: by Li Ning. (Thesis). University of Hong Kong. Retrieved from http://hdl.handle.net/10722/191190

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

宁立. “Influence on information networks and sparse representation of metric spaces: by Li Ning.” 2013. Thesis, University of Hong Kong. Accessed April 12, 2021. http://hdl.handle.net/10722/191190.

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

宁立. “Influence on information networks and sparse representation of metric spaces: by Li Ning.” 2013. Web. 12 Apr 2021.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Vancouver:

宁立. Influence on information networks and sparse representation of metric spaces: by Li Ning. [Internet] [Thesis]. University of Hong Kong; 2013. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10722/191190.

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

宁立. Influence on information networks and sparse representation of metric spaces: by Li Ning. [Thesis]. University of Hong Kong; 2013. Available from: http://hdl.handle.net/10722/191190

Author name may be incomplete

Not specified: Masters Thesis or Doctoral Dissertation

University of Lethbridge

13. Zulkarnine, Ahmed Tahsin. Design structure and iterative release analysis of scientific software.

Degree: 2012, University of Lethbridge

URL: http://hdl.handle.net/10133/3256

► One of the main objectives of software development in scientific computing is efficiency. Being focused on highly specialized application domain, important software quality metrics, e.g.,…

Subjects/Keywords: Science – Computer programs; Computer software – Development; Computer software – Testing; Sparse matrices


APA (6th Edition):

Zulkarnine, A. T. (2012). Design structure and iterative release analysis of scientific software . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/3256

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zulkarnine, Ahmed Tahsin. “Design structure and iterative release analysis of scientific software .” 2012. Thesis, University of Lethbridge. Accessed April 12, 2021. http://hdl.handle.net/10133/3256.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zulkarnine, Ahmed Tahsin. “Design structure and iterative release analysis of scientific software .” 2012. Web. 12 Apr 2021.

Vancouver:

Zulkarnine AT. Design structure and iterative release analysis of scientific software . [Internet] [Thesis]. University of Lethbridge; 2012. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10133/3256.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zulkarnine AT. Design structure and iterative release analysis of scientific software . [Thesis]. University of Lethbridge; 2012. Available from: http://hdl.handle.net/10133/3256

Not specified: Masters Thesis or Doctoral Dissertation

University of Georgia

14. Liu, Yang. Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements.

Degree: 2014, University of Georgia

URL: http://hdl.handle.net/10724/26701

► The extremal singular values of random matrices in ℓ2-norm, including Gaussian random matrices, Bernoulli random matrices, subgaussian random matrices, etc., have attracted major research interest…

Subjects/Keywords: Optimization; Random Matrices; Sparse Recovery; Null Space Property


APA (6th Edition):

Liu, Y. (2014). Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements. (Thesis). University of Georgia. Retrieved from http://hdl.handle.net/10724/26701

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Yang. “Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements.” 2014. Thesis, University of Georgia. Accessed April 12, 2021. http://hdl.handle.net/10724/26701.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Yang. “Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements.” 2014. Web. 12 Apr 2021.

Vancouver:

Liu Y. Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements. [Internet] [Thesis]. University of Georgia; 2014. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10724/26701.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu Y. Non-convex optimization for linear system with pregaussian matrices and recovery from multiple measurements. [Thesis]. University of Georgia; 2014. Available from: http://hdl.handle.net/10724/26701

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

15. Qin, Bo CSE. On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization.

Degree: 2017, Hong Kong University of Science and Technology

URL: http://repository.ust.hk/ir/Record/1783.1-104956 ; https://doi.org/10.14711/thesis-991012530668903412 ; http://repository.ust.hk/ir/bitstream/1783.1-104956/1/th_redirect.html

► Most graph algorithms run faster, sometimes by orders of magnitude, on sparse graphs (graphs containing few edges). By approximating a dense input graph by a…

Subjects/Keywords: Electronic data processing ; Mathematical models ; Big data ; Data processing ; Sparse matrices


APA (6th Edition):

Qin, B. C. (2017). On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-104956 ; https://doi.org/10.14711/thesis-991012530668903412 ; http://repository.ust.hk/ir/bitstream/1783.1-104956/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Qin, Bo CSE. “On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization.” 2017. Thesis, Hong Kong University of Science and Technology. Accessed April 12, 2021. http://repository.ust.hk/ir/Record/1783.1-104956 ; https://doi.org/10.14711/thesis-991012530668903412 ; http://repository.ust.hk/ir/bitstream/1783.1-104956/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Qin, Bo CSE. “On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization.” 2017. Web. 12 Apr 2021.

Vancouver:

Qin BC. On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2017. [cited 2021 Apr 12]. Available from: http://repository.ust.hk/ir/Record/1783.1-104956 ; https://doi.org/10.14711/thesis-991012530668903412 ; http://repository.ust.hk/ir/bitstream/1783.1-104956/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Qin BC. On graph sparsifiers, graph sketches, fast linear solvers and network flow optimization. [Thesis]. Hong Kong University of Science and Technology; 2017. Available from: http://repository.ust.hk/ir/Record/1783.1-104956 ; https://doi.org/10.14711/thesis-991012530668903412 ; http://repository.ust.hk/ir/bitstream/1783.1-104956/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

University of Lethbridge

16. University of Lethbridge. Faculty of Arts and Science. An improved implementation of sparsity detection of sparse derivative matrices.

Degree: 2018, University of Lethbridge

URL: http://hdl.handle.net/10133/5266

► Optimization is a crucial branch of research with application in numerous domain. Determination of sparsity is a vital stream of optimization research with potentials for…
(more)
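
The keywords above mention the CPR algorithm (Curtis–Powell–Reid). As a hedged illustration of its core idea — columns of a sparse Jacobian whose nonzero row patterns do not overlap can share one finite-difference evaluation (one "color") — a greedy grouping might be sketched as follows. The function name and data layout are hypothetical, not taken from the thesis:

```python
def cpr_greedy_groups(pattern):
    """pattern[j] = set of row indices where column j is nonzero.

    Greedily place each column into the first existing group whose
    combined row set it does not overlap; otherwise open a new group.
    """
    groups = []  # each entry: (set of columns, union of their row sets)
    for j, rows in enumerate(pattern):
        for cols, used in groups:
            if not (rows & used):   # structurally orthogonal: reuse group
                cols.add(j)
                used |= rows        # in-place union keeps the tuple's set current
                break
        else:
            groups.append(({j}, set(rows)))
    return [cols for cols, _ in groups]

# Arrow-shaped Jacobian: column 0 touches every row, columns 1 and 2 do not overlap.
pattern = [{0, 1, 2}, {1}, {2}]
# cpr_greedy_groups(pattern) -> [{0}, {1, 2}]
```

Two function evaluations then suffice to recover all three columns of this example, instead of three.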

Subjects/Keywords: Jacobians; Combinatorial optimization; Sparse matrices – Data processing; Graph coloring; Parallel programs (Computer programs); Matrix derivatives; sparse data structure; CPR algorithm; sparse derivative matrices; Jacobian matrix; multilevel algorithm; parallel implementation


APA (6^{th} Edition):

Science, U. o. L. F. o. A. a. (2018). An improved implementation of sparsity detection of sparse derivative matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/5266

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Thesis, University of Lethbridge. Accessed April 12, 2021. http://hdl.handle.net/10133/5266.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Web. 12 Apr 2021.

Vancouver:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Internet] [Thesis]. University of Lethbridge; 2018. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10133/5266.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Thesis]. University of Lethbridge; 2018. Available from: http://hdl.handle.net/10133/5266

Not specified: Masters Thesis or Doctoral Dissertation

17.
Falco, Aurélien.
Bridging the Gap Between H-*Matrices* and *Sparse* Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-*Matrices* et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles.

Degree: Docteur es, Informatique, 2019, Bordeaux

URL: http://www.theses.fr/2019BORD0090

► Many physical phenomena can be studied through numerical modeling and simulation, which are common in scientific applications. To be computable on a…
(more)

Subjects/Keywords: Matrices creuses; H-Matrices; Compression de rang faible; Algèbre linéaire; Eléments finis; Couplage FEM/BEM; Sparse matrices; H-Matrices; Low-Rank compression; Linear algebra; Finite elements; FEM/BEM coupling


APA (6^{th} Edition):

Falco, A. (2019). Bridging the Gap Between H-Matrices and Sparse Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-Matrices et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles. (Doctoral Dissertation). Bordeaux. Retrieved from http://www.theses.fr/2019BORD0090

Chicago Manual of Style (16^{th} Edition):

Falco, Aurélien. “Bridging the Gap Between H-Matrices and Sparse Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-Matrices et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles.” 2019. Doctoral Dissertation, Bordeaux. Accessed April 12, 2021. http://www.theses.fr/2019BORD0090.

MLA Handbook (7^{th} Edition):

Falco, Aurélien. “Bridging the Gap Between H-Matrices and Sparse Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-Matrices et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles.” 2019. Web. 12 Apr 2021.

Vancouver:

Falco A. Bridging the Gap Between H-Matrices and Sparse Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-Matrices et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles. [Internet] [Doctoral dissertation]. Bordeaux; 2019. [cited 2021 Apr 12]. Available from: http://www.theses.fr/2019BORD0090.

Council of Science Editors:

Falco A. Bridging the Gap Between H-Matrices and Sparse Direct Methods for the Solution of Large Linear Systems : Combler l’écart entre H-Matrices et méthodes directes creuses pour la résolution de systèmes linéaires de grandes tailles. [Doctoral Dissertation]. Bordeaux; 2019. Available from: http://www.theses.fr/2019BORD0090

Anna University

18.
Arathi, P.
Bandwidth reduction of *sparse* symmetric *matrices*; -.

Degree: Science and Humanities, 2014, Anna University

URL: http://shodhganga.inflibnet.ac.in/handle/10603/24966

Subjects/Keywords: Science and humanities; Sparse Symmetric matrices
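
For context on the record above: the bandwidth of a symmetric matrix is the largest distance |i − j| of a nonzero entry from the diagonal, and reordering rows and columns to shrink it is what such theses study. A small illustrative computation (the helper name is ours, not the thesis's):

```python
def bandwidth(A):
    """Return max |i - j| over nonzero entries A[i][j] (0 for an all-zero matrix)."""
    return max((abs(i - j)
                for i, row in enumerate(A)
                for j, v in enumerate(row) if v != 0),
               default=0)

# A tridiagonal matrix already has the minimal bandwidth of 1.
A = [
    [4, 1, 0],
    [1, 4, 1],
    [0, 1, 4],
]
# bandwidth(A) == 1
```

Algorithms such as reverse Cuthill–McKee permute the matrix to drive this number down, which tightens the band that a banded solver must store and factor.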


APA (6^{th} Edition):

Arathi, P. (2014). Bandwidth reduction of sparse symmetric matrices; -. (Thesis). Anna University. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/24966

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Arathi, P. “Bandwidth reduction of sparse symmetric matrices; -.” 2014. Thesis, Anna University. Accessed April 12, 2021. http://shodhganga.inflibnet.ac.in/handle/10603/24966.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Arathi, P. “Bandwidth reduction of sparse symmetric matrices; -.” 2014. Web. 12 Apr 2021.

Vancouver:

Arathi P. Bandwidth reduction of sparse symmetric matrices; -. [Internet] [Thesis]. Anna University; 2014. [cited 2021 Apr 12]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/24966.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Arathi P. Bandwidth reduction of sparse symmetric matrices; -. [Thesis]. Anna University; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/24966

Not specified: Masters Thesis or Doctoral Dissertation

North Carolina State University

19.
Luniya, Sonali R.
SPICE Like *Sparse* Transient Analysis.

Degree: MS, Computer Engineering, 2003, North Carolina State University

URL: http://www.lib.ncsu.edu/resolver/1840.16/1209

► A state variable transient circuit analysis using *sparse* *matrices* is developed. The equations are formulated using time discretization based on Newton's iterative method of equations…
(more)

Subjects/Keywords: fREEDA; NeoCAD; Sparse Matrices; State Variables


APA (6^{th} Edition):

Luniya, S. R. (2003). SPICE Like Sparse Transient Analysis. (Thesis). North Carolina State University. Retrieved from http://www.lib.ncsu.edu/resolver/1840.16/1209

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Luniya, Sonali R. “SPICE Like Sparse Transient Analysis.” 2003. Thesis, North Carolina State University. Accessed April 12, 2021. http://www.lib.ncsu.edu/resolver/1840.16/1209.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Luniya, Sonali R. “SPICE Like Sparse Transient Analysis.” 2003. Web. 12 Apr 2021.

Vancouver:

Luniya SR. SPICE Like Sparse Transient Analysis. [Internet] [Thesis]. North Carolina State University; 2003. [cited 2021 Apr 12]. Available from: http://www.lib.ncsu.edu/resolver/1840.16/1209.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Luniya SR. SPICE Like Sparse Transient Analysis. [Thesis]. North Carolina State University; 2003. Available from: http://www.lib.ncsu.edu/resolver/1840.16/1209

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

20.
Wang, Yaqing CSE.
Learning convolutional *sparse* representations.

Degree: 2019, Hong Kong University of Science and Technology

URL: http://repository.ust.hk/ir/Record/1783.1-102367 ; https://doi.org/10.14711/thesis-991012757567703412 ; http://repository.ust.hk/ir/bitstream/1783.1-102367/1/th_redirect.html

► Learning *sparse* representations by *sparse* coding has been used in many applications for decades. Recently, convolutional *sparse* coding (CSC) improves *sparse* coding by learning a…
(more)

Subjects/Keywords: Sparse matrices ; Convolutions (Mathematics) ; Electronic data processing ; Mathematical models ; Knowledge representation (Information theory) ; Machine learning


APA (6^{th} Edition):

Wang, Y. C. (2019). Learning convolutional sparse representations. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-102367 ; https://doi.org/10.14711/thesis-991012757567703412 ; http://repository.ust.hk/ir/bitstream/1783.1-102367/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Wang, Yaqing CSE. “Learning convolutional sparse representations.” 2019. Thesis, Hong Kong University of Science and Technology. Accessed April 12, 2021. http://repository.ust.hk/ir/Record/1783.1-102367 ; https://doi.org/10.14711/thesis-991012757567703412 ; http://repository.ust.hk/ir/bitstream/1783.1-102367/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Wang, Yaqing CSE. “Learning convolutional sparse representations.” 2019. Web. 12 Apr 2021.

Vancouver:

Wang YC. Learning convolutional sparse representations. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2019. [cited 2021 Apr 12]. Available from: http://repository.ust.hk/ir/Record/1783.1-102367 ; https://doi.org/10.14711/thesis-991012757567703412 ; http://repository.ust.hk/ir/bitstream/1783.1-102367/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang YC. Learning convolutional sparse representations. [Thesis]. Hong Kong University of Science and Technology; 2019. Available from: http://repository.ust.hk/ir/Record/1783.1-102367 ; https://doi.org/10.14711/thesis-991012757567703412 ; http://repository.ust.hk/ir/bitstream/1783.1-102367/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

21. Sun, Wanting. Joint development of disparity tuning and vergence control.

Degree: 2011, Hong Kong University of Science and Technology

URL: http://repository.ust.hk/ir/Record/1783.1-7402 ; https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

► Behavior and sensory perception are mutually dependent. Sensory perception drives behavior, but behavior also influences the development of sensory perception, by altering the statistics of…
(more)

Subjects/Keywords: Binocular vision ; Computer vision ; Reinforcement learning ; Eye – Movements – Computer simulation ; Sparse matrices


APA (6^{th} Edition):

Sun, W. (2011). Joint development of disparity tuning and vergence control. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-7402 ; https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Sun, Wanting. “Joint development of disparity tuning and vergence control.” 2011. Thesis, Hong Kong University of Science and Technology. Accessed April 12, 2021. http://repository.ust.hk/ir/Record/1783.1-7402 ; https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Sun, Wanting. “Joint development of disparity tuning and vergence control.” 2011. Web. 12 Apr 2021.

Vancouver:

Sun W. Joint development of disparity tuning and vergence control. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2011. [cited 2021 Apr 12]. Available from: http://repository.ust.hk/ir/Record/1783.1-7402 ; https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sun W. Joint development of disparity tuning and vergence control. [Thesis]. Hong Kong University of Science and Technology; 2011. Available from: http://repository.ust.hk/ir/Record/1783.1-7402 ; https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

22. Benidis, Konstantinos ECE. High-dimensional sparsity methods in machine learning and finance.

Degree: 2018, Hong Kong University of Science and Technology

URL: http://repository.ust.hk/ir/Record/1783.1-92257 ; https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

► Sparsity has been successfully applied in almost all the fields of science and engineering, especially in high-dimensional applications, where a *sparse* representation can reduce the…
(more)

Subjects/Keywords: Signal processing ; Mathematical models ; Machine learning ; Portfolio management ; Sparse matrices ; Principal components analysis


APA (6^{th} Edition):

Benidis, K. E. (2018). High-dimensional sparsity methods in machine learning and finance. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-92257 ; https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Benidis, Konstantinos ECE. “High-dimensional sparsity methods in machine learning and finance.” 2018. Thesis, Hong Kong University of Science and Technology. Accessed April 12, 2021. http://repository.ust.hk/ir/Record/1783.1-92257 ; https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Benidis, Konstantinos ECE. “High-dimensional sparsity methods in machine learning and finance.” 2018. Web. 12 Apr 2021.

Vancouver:

Benidis KE. High-dimensional sparsity methods in machine learning and finance. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2018. [cited 2021 Apr 12]. Available from: http://repository.ust.hk/ir/Record/1783.1-92257 ; https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Benidis KE. High-dimensional sparsity methods in machine learning and finance. [Thesis]. Hong Kong University of Science and Technology; 2018. Available from: http://repository.ust.hk/ir/Record/1783.1-92257 ; https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Florida Atlantic University

23.
Hahn, William E.
*Sparse* Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections.

Degree: 2016, Florida Atlantic University

URL: http://purl.flvc.org/fau/fd/FA00004713

► Summary: For an 8-bit grayscale image patch of size n x n, the number of distinguishable signals is 256^(n^2). Natural images (e.g., photographs of a natural… (more)
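
The counting claim in that summary can be checked directly: each of the n·n pixels independently takes one of 256 values, so the patches multiply out to 256^(n^2). A one-line sketch (the helper name is ours):

```python
def num_patches(n: int) -> int:
    """Number of distinguishable 8-bit grayscale n x n patches: 256 ** (n * n)."""
    return 256 ** (n * n)

# A single pixel admits 256 signals; already at n = 2 there are 256**4,
# and the count grows doubly exponentially in n.
```

The summary's point is that natural images occupy a vanishingly small, structured subset of this astronomically large signal space, which is what sparse coding exploits.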

Subjects/Keywords: Artificial intelligence; Expert systems (Computer science); Image processing – Digital techniques – Mathematics; Sparse matrices


APA (6^{th} Edition):

Hahn, W. E. (2016). Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections. (Thesis). Florida Atlantic University. Retrieved from http://purl.flvc.org/fau/fd/FA00004713

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Hahn, William E. “Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections.” 2016. Thesis, Florida Atlantic University. Accessed April 12, 2021. http://purl.flvc.org/fau/fd/FA00004713.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Hahn, William E. “Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections.” 2016. Web. 12 Apr 2021.

Vancouver:

Hahn WE. Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections. [Internet] [Thesis]. Florida Atlantic University; 2016. [cited 2021 Apr 12]. Available from: http://purl.flvc.org/fau/fd/FA00004713.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hahn WE. Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections. [Thesis]. Florida Atlantic University; 2016. Available from: http://purl.flvc.org/fau/fd/FA00004713

Not specified: Masters Thesis or Doctoral Dissertation

University of Minnesota

24. Cherian, Anoop. Similarity search in visual data.

Degree: PhD, Computer science, 2013, University of Minnesota

URL: http://purl.umn.edu/144455

► Contemporary times have witnessed a significant increase in the amount of data available on the Internet. Organizing such big data so that it is easily…
(more)

Subjects/Keywords: Covariance matrices; Dictionary learning; Dirichlet process; Jensen-bregman logdet divergence; Nearest neighbors; Sparse coding


APA (6^{th} Edition):

Cherian, A. (2013). Similarity search in visual data. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/144455

Chicago Manual of Style (16^{th} Edition):

Cherian, Anoop. “Similarity search in visual data.” 2013. Doctoral Dissertation, University of Minnesota. Accessed April 12, 2021. http://purl.umn.edu/144455.

MLA Handbook (7^{th} Edition):

Cherian, Anoop. “Similarity search in visual data.” 2013. Web. 12 Apr 2021.

Vancouver:

Cherian A. Similarity search in visual data. [Internet] [Doctoral dissertation]. University of Minnesota; 2013. [cited 2021 Apr 12]. Available from: http://purl.umn.edu/144455.

Council of Science Editors:

Cherian A. Similarity search in visual data. [Doctoral Dissertation]. University of Minnesota; 2013. Available from: http://purl.umn.edu/144455

Columbia University

25. Kunz, Nicholas Hyeong. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.

Degree: 2019, Columbia University

URL: https://doi.org/10.7916/d8-anph-k173

► This study focused on submarket modeling with unsupervised learning and geographic information system fundamentals to better understand urbanism at the neighborhood scale. A Spatially Constrained…
(more)

Subjects/Keywords: City planning; Neighborhood planning; Real estate development – Planning; Cluster analysis; Sparse matrices


APA (6^{th} Edition):

Kunz, N. H. (2019). Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. (Masters Thesis). Columbia University. Retrieved from https://doi.org/10.7916/d8-anph-k173

Chicago Manual of Style (16^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Masters Thesis, Columbia University. Accessed April 12, 2021. https://doi.org/10.7916/d8-anph-k173.

MLA Handbook (7^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Web. 12 Apr 2021.

Vancouver:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Internet] [Masters thesis]. Columbia University; 2019. [cited 2021 Apr 12]. Available from: https://doi.org/10.7916/d8-anph-k173.

Council of Science Editors:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Masters Thesis]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-anph-k173

Columbia University

26. Kunz, Nicholas Hyeong. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.

Degree: 2019, Columbia University

URL: https://doi.org/10.7916/d8-vf26-dc73

► This study focused on submarket modeling with unsupervised learning and geographic information system fundamentals to better understand urbanism at the neighborhood scale. A Spatially Constrained…
(more)

Subjects/Keywords: City planning; Neighborhood planning; Real estate development – Planning; Cluster analysis; Sparse matrices


APA (6^{th} Edition):

Kunz, N. H. (2019). Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. (Masters Thesis). Columbia University. Retrieved from https://doi.org/10.7916/d8-vf26-dc73

Chicago Manual of Style (16^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Masters Thesis, Columbia University. Accessed April 12, 2021. https://doi.org/10.7916/d8-vf26-dc73.

MLA Handbook (7^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Web. 12 Apr 2021.

Vancouver:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Internet] [Masters thesis]. Columbia University; 2019. [cited 2021 Apr 12]. Available from: https://doi.org/10.7916/d8-vf26-dc73.

Council of Science Editors:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Masters Thesis]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-vf26-dc73

Columbia University

27. Kunz, Nicholas Hyeong. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.

Degree: 2019, Columbia University

URL: https://doi.org/10.7916/d8-b01b-hc66

► This study focused on submarket modeling with unsupervised learning and geographic information system fundamentals to better understand urbanism at the neighborhood scale. A Spatially Constrained…
(more)

Subjects/Keywords: City planning; Neighborhood planning; Real estate development – Planning; Cluster analysis; Sparse matrices


APA (6^{th} Edition):

Kunz, N. H. (2019). Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. (Masters Thesis). Columbia University. Retrieved from https://doi.org/10.7916/d8-b01b-hc66

Chicago Manual of Style (16^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Masters Thesis, Columbia University. Accessed April 12, 2021. https://doi.org/10.7916/d8-b01b-hc66.

MLA Handbook (7^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Web. 12 Apr 2021.

Vancouver:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Internet] [Masters thesis]. Columbia University; 2019. [cited 2021 Apr 12]. Available from: https://doi.org/10.7916/d8-b01b-hc66.

Council of Science Editors:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Masters Thesis]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-b01b-hc66

Columbia University

28. Kunz, Nicholas Hyeong. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.

Degree: 2019, Columbia University

URL: https://doi.org/10.7916/d8-rj87-yx32


APA (6^{th} Edition):

Kunz, N. H. (2019). Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. (Masters Thesis). Columbia University. Retrieved from https://doi.org/10.7916/d8-rj87-yx32

Chicago Manual of Style (16^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Masters Thesis, Columbia University. Accessed April 12, 2021. https://doi.org/10.7916/d8-rj87-yx32.

MLA Handbook (7^{th} Edition):

Kunz, Nicholas Hyeong. “Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change.” 2019. Web. 12 Apr 2021.

Vancouver:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Internet] [Masters thesis]. Columbia University; 2019. [cited 2021 Apr 12]. Available from: https://doi.org/10.7916/d8-rj87-yx32.

Council of Science Editors:

Kunz NH. Unsupervised Learning for Submarket Modeling: A Proxy for Neighborhood Change. [Masters Thesis]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-rj87-yx32

University of Lethbridge

29.
University of Lethbridge. Faculty of Arts and Science.
A computational study of *sparse* matrix storage schemes
.

Degree: 2008, University of Lethbridge

URL: http://hdl.handle.net/10133/777

► The efficiency of linear algebra operations for *sparse* *matrices* on modern high performance computing systems is often constrained by the available memory bandwidth. We are…
(more)
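
As background for the record above: CSR (compressed sparse row) is a typical storage scheme such a computational study would compare, because it stores only the nonzeros plus index arrays and so trades memory bandwidth for indirection. A minimal sketch of how it packs a dense matrix (illustrative only, not the thesis's code):

```python
# Pack a dense matrix into the three CSR arrays:
#   values    - the nonzero entries, row by row
#   col_index - the column of each stored value
#   row_ptr   - row i's values live in values[row_ptr[i]:row_ptr[i + 1]]
dense = [
    [5, 0, 0],
    [0, 8, 0],
    [0, 0, 3],
    [0, 6, 0],
]

values, col_index, row_ptr = [], [], [0]
for row in dense:
    for j, v in enumerate(row):
        if v != 0:
            values.append(v)
            col_index.append(j)
    row_ptr.append(len(values))

# values == [5, 8, 3, 6]; col_index == [0, 1, 2, 1]; row_ptr == [0, 1, 2, 3, 4]
```

Four nonzeros are stored instead of twelve entries; the row pointer array is what makes row-wise kernels such as sparse matrix–vector products stream through memory sequentially.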

Subjects/Keywords: Sparse matrices – Data processing; Dissertations, Academic


APA (6^{th} Edition):

Science, U. o. L. F. o. A. a. (2008). A computational study of sparse matrix storage schemes . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/777

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “A computational study of sparse matrix storage schemes .” 2008. Thesis, University of Lethbridge. Accessed April 12, 2021. http://hdl.handle.net/10133/777.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “A computational study of sparse matrix storage schemes .” 2008. Web. 12 Apr 2021.

Vancouver:

Science UoLFoAa. A computational study of sparse matrix storage schemes . [Internet] [Thesis]. University of Lethbridge; 2008. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10133/777.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. A computational study of sparse matrix storage schemes . [Thesis]. University of Lethbridge; 2008. Available from: http://hdl.handle.net/10133/777

Not specified: Masters Thesis or Doctoral Dissertation

University of Lethbridge

30.
University of Lethbridge. Faculty of Arts and Science.
On the determination of *sparse* Hessian *matrices* using multi-coloring
.

Degree: 2016, University of Lethbridge

URL: http://hdl.handle.net/10133/4782

► Efficient determination of large *sparse* Hessian *matrices* is central to solving many optimization problems. Exploiting sparsity and symmetry of the Hessian matrix can reduce the number…
(more)
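
For context: "multi-coloring" here refers to coloring the sparsity graph of the Hessian (vertices are columns, with an edge i–j when H[i][j] is nonzero) so that columns sharing a color can be estimated together. A hedged sketch using plain greedy vertex coloring — a simplification of the symmetry-exploiting colorings such theses develop; the helper below is hypothetical:

```python
def greedy_coloring(adj):
    """adj[v] = set of neighbors of vertex v; return {vertex: color}.

    Assign each vertex the smallest color unused by its already-colored
    neighbors, visiting vertices in index order.
    """
    colors = {}
    for v in range(len(adj)):
        taken = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in taken:
            c += 1
        colors[v] = c
    return colors

# Tridiagonal Hessian pattern on 4 columns: its sparsity graph is a path,
# so two colors suffice and only two gradient evaluations are needed.
path = [{1}, {0, 2}, {1, 3}, {2}]
# greedy_coloring(path) -> {0: 0, 1: 1, 2: 0, 3: 1}
```

The number of colors bounds the number of directional evaluations needed, so a good coloring directly cuts the cost of estimating the Hessian.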

Subjects/Keywords: direct determination method; multi-coloring; sparse matrix; symmetric Hessian matrices; symmetry-exploiting


APA (6^{th} Edition):

Science, U. o. L. F. o. A. a. (2016). On the determination of sparse Hessian matrices using multi-coloring . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/4782

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “On the determination of sparse Hessian matrices using multi-coloring .” 2016. Thesis, University of Lethbridge. Accessed April 12, 2021. http://hdl.handle.net/10133/4782.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Science, University of Lethbridge. Faculty of Arts and. “On the determination of sparse Hessian matrices using multi-coloring .” 2016. Web. 12 Apr 2021.

Vancouver:

Science UoLFoAa. On the determination of sparse Hessian matrices using multi-coloring . [Internet] [Thesis]. University of Lethbridge; 2016. [cited 2021 Apr 12]. Available from: http://hdl.handle.net/10133/4782.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. On the determination of sparse Hessian matrices using multi-coloring . [Thesis]. University of Lethbridge; 2016. Available from: http://hdl.handle.net/10133/4782

Not specified: Masters Thesis or Doctoral Dissertation