
You searched for `subject:(Sparse matrices)`

Showing records 1 – 30 of 81 total matches.

Search Limiters

Dates

- 2015 – 2019 (19)
- 2010 – 2014 (36)
- 2005 – 2009 (17)

Degrees

- PhD (17)
- Docteur ès (12)
- MS (12)

University of Hong Kong

1. Li, Mingfei. *Sparse* representation and fast processing of massive data.

Degree: M. Phil., 2012, University of Hong Kong

URL: http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480

Many computational problems involve massive data. A reasonable solution to those problems should be able to store and process the data in an effective manner… (more)

Subjects/Keywords: Data mining; Sparse matrices

APA (6th Edition):

Li, M. (2012). Sparse representation and fast processing of massive data. (Masters Thesis). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b4961797 ; http://hdl.handle.net/10722/181480

Chicago Manual of Style (16th Edition):

Li, Mingfei. “Sparse representation and fast processing of massive data.” 2012. Masters Thesis, University of Hong Kong. Accessed April 22, 2019. http://dx.doi.org/10.5353/th_b4961797.

MLA Handbook (7th Edition):

Li, Mingfei. “Sparse representation and fast processing of massive data.” 2012. Web. 22 Apr 2019.

Vancouver:

Li M. Sparse representation and fast processing of massive data. [Internet] [Masters thesis]. University of Hong Kong; 2012. [cited 2019 Apr 22]. Available from: http://dx.doi.org/10.5353/th_b4961797.

Council of Science Editors:

Li M. Sparse representation and fast processing of massive data. [Masters Thesis]. University of Hong Kong; 2012. Available from: http://dx.doi.org/10.5353/th_b4961797

University of Hong Kong

2. Ning, Li. Influence on information networks and *sparse* representation of metric spaces / by Li Ning.

Degree: PhD, 2013, University of Hong Kong

URL: http://dx.doi.org/10.5353/th_b5066215 ; http://hdl.handle.net/10722/191190

As social networking applications become popular and attract more and more attention, it becomes possible (or easier) to mine information networks of a… (more)

Subjects/Keywords: Information networks; Sparse matrices

APA (6th Edition):

Ning, L. (2013). Influence on information networks and sparse representation of metric spaces / by Li Ning. (Doctoral Dissertation). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b5066215 ; http://hdl.handle.net/10722/191190

Chicago Manual of Style (16th Edition):

Ning, Li. “Influence on information networks and sparse representation of metric spaces / by Li Ning.” 2013. Doctoral Dissertation, University of Hong Kong. Accessed April 22, 2019. http://dx.doi.org/10.5353/th_b5066215.

MLA Handbook (7th Edition):

Ning, Li. “Influence on information networks and sparse representation of metric spaces / by Li Ning.” 2013. Web. 22 Apr 2019.

Vancouver:

Ning L. Influence on information networks and sparse representation of metric spaces / by Li Ning. [Internet] [Doctoral dissertation]. University of Hong Kong; 2013. [cited 2019 Apr 22]. Available from: http://dx.doi.org/10.5353/th_b5066215.

Council of Science Editors:

Ning L. Influence on information networks and sparse representation of metric spaces / by Li Ning. [Doctoral Dissertation]. University of Hong Kong; 2013. Available from: http://dx.doi.org/10.5353/th_b5066215

University of Lethbridge

3. Hasan, Mahmudul. DSJM : a software toolkit for direct determination of *sparse* Jacobian *matrices*.

Degree: 2011, University of Lethbridge

URL: http://hdl.handle.net/10133/3216

DSJM is a software toolkit written in portable C++ that enables direct determination of *sparse* Jacobian *matrices* whose sparsity pattern is a priori known. Using… (more)

Subjects/Keywords: Sparse matrices; Sparse matrices – Computer programs; Jacobians – Data processing; Dissertations, Academic

APA (6th Edition):

Hasan, M. (2011). DSJM : a software toolkit for direct determination of sparse Jacobian matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/3216

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Thesis, University of Lethbridge. Accessed April 22, 2019. http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hasan, Mahmudul. “DSJM : a software toolkit for direct determination of sparse Jacobian matrices .” 2011. Web. 22 Apr 2019.

Vancouver:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Internet] [Thesis]. University of Lethbridge; 2011. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10133/3216.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hasan M. DSJM : a software toolkit for direct determination of sparse Jacobian matrices . [Thesis]. University of Lethbridge; 2011. Available from: http://hdl.handle.net/10133/3216

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

4. Zhang, Weibin. Regularized and *sparse* models for low resource speech recognition.

Degree: 2013, Hong Kong University of Science and Technology

URL: https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

The performance of modern speech recognition systems depends heavily on the availability of sufficient training data. Although the recognition accuracy of a speech recognition system… (more)

Subjects/Keywords: Automatic speech recognition; Mathematical models; Sparse matrices

APA (6th Edition):

Zhang, W. (2013). Regularized and sparse models for low resource speech recognition. (Thesis). Hong Kong University of Science and Technology. Retrieved from https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Weibin. “Regularized and sparse models for low resource speech recognition.” 2013. Thesis, Hong Kong University of Science and Technology. Accessed April 22, 2019. https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Weibin. “Regularized and sparse models for low resource speech recognition.” 2013. Web. 22 Apr 2019.

Vancouver:

Zhang W. Regularized and sparse models for low resource speech recognition. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2013. [cited 2019 Apr 22]. Available from: https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang W. Regularized and sparse models for low resource speech recognition. [Thesis]. Hong Kong University of Science and Technology; 2013. Available from: https://doi.org/10.14711/thesis-b1256202 ; http://repository.ust.hk/ir/bitstream/1783.1-62242/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

University of Texas – Austin

5. Schmitz, Phillip Gordon. Fast direct algorithms for elliptic equations via hierarchical matrix compression.

Degree: Mathematics, 2010, University of Texas – Austin

URL: http://hdl.handle.net/2152/ETD-UT-2010-08-1847

We present a fast direct algorithm for the solution of linear systems arising from elliptic equations. We extend the work of Xia et al. (2009)… (more)

Subjects/Keywords: Fast algorithms; Hierarchical matrices; Sparse; Direct; Elliptic

APA (6th Edition):

Schmitz, P. G. (2010). Fast direct algorithms for elliptic equations via hierarchical matrix compression. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2010-08-1847

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Schmitz, Phillip Gordon. “Fast direct algorithms for elliptic equations via hierarchical matrix compression.” 2010. Thesis, University of Texas – Austin. Accessed April 22, 2019. http://hdl.handle.net/2152/ETD-UT-2010-08-1847.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Schmitz, Phillip Gordon. “Fast direct algorithms for elliptic equations via hierarchical matrix compression.” 2010. Web. 22 Apr 2019.

Vancouver:

Schmitz PG. Fast direct algorithms for elliptic equations via hierarchical matrix compression. [Internet] [Thesis]. University of Texas – Austin; 2010. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/2152/ETD-UT-2010-08-1847.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Schmitz PG. Fast direct algorithms for elliptic equations via hierarchical matrix compression. [Thesis]. University of Texas – Austin; 2010. Available from: http://hdl.handle.net/2152/ETD-UT-2010-08-1847

Not specified: Masters Thesis or Doctoral Dissertation

Drexel University

6. Cunningham, Kevin. High-performance architectures for accelerating *sparse* LU computation.

Degree: 2011, Drexel University

URL: http://hdl.handle.net/1860/3718

*Sparse* Lower-Upper (LU) Triangular Decomposition is important to many different applications, including power system analysis. High-performance *sparse* linear algebra software packages, executing on general-purpose… (more)

Subjects/Keywords: Computer engineering; Computer architecture; Sparse matrices

APA (6th Edition):

Cunningham, K. (2011). High-performance architectures for accelerating sparse LU computation. (Thesis). Drexel University. Retrieved from http://hdl.handle.net/1860/3718

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cunningham, Kevin. “High-performance architectures for accelerating sparse LU computation.” 2011. Thesis, Drexel University. Accessed April 22, 2019. http://hdl.handle.net/1860/3718.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cunningham, Kevin. “High-performance architectures for accelerating sparse LU computation.” 2011. Web. 22 Apr 2019.

Vancouver:

Cunningham K. High-performance architectures for accelerating sparse LU computation. [Internet] [Thesis]. Drexel University; 2011. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/1860/3718.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cunningham K. High-performance architectures for accelerating sparse LU computation. [Thesis]. Drexel University; 2011. Available from: http://hdl.handle.net/1860/3718

Not specified: Masters Thesis or Doctoral Dissertation

University of Florida

7. Phillips, Adam. Exploration of *Sparse* Matrix Expansion.

Degree: 2012, University of Florida

URL: http://ufdc.ufl.edu/AA00057283

Currently, *sparse* matrix algorithms are tested and benchmarked using either existing *matrices* from real-world problems, or by generating entirely random *matrices*. Using existing *matrices* has… (more)

Subjects/Keywords: Algorithms; Grid refinement; Histograms; Mathematics; Matrices; Octahedrons; Power laws; Software; Tetrahedrons; Triangles; Sparse matrices

APA (6th Edition):

Phillips, A. (2012). Exploration of Sparse Matrix Expansion. (Thesis). University of Florida. Retrieved from http://ufdc.ufl.edu/AA00057283

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Phillips, Adam. “Exploration of Sparse Matrix Expansion.” 2012. Thesis, University of Florida. Accessed April 22, 2019. http://ufdc.ufl.edu/AA00057283.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Phillips, Adam. “Exploration of Sparse Matrix Expansion.” 2012. Web. 22 Apr 2019.

Vancouver:

Phillips A. Exploration of Sparse Matrix Expansion. [Internet] [Thesis]. University of Florida; 2012. [cited 2019 Apr 22]. Available from: http://ufdc.ufl.edu/AA00057283.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Phillips A. Exploration of Sparse Matrix Expansion. [Thesis]. University of Florida; 2012. Available from: http://ufdc.ufl.edu/AA00057283

Not specified: Masters Thesis or Doctoral Dissertation

University of Alberta

8. Rivasplata, Omar D. Smallest singular value of *sparse* random *matrices*.

Degree: PhD, Department of Mathematical and Statistical Sciences, 2012, University of Alberta

URL: https://era.library.ualberta.ca/files/nc580m941

In this thesis probability estimates on the smallest singular value of random *matrices* with independent entries are extended to a class of *sparse* random *matrices*… (more)

Subjects/Keywords: incompressible vectors; deviation inequalities; sparse matrices; random matrices; singular values; compressible vectors; invertibility of random matrices; sub-Gaussian random variables

APA (6th Edition):

Rivasplata, O. D. (2012). Smallest singular value of sparse random matrices. (Doctoral Dissertation). University of Alberta. Retrieved from https://era.library.ualberta.ca/files/nc580m941

Chicago Manual of Style (16th Edition):

Rivasplata, Omar D. “Smallest singular value of sparse random matrices.” 2012. Doctoral Dissertation, University of Alberta. Accessed April 22, 2019. https://era.library.ualberta.ca/files/nc580m941.

MLA Handbook (7th Edition):

Rivasplata, Omar D. “Smallest singular value of sparse random matrices.” 2012. Web. 22 Apr 2019.

Vancouver:

Rivasplata OD. Smallest singular value of sparse random matrices. [Internet] [Doctoral dissertation]. University of Alberta; 2012. [cited 2019 Apr 22]. Available from: https://era.library.ualberta.ca/files/nc580m941.

Council of Science Editors:

Rivasplata OD. Smallest singular value of sparse random matrices. [Doctoral Dissertation]. University of Alberta; 2012. Available from: https://era.library.ualberta.ca/files/nc580m941

Penn State University

9. Toth, Brice Alan. Cost Effective Machine Learning Approaches for Linear Solver Selection.

Degree: MS, Computer Science and Engineering, 2009, Penn State University

URL: https://etda.libraries.psu.edu/catalog/9559

Numerical simulations are important in many areas of science and engineering. These simulations often involve the solution of large, *sparse* systems of linear equations. The… (more)

Subjects/Keywords: machine learning; linear solvers; sparse matrices; cost reduction

APA (6th Edition):

Toth, B. A. (2009). Cost Effective Machine Learning Approaches for Linear Solver Selection. (Masters Thesis). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/9559

Chicago Manual of Style (16th Edition):

Toth, Brice Alan. “Cost Effective Machine Learning Approaches for Linear Solver Selection.” 2009. Masters Thesis, Penn State University. Accessed April 22, 2019. https://etda.libraries.psu.edu/catalog/9559.

MLA Handbook (7th Edition):

Toth, Brice Alan. “Cost Effective Machine Learning Approaches for Linear Solver Selection.” 2009. Web. 22 Apr 2019.

Vancouver:

Toth BA. Cost Effective Machine Learning Approaches for Linear Solver Selection. [Internet] [Masters thesis]. Penn State University; 2009. [cited 2019 Apr 22]. Available from: https://etda.libraries.psu.edu/catalog/9559.

Council of Science Editors:

Toth BA. Cost Effective Machine Learning Approaches for Linear Solver Selection. [Masters Thesis]. Penn State University; 2009. Available from: https://etda.libraries.psu.edu/catalog/9559

University of Lethbridge

10. Zulkarnine, Ahmed Tahsin. Design structure and iterative release analysis of scientific software.

Degree: 2012, University of Lethbridge

URL: http://hdl.handle.net/10133/3256

One of the main objectives of software development in scientific computing is efficiency. Being focused on a highly specialized application domain, important software quality metrics, e.g.,… (more)

Subjects/Keywords: Science – Computer programs; Computer software – Development; Computer software – Testing; Sparse matrices

APA (6th Edition):

Zulkarnine, A. T. (2012). Design structure and iterative release analysis of scientific software . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/3256

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zulkarnine, Ahmed Tahsin. “Design structure and iterative release analysis of scientific software .” 2012. Thesis, University of Lethbridge. Accessed April 22, 2019. http://hdl.handle.net/10133/3256.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zulkarnine, Ahmed Tahsin. “Design structure and iterative release analysis of scientific software .” 2012. Web. 22 Apr 2019.

Vancouver:

Zulkarnine AT. Design structure and iterative release analysis of scientific software . [Internet] [Thesis]. University of Lethbridge; 2012. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10133/3256.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zulkarnine AT. Design structure and iterative release analysis of scientific software . [Thesis]. University of Lethbridge; 2012. Available from: http://hdl.handle.net/10133/3256

Not specified: Masters Thesis or Doctoral Dissertation

University of Minnesota

11. Kalantzis, Vasileios. Domain decomposition algorithms for the solution of *sparse* symmetric generalized eigenvalue problems.

Degree: PhD, Computer Science, 2018, University of Minnesota

URL: http://hdl.handle.net/11299/201170

This dissertation focuses on the design, implementation, and evaluation of domain decomposition techniques for the solution of large and *sparse* algebraic symmetric generalized eigenvalue problems… (more)

Subjects/Keywords: Domain decomposition; Eigenvalues; Schur complement; Sparse matrices; Symmetric generalized eigenvalue problem

APA (6th Edition):

Kalantzis, V. (2018). Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/201170

Chicago Manual of Style (16th Edition):

Kalantzis, Vasileios. “Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems.” 2018. Doctoral Dissertation, University of Minnesota. Accessed April 22, 2019. http://hdl.handle.net/11299/201170.

MLA Handbook (7th Edition):

Kalantzis, Vasileios. “Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems.” 2018. Web. 22 Apr 2019.

Vancouver:

Kalantzis V. Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. [Internet] [Doctoral dissertation]. University of Minnesota; 2018. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/11299/201170.

Council of Science Editors:

Kalantzis V. Domain decomposition algorithms for the solution of sparse symmetric generalized eigenvalue problems. [Doctoral Dissertation]. University of Minnesota; 2018. Available from: http://hdl.handle.net/11299/201170

University of Johannesburg

12. Chifamba, Saymore. A study of the performance of a *sparse* grid cross section representation methodology as applied to MOX fuel.

Degree: 2015, University of Johannesburg

URL: http://hdl.handle.net/10210/15086

M.Phil. (Energy Studies)

Nodal diffusion methods are often used to calculate the distribution of neutrons in a nuclear reactor core. They require few-group homogenized neutron… (more)

Subjects/Keywords: Nuclear power plants; Nuclear reactor kinetics; Sparse matrices; Uranium – Isotopes

APA (6th Edition):

Chifamba, S. (2015). A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/15086

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chifamba, Saymore. “A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel.” 2015. Thesis, University of Johannesburg. Accessed April 22, 2019. http://hdl.handle.net/10210/15086.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chifamba, Saymore. “A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel.” 2015. Web. 22 Apr 2019.

Vancouver:

Chifamba S. A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. [Internet] [Thesis]. University of Johannesburg; 2015. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10210/15086.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chifamba S. A study of the performance of a sparse grid cross section representation methodology as applied to MOX fuel. [Thesis]. University of Johannesburg; 2015. Available from: http://hdl.handle.net/10210/15086

Not specified: Masters Thesis or Doctoral Dissertation

University of Lethbridge

13. University of Lethbridge. Faculty of Arts and Science. An improved implementation of sparsity detection of *sparse* derivative *matrices*.

Degree: 2018, University of Lethbridge

URL: http://hdl.handle.net/10133/5266

Optimization is a crucial branch of research with application in numerous domains. Determination of sparsity is a vital stream of optimization research with potentials for… (more)

Subjects/Keywords: Jacobians; Combinatorial optimization; Sparse matrices – Data processing; Graph coloring; Parallel programs (Computer programs); Matrix derivatives; sparse data structure; CPR algorithm; sparse derivative matrices; Jacobian matrix; multilevel algorithm; parallel implementation

APA (6th Edition):

Science, U. o. L. F. o. A. a. (2018). An improved implementation of sparsity detection of sparse derivative matrices . (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/5266

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Thesis, University of Lethbridge. Accessed April 22, 2019. http://hdl.handle.net/10133/5266.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Science, University of Lethbridge. Faculty of Arts and. “An improved implementation of sparsity detection of sparse derivative matrices .” 2018. Web. 22 Apr 2019.

Vancouver:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Internet] [Thesis]. University of Lethbridge; 2018. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10133/5266.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Science UoLFoAa. An improved implementation of sparsity detection of sparse derivative matrices . [Thesis]. University of Lethbridge; 2018. Available from: http://hdl.handle.net/10133/5266

Not specified: Masters Thesis or Doctoral Dissertation

Anna University

14. Arathi, P. Bandwidth reduction of *sparse* symmetric *matrices*; -.

Degree: Science and Humanities, 2014, Anna University

URL: http://shodhganga.inflibnet.ac.in/handle/10603/24966

Subjects/Keywords: Science and humanities; Sparse Symmetric matrices

APA (6th Edition):

Arathi, P. (2014). Bandwidth reduction of sparse symmetric matrices; -. (Thesis). Anna University. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/24966

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Arathi, P. “Bandwidth reduction of sparse symmetric matrices; -.” 2014. Thesis, Anna University. Accessed April 22, 2019. http://shodhganga.inflibnet.ac.in/handle/10603/24966.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Arathi, P. “Bandwidth reduction of sparse symmetric matrices; -.” 2014. Web. 22 Apr 2019.

Vancouver:

Arathi P. Bandwidth reduction of sparse symmetric matrices; -. [Internet] [Thesis]. Anna University; 2014. [cited 2019 Apr 22]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/24966.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Arathi P. Bandwidth reduction of sparse symmetric matrices; -. [Thesis]. Anna University; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/24966

Not specified: Masters Thesis or Doctoral Dissertation

EPFL

15. Haghighatshoar, Saeid. Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction.

Degree: 2014, EPFL

URL: http://infoscience.epfl.ch/record/203719

Compressed sensing is a new trend in signal processing for efficient sampling and signal acquisition. The idea is that most real-world signals have a *sparse*… (more)

Subjects/Keywords: Compressed sensing; Hadamard matrices; Deterministic matrix construction; Sparse Fast Hadamard Transform (SFHT); Distributed compressed sensing

APA (6th Edition):

Haghighatshoar, S. (2014). Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction. (Thesis). EPFL. Retrieved from http://infoscience.epfl.ch/record/203719

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Haghighatshoar, Saeid. “Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction.” 2014. Thesis, EPFL. Accessed April 22, 2019. http://infoscience.epfl.ch/record/203719.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Haghighatshoar, Saeid. “Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction.” 2014. Web. 22 Apr 2019.

Vancouver:

Haghighatshoar S. Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction. [Internet] [Thesis]. EPFL; 2014. [cited 2019 Apr 22]. Available from: http://infoscience.epfl.ch/record/203719.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Haghighatshoar S. Compressed Sensing of Memoryless Sources: A Deterministic Hadamard Construction. [Thesis]. EPFL; 2014. Available from: http://infoscience.epfl.ch/record/203719

Not specified: Masters Thesis or Doctoral Dissertation

Hong Kong University of Science and Technology

16. Sun, Wanting. Joint development of disparity tuning and vergence control.

Degree: 2011, Hong Kong University of Science and Technology

URL: https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

Behavior and sensory perception are mutually dependent. Sensory perception drives behavior, but behavior also influences the development of sensory perception, by altering the statistics of… (more)

Subjects/Keywords: Binocular vision; Computer vision; Reinforcement learning; Eye – Movements – Computer simulation; Sparse matrices

APA (6^{th} Edition):

Sun, W. (2011). Joint development of disparity tuning and vergence control. (Thesis). Hong Kong University of Science and Technology. Retrieved from https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Sun, Wanting. “Joint development of disparity tuning and vergence control.” 2011. Thesis, Hong Kong University of Science and Technology. Accessed April 22, 2019. https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html.

MLA Handbook (7^{th} Edition):

Sun, Wanting. “Joint development of disparity tuning and vergence control.” 2011. Web. 22 Apr 2019.

Vancouver:

Sun W. Joint development of disparity tuning and vergence control. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2011. [cited 2019 Apr 22]. Available from: https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html.

Council of Science Editors:

Sun W. Joint development of disparity tuning and vergence control. [Thesis]. Hong Kong University of Science and Technology; 2011. Available from: https://doi.org/10.14711/thesis-b1155636 ; http://repository.ust.hk/ir/bitstream/1783.1-7402/1/th_redirect.html

University of Lethbridge

17.
University of Lethbridge. Faculty of Arts and Science.
A computational study of *sparse* matrix storage schemes.

Degree: 2008, University of Lethbridge

URL: http://hdl.handle.net/10133/777

► The efficiency of linear algebra operations for *sparse* *matrices* on modern high performance computing systems is often constrained by the available memory bandwidth. We are…
(more)

Subjects/Keywords: Sparse matrices – Data processing; Dissertations, Academic

APA (6^{th} Edition):

University of Lethbridge, Faculty of Arts and Science. (2008). A computational study of sparse matrix storage schemes. (Thesis). University of Lethbridge. Retrieved from http://hdl.handle.net/10133/777

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

University of Lethbridge, Faculty of Arts and Science. “A computational study of sparse matrix storage schemes.” 2008. Thesis, University of Lethbridge. Accessed April 22, 2019. http://hdl.handle.net/10133/777.

MLA Handbook (7^{th} Edition):

University of Lethbridge, Faculty of Arts and Science. “A computational study of sparse matrix storage schemes.” 2008. Web. 22 Apr 2019.

Vancouver:

University of Lethbridge, Faculty of Arts and Science. A computational study of sparse matrix storage schemes. [Internet] [Thesis]. University of Lethbridge; 2008. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10133/777.

Council of Science Editors:

University of Lethbridge, Faculty of Arts and Science. A computational study of sparse matrix storage schemes. [Thesis]. University of Lethbridge; 2008. Available from: http://hdl.handle.net/10133/777

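
The Lethbridge record above concerns sparse-matrix storage schemes. As illustrative background only (the code below is not taken from the thesis), the most common such scheme, compressed sparse row (CSR), keeps just the nonzero values, their column indices, and one running offset per row, so memory traffic scales with the nonzero count rather than n×n:

```python
def dense_to_csr(dense):
    """Convert a list-of-lists matrix to CSR arrays (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, x in enumerate(row):
            if x != 0:
                values.append(x)   # nonzero entries, row by row
                col_idx.append(j)  # column index of each nonzero
        row_ptr.append(len(values))  # running end offset of each row
    return values, col_idx, row_ptr

A = [[5, 0, 0],
     [0, 0, 8],
     [3, 0, 6]]
print(dense_to_csr(A))  # ([5, 8, 3, 6], [0, 2, 0, 2], [0, 1, 2, 4])
```

Libraries such as SciPy expose this same layout as the `data`, `indices`, and `indptr` attributes of `scipy.sparse.csr_matrix`.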

Hong Kong University of Science and Technology

18. Benidis, Konstantinos ECE. High-dimensional sparsity methods in machine learning and finance.

Degree: 2018, Hong Kong University of Science and Technology

URL: https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

► Sparsity has been successfully applied in almost all the fields of science and engineering, especially in high-dimensional applications, where a *sparse* representation can reduce the…
(more)

Subjects/Keywords: Signal processing; Mathematical models; Machine learning; Portfolio management; Sparse matrices; Principal components analysis

APA (6^{th} Edition):

Benidis, K. E. (2018). High-dimensional sparsity methods in machine learning and finance. (Thesis). Hong Kong University of Science and Technology. Retrieved from https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Benidis, Konstantinos ECE. “High-dimensional sparsity methods in machine learning and finance.” 2018. Thesis, Hong Kong University of Science and Technology. Accessed April 22, 2019. https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html.

MLA Handbook (7^{th} Edition):

Benidis, Konstantinos ECE. “High-dimensional sparsity methods in machine learning and finance.” 2018. Web. 22 Apr 2019.

Vancouver:

Benidis KE. High-dimensional sparsity methods in machine learning and finance. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2018. [cited 2019 Apr 22]. Available from: https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html.

Council of Science Editors:

Benidis KE. High-dimensional sparsity methods in machine learning and finance. [Thesis]. Hong Kong University of Science and Technology; 2018. Available from: https://doi.org/10.14711/thesis-991012588465403412 ; http://repository.ust.hk/ir/bitstream/1783.1-92257/1/th_redirect.html

University of Minnesota

19. Cherian, Anoop. Similarity search in visual data.

Degree: PhD, Computer science, 2013, University of Minnesota

URL: http://purl.umn.edu/144455

► Contemporary times have witnessed a significant increase in the amount of data available on the Internet. Organizing such big data so that it is easily…
(more)

Subjects/Keywords: Covariance matrices; Dictionary learning; Dirichlet process; Jensen-bregman logdet divergence; Nearest neighbors; Sparse coding

APA (6^{th} Edition):

Cherian, A. (2013). Similarity search in visual data. (Doctoral Dissertation). University of Minnesota. Retrieved from http://purl.umn.edu/144455

Chicago Manual of Style (16^{th} Edition):

Cherian, Anoop. “Similarity search in visual data.” 2013. Doctoral Dissertation, University of Minnesota. Accessed April 22, 2019. http://purl.umn.edu/144455.

MLA Handbook (7^{th} Edition):

Cherian, Anoop. “Similarity search in visual data.” 2013. Web. 22 Apr 2019.

Vancouver:

Cherian A. Similarity search in visual data. [Internet] [Doctoral dissertation]. University of Minnesota; 2013. [cited 2019 Apr 22]. Available from: http://purl.umn.edu/144455.

Council of Science Editors:

Cherian A. Similarity search in visual data. [Doctoral Dissertation]. University of Minnesota; 2013. Available from: http://purl.umn.edu/144455

North Carolina State University

20.
Luniya, Sonali R.
SPICE Like *Sparse* Transient Analysis.

Degree: MS, Computer Engineering, 2003, North Carolina State University

URL: http://www.lib.ncsu.edu/resolver/1840.16/1209

► A state variable transient circuit analysis using *sparse* *matrices* is developed. The equations are formulated using time discretization based on Newton's iterative method of equations…
(more)

Subjects/Keywords: fREEDA; NeoCAD; Sparse Matrices; State Variables

APA (6^{th} Edition):

Luniya, S. R. (2003). SPICE Like Sparse Transient Analysis. (Thesis). North Carolina State University. Retrieved from http://www.lib.ncsu.edu/resolver/1840.16/1209

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Luniya, Sonali R. “SPICE Like Sparse Transient Analysis.” 2003. Thesis, North Carolina State University. Accessed April 22, 2019. http://www.lib.ncsu.edu/resolver/1840.16/1209.

MLA Handbook (7^{th} Edition):

Luniya, Sonali R. “SPICE Like Sparse Transient Analysis.” 2003. Web. 22 Apr 2019.

Vancouver:

Luniya SR. SPICE Like Sparse Transient Analysis. [Internet] [Thesis]. North Carolina State University; 2003. [cited 2019 Apr 22]. Available from: http://www.lib.ncsu.edu/resolver/1840.16/1209.

Council of Science Editors:

Luniya SR. SPICE Like Sparse Transient Analysis. [Thesis]. North Carolina State University; 2003. Available from: http://www.lib.ncsu.edu/resolver/1840.16/1209

Colorado State University

21.
Dinkins, Stephanie.
Model for predicting the performance of *sparse* matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.

Degree: MS (M.S.), Computer Science, 2007, Colorado State University

URL: http://hdl.handle.net/10217/65303

► *Sparse* matrix vector multiply (SpMV) is an important computation that is used in many scientific and structural engineering applications. *Sparse* computations like SpMV require the…
(more)

Subjects/Keywords: data locality; Manhattan distance; performance model; sparse matrices; sparse matrix vector multiply; SpMV

APA (6^{th} Edition):

Dinkins, S. (2007). Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. (Masters Thesis). Colorado State University. Retrieved from http://hdl.handle.net/10217/65303

Chicago Manual of Style (16^{th} Edition):

Dinkins, Stephanie. “Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.” 2007. Masters Thesis, Colorado State University. Accessed April 22, 2019. http://hdl.handle.net/10217/65303.

MLA Handbook (7^{th} Edition):

Dinkins, Stephanie. “Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A.” 2007. Web. 22 Apr 2019.

Vancouver:

Dinkins S. Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. [Internet] [Masters thesis]. Colorado State University; 2007. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10217/65303.

Council of Science Editors:

Dinkins S. Model for predicting the performance of sparse matrix vector multiply (SpMV) using memory bandwidth requirements and data locality, A. [Masters Thesis]. Colorado State University; 2007. Available from: http://hdl.handle.net/10217/65303
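
Dinkins's record above models the performance of the SpMV kernel in terms of memory bandwidth and data locality. As a sketch of that kernel only (using the standard CSR layout, not anything specific to the thesis): the nonzeros and their column indices are streamed sequentially, while the accesses into x are irregular, which is exactly the locality such performance models try to capture.

```python
def spmv_csr(values, col_idx, row_ptr, x):
    """Compute y = A @ x for a matrix A stored in CSR form."""
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        acc = 0.0
        # values/col_idx are read once each; x is accessed irregularly
        # through col_idx, so bandwidth and locality dominate the cost.
        for k in range(row_ptr[i], row_ptr[i + 1]):
            acc += values[k] * x[col_idx[k]]
        y[i] = acc
    return y

# A = [[5, 0, 0], [0, 0, 8], [3, 0, 6]] in CSR form:
print(spmv_csr([5.0, 8.0, 3.0, 6.0], [0, 2, 0, 2], [0, 1, 2, 4],
               [1.0, 2.0, 3.0]))  # [5.0, 24.0, 21.0]
```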

22. Juslin, Kaj. A Companion Model Approach to Modelling and Simulation of Industrial Processes.

Degree: 2005, VTT Technical Research Centre of Finland

URL: http://lib.tkk.fi/Diss/2005/isbn9513866602/

► Modelling and simulation provides for huge possibilities if broadly taken up by engineers as a working method. However, when considering the launching of modelling and…
(more)

Subjects/Keywords: industrial processes; process simulation; simulation models; simulation software; software implementation; systems architecture; model specification; structured graphs; companion models; sparse matrices

APA (6^{th} Edition):

Juslin, K. (2005). A Companion Model Approach to Modelling and Simulation of Industrial Processes. (Thesis). VTT Technical Research Centre of Finland. Retrieved from http://lib.tkk.fi/Diss/2005/isbn9513866602/

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Juslin, Kaj. “A Companion Model Approach to Modelling and Simulation of Industrial Processes.” 2005. Thesis, VTT Technical Research Centre of Finland. Accessed April 22, 2019. http://lib.tkk.fi/Diss/2005/isbn9513866602/.

MLA Handbook (7^{th} Edition):

Juslin, Kaj. “A Companion Model Approach to Modelling and Simulation of Industrial Processes.” 2005. Web. 22 Apr 2019.

Vancouver:

Juslin K. A Companion Model Approach to Modelling and Simulation of Industrial Processes. [Internet] [Thesis]. VTT Technical Research Centre of Finland; 2005. [cited 2019 Apr 22]. Available from: http://lib.tkk.fi/Diss/2005/isbn9513866602/.

Council of Science Editors:

Juslin K. A Companion Model Approach to Modelling and Simulation of Industrial Processes. [Thesis]. VTT Technical Research Centre of Finland; 2005. Available from: http://lib.tkk.fi/Diss/2005/isbn9513866602/

University of California – Berkeley

23. Hoemmen, Mark. Communication-avoiding Krylov subspace methods.

Degree: Computer Science, 2010, University of California – Berkeley

URL: http://www.escholarship.org/uc/item/7757521k

► The cost of an algorithm includes both arithmetic and communication. We use "communication" in a general sense to mean data movement, either between levels of a memory…
(more)

Subjects/Keywords: Computer Science; Mathematics; communication-avoiding algorithms; iterative methods; linear algebra; numerical methods; parallel algorithms; sparse matrices

APA (6^{th} Edition):

Hoemmen, M. (2010). Communication-avoiding Krylov subspace methods. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/7757521k

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Hoemmen, Mark. “Communication-avoiding Krylov subspace methods.” 2010. Thesis, University of California – Berkeley. Accessed April 22, 2019. http://www.escholarship.org/uc/item/7757521k.

MLA Handbook (7^{th} Edition):

Hoemmen, Mark. “Communication-avoiding Krylov subspace methods.” 2010. Web. 22 Apr 2019.

Vancouver:

Hoemmen M. Communication-avoiding Krylov subspace methods. [Internet] [Thesis]. University of California – Berkeley; 2010. [cited 2019 Apr 22]. Available from: http://www.escholarship.org/uc/item/7757521k.

Council of Science Editors:

Hoemmen M. Communication-avoiding Krylov subspace methods. [Thesis]. University of California – Berkeley; 2010. Available from: http://www.escholarship.org/uc/item/7757521k

University of British Columbia

24.
Cavers, Ian Alfred.
Tiebreaking the minimum degree algorithm for ordering *sparse* symmetric positive definite *matrices*.

Degree: 1987, University of British Columbia

URL: http://hdl.handle.net/2429/27855

► The minimum degree algorithm is known as an effective scheme for identifying a fill-reduced ordering for symmetric, positive definite, *sparse* linear systems, to be…
(more)

Subjects/Keywords: Algorithms; Linear systems – Data processing; Sparse matrices

APA (6^{th} Edition):

Cavers, I. A. (1987). Tiebreaking the minimum degree algorithm for ordering sparse symmetric positive definite matrices. (Thesis). University of British Columbia. Retrieved from http://hdl.handle.net/2429/27855

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Cavers, Ian Alfred. “Tiebreaking the minimum degree algorithm for ordering sparse symmetric positive definite matrices.” 1987. Thesis, University of British Columbia. Accessed April 22, 2019. http://hdl.handle.net/2429/27855.

MLA Handbook (7^{th} Edition):

Cavers, Ian Alfred. “Tiebreaking the minimum degree algorithm for ordering sparse symmetric positive definite matrices.” 1987. Web. 22 Apr 2019.

Vancouver:

Cavers IA. Tiebreaking the minimum degree algorithm for ordering sparse symmetric positive definite matrices. [Internet] [Thesis]. University of British Columbia; 1987. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/2429/27855.

Council of Science Editors:

Cavers IA. Tiebreaking the minimum degree algorithm for ordering sparse symmetric positive definite matrices. [Thesis]. University of British Columbia; 1987. Available from: http://hdl.handle.net/2429/27855

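
Cavers's record above concerns how ties are broken in the minimum degree heuristic. As a toy sketch of the basic heuristic only (not the thesis's algorithm): repeatedly eliminate a vertex of minimum degree from the matrix's adjacency graph, making its remaining neighbours pairwise adjacent (fill-in). The arbitrary fallback used below, smallest vertex label, is precisely the kind of choice that tie-breaking strategies aim to improve.

```python
def minimum_degree_order(adj):
    """Greedy minimum-degree elimination ordering of a symmetric graph.

    adj maps each vertex to the set of its neighbours. Ties on degree
    are broken by vertex label, an arbitrary choice.
    """
    adj = {v: set(nbrs) for v, nbrs in adj.items()}  # work on a copy
    order = []
    while adj:
        # Pick a remaining vertex of minimum degree (ties: smallest label).
        v = min(adj, key=lambda u: (len(adj[u]), u))
        nbrs = adj.pop(v)
        order.append(v)
        for u in nbrs:
            adj[u].discard(v)
            adj[u] |= nbrs - {u}  # fill-in: v's neighbours become a clique
    return order

# On a 4-cycle every vertex has degree 2, so the tie-break decides:
print(minimum_degree_order({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}))
# [0, 1, 2, 3]
```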

Universitat Politècnica de València

25. GUERRERO FLORES, DANNY JOEL. On Updating Preconditioners for the Iterative Solution of Linear Systems.

Degree: 2018, Universitat Politècnica de València

URL: http://hdl.handle.net/10251/104923

► The main topic of this thesis is the development of preconditioner updating techniques for solving large, sparse linear systems Ax=b…
(more)

Subjects/Keywords: Iterative methods; skew-symmetric matrices; sparse linear systems; preconditioning; low-rank update; least squares problems; rank deficient

APA (6^{th} Edition):

GUERRERO FLORES, D. J. (2018). On Updating Preconditioners for the Iterative Solution of Linear Systems. (Doctoral Dissertation). Universitat Politècnica de València. Retrieved from http://hdl.handle.net/10251/104923

Chicago Manual of Style (16^{th} Edition):

GUERRERO FLORES, DANNY JOEL. “On Updating Preconditioners for the Iterative Solution of Linear Systems.” 2018. Doctoral Dissertation, Universitat Politècnica de València. Accessed April 22, 2019. http://hdl.handle.net/10251/104923.

MLA Handbook (7^{th} Edition):

GUERRERO FLORES, DANNY JOEL. “On Updating Preconditioners for the Iterative Solution of Linear Systems.” 2018. Web. 22 Apr 2019.

Vancouver:

GUERRERO FLORES DJ. On Updating Preconditioners for the Iterative Solution of Linear Systems. [Internet] [Doctoral dissertation]. Universitat Politècnica de València; 2018. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10251/104923.

Council of Science Editors:

GUERRERO FLORES DJ. On Updating Preconditioners for the Iterative Solution of Linear Systems. [Doctoral Dissertation]. Universitat Politècnica de València; 2018. Available from: http://hdl.handle.net/10251/104923

University of Tennessee – Knoxville

26.
Li, ShuangJiang.
Distributed Data Aggregation for *Sparse* Recovery in Wireless Sensor Networks.

Degree: MS, Computer Engineering, 2011, University of Tennessee – Knoxville

URL: https://trace.tennessee.edu/utk_gradthes/1078

► We consider the approximate *sparse* recovery problem in Wireless Sensor Networks (WSNs) using Compressed Sensing/Compressive Sampling (CS). The goal is to recover the n…
(more)

Subjects/Keywords: Distributed Compressed Sensing; Sparse Binary Matrices; Data Recovery; Expander Graph; Computer Engineering; Digital Communications and Networking

APA (6^{th} Edition):

Li, S. (2011). Distributed Data Aggregation for Sparse Recovery in Wireless Sensor Networks. (Thesis). University of Tennessee – Knoxville. Retrieved from https://trace.tennessee.edu/utk_gradthes/1078

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Li, ShuangJiang. “Distributed Data Aggregation for Sparse Recovery in Wireless Sensor Networks.” 2011. Thesis, University of Tennessee – Knoxville. Accessed April 22, 2019. https://trace.tennessee.edu/utk_gradthes/1078.

MLA Handbook (7^{th} Edition):

Li, ShuangJiang. “Distributed Data Aggregation for Sparse Recovery in Wireless Sensor Networks.” 2011. Web. 22 Apr 2019.

Vancouver:

Li S. Distributed Data Aggregation for Sparse Recovery in Wireless Sensor Networks. [Internet] [Thesis]. University of Tennessee – Knoxville; 2011. [cited 2019 Apr 22]. Available from: https://trace.tennessee.edu/utk_gradthes/1078.

Council of Science Editors:

Li S. Distributed Data Aggregation for Sparse Recovery in Wireless Sensor Networks. [Thesis]. University of Tennessee – Knoxville; 2011. Available from: https://trace.tennessee.edu/utk_gradthes/1078

University of Arizona

27.
Jung, Ho-Won.
Direct *sparse* matrix methods for interior point algorithms.

Degree: 1990, University of Arizona

URL: http://hdl.handle.net/10150/185133

► Recent advances in linear programming solution methodology have focused on interior point algorithms. These are powerful new methods, achieving significant reductions in computer time for…
(more)

Subjects/Keywords: Sparse matrices; Trees (Graph theory); Linear programming.

APA (6^{th} Edition):

Jung, H. (1990). Direct sparse matrix methods for interior point algorithms. (Doctoral Dissertation). University of Arizona. Retrieved from http://hdl.handle.net/10150/185133

Chicago Manual of Style (16^{th} Edition):

Jung, Ho-Won. “Direct sparse matrix methods for interior point algorithms.” 1990. Doctoral Dissertation, University of Arizona. Accessed April 22, 2019. http://hdl.handle.net/10150/185133.

MLA Handbook (7^{th} Edition):

Jung, Ho-Won. “Direct sparse matrix methods for interior point algorithms.” 1990. Web. 22 Apr 2019.

Vancouver:

Jung H. Direct sparse matrix methods for interior point algorithms. [Internet] [Doctoral dissertation]. University of Arizona; 1990. [cited 2019 Apr 22]. Available from: http://hdl.handle.net/10150/185133.

Council of Science Editors:

Jung H. Direct sparse matrix methods for interior point algorithms. [Doctoral Dissertation]. University of Arizona; 1990. Available from: http://hdl.handle.net/10150/185133

University of Florida

28.
Rajamanickam, Sivasank.
Efficient Algorithms for *Sparse* Singular Value Decomposition.

Degree: PhD, Computer Engineering - Computer and Information Science and Engineering, 2009, University of Florida

URL: http://ufdc.ufl.edu/UFE0041153

► Singular value decomposition is a problem that is used in a wide variety of applications like latent semantic indexing, collaborative filtering and gene expression analysis.…
(more)

Subjects/Keywords: Algorithms; Axes of rotation; Bandwidth; Crop rotation; Data models; Factorization; Libraries; Mathematics; Matrices; Rubble; band, bidiagonalization, blocked, reduction, sparse, svd

APA (6^{th} Edition):

Rajamanickam, S. (2009). Efficient Algorithms for Sparse Singular Value Decomposition. (Doctoral Dissertation). University of Florida. Retrieved from http://ufdc.ufl.edu/UFE0041153

Chicago Manual of Style (16^{th} Edition):

Rajamanickam, Sivasank. “Efficient Algorithms for Sparse Singular Value Decomposition.” 2009. Doctoral Dissertation, University of Florida. Accessed April 22, 2019. http://ufdc.ufl.edu/UFE0041153.

MLA Handbook (7^{th} Edition):

Rajamanickam, Sivasank. “Efficient Algorithms for Sparse Singular Value Decomposition.” 2009. Web. 22 Apr 2019.

Vancouver:

Rajamanickam S. Efficient Algorithms for Sparse Singular Value Decomposition. [Internet] [Doctoral dissertation]. University of Florida; 2009. [cited 2019 Apr 22]. Available from: http://ufdc.ufl.edu/UFE0041153.

Council of Science Editors:

Rajamanickam S. Efficient Algorithms for Sparse Singular Value Decomposition. [Doctoral Dissertation]. University of Florida; 2009. Available from: http://ufdc.ufl.edu/UFE0041153

University of Florida

29.
Yeralan, Sencer Nuri.
High Performance Computing with *Sparse* *Matrices* and GPU Accelerators.

Degree: PhD, Computer Engineering - Computer and Information Science and Engineering, 2014, University of Florida

URL: http://ufdc.ufl.edu/UFE0046270

► *Sparse* matrix factorization relies on high quality fill-reducing orderings in order to limit the amount of superfluous flops and memory required to compute and store…
(more)

Subjects/Keywords: Algorithms; Computer memory; Factorization; Launches; Mathematics; Matrices; Separators; Tessellations; Tiles; Vertices; fill-reducing-ordering – gpu – linear-algebra – qr-factorization – sparse-matrix

APA (6^{th} Edition):

Yeralan, S. N. (2014). High Performance Computing with Sparse Matrices and GPU Accelerators. (Doctoral Dissertation). University of Florida. Retrieved from http://ufdc.ufl.edu/UFE0046270

Chicago Manual of Style (16^{th} Edition):

Yeralan, Sencer Nuri. “High Performance Computing with Sparse Matrices and GPU Accelerators.” 2014. Doctoral Dissertation, University of Florida. Accessed April 22, 2019. http://ufdc.ufl.edu/UFE0046270.

MLA Handbook (7^{th} Edition):

Yeralan, Sencer Nuri. “High Performance Computing with Sparse Matrices and GPU Accelerators.” 2014. Web. 22 Apr 2019.

Vancouver:

Yeralan SN. High Performance Computing with Sparse Matrices and GPU Accelerators. [Internet] [Doctoral dissertation]. University of Florida; 2014. [cited 2019 Apr 22]. Available from: http://ufdc.ufl.edu/UFE0046270.

Council of Science Editors:

Yeralan SN. High Performance Computing with Sparse Matrices and GPU Accelerators. [Doctoral Dissertation]. University of Florida; 2014. Available from: http://ufdc.ufl.edu/UFE0046270

University of Florida

30.
Chi, Yu-Tseh.
Block, Group, and Affine Regularized *Sparse* Coding and Dictionary Learning.

Degree: PhD, Computer Engineering - Computer and Information Science and Engineering, 2013, University of Florida

URL: http://ufdc.ufl.edu/UFE0044995

► I first propose a novel approach for *sparse* coding that further improves upon the *sparse* representation-based classification (SRC) framework. This proposed framework, affine constrained group…
(more)

Subjects/Keywords: Algorithms; Atoms; Computer vision; Datasets; Error rates; Image classification; Linear programming; Machine learning; Matrices; Signals; algorithm – coding – dictionary – learning – optimization – sparse

APA (6^{th} Edition):

Chi, Y. (2013). Block, Group, and Affine Regularized Sparse Coding and Dictionary Learning. (Doctoral Dissertation). University of Florida. Retrieved from http://ufdc.ufl.edu/UFE0044995

Chicago Manual of Style (16^{th} Edition):

Chi, Yu-Tseh. “Block, Group, and Affine Regularized Sparse Coding and Dictionary Learning.” 2013. Doctoral Dissertation, University of Florida. Accessed April 22, 2019. http://ufdc.ufl.edu/UFE0044995.

MLA Handbook (7^{th} Edition):

Chi, Yu-Tseh. “Block, Group, and Affine Regularized Sparse Coding and Dictionary Learning.” 2013. Web. 22 Apr 2019.

Vancouver:

Chi Y. Block, Group, and Affine Regularized Sparse Coding and Dictionary Learning. [Internet] [Doctoral dissertation]. University of Florida; 2013. [cited 2019 Apr 22]. Available from: http://ufdc.ufl.edu/UFE0044995.

Council of Science Editors:

Chi Y. Block, Group, and Affine Regularized Sparse Coding and Dictionary Learning. [Doctoral Dissertation]. University of Florida; 2013. Available from: http://ufdc.ufl.edu/UFE0044995