You searched for +publisher:"University of Texas – Austin" +contributor:("Dhillon, Inderjit"). Showing records 1 – 30 of 41 total matches.

1. Johnson, Christopher Carroll. Greedy structure learning of Markov Random Fields.

Degree: MS in Computer Sciences, Computer Science, 2011, University of Texas – Austin

 Probabilistic graphical models are used in a variety of domains to capture and represent general dependencies in joint probability distributions. In this document we examine… (more)

Subjects/Keywords: Machine learning; Graphical models; Markov Random Fields; Structure learning; Probability; Uncertainty; Greedy algorithms

APA (6th Edition):

Johnson, C. C. (2011). Greedy structure learning of Markov Random Fields. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2011-08-4331

Chicago Manual of Style (16th Edition):

Johnson, Christopher Carroll. “Greedy structure learning of Markov Random Fields.” 2011. Masters Thesis, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/ETD-UT-2011-08-4331.

MLA Handbook (7th Edition):

Johnson, Christopher Carroll. “Greedy structure learning of Markov Random Fields.” 2011. Web. 16 Apr 2021.

Vancouver:

Johnson CC. Greedy structure learning of Markov Random Fields. [Internet] [Masters thesis]. University of Texas – Austin; 2011. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/ETD-UT-2011-08-4331.

Council of Science Editors:

Johnson CC. Greedy structure learning of Markov Random Fields. [Masters Thesis]. University of Texas – Austin; 2011. Available from: http://hdl.handle.net/2152/ETD-UT-2011-08-4331

2. Sustik, Mátyás Attila. Structured numerical problems in contemporary applications.

Degree: PhD, Computer Science, 2013, University of Texas – Austin

 The presence of structure in a computational problem can often be exploited and can lead to a more efficient numerical algorithm. In this dissertation, we… (more)

Subjects/Keywords: Matrix computation; Inverse eigenvalue problem; Equiangular frame; Bregman divergence; Zero-finding; Divide-and-conquer eigensolver

APA (6th Edition):

Sustik, M. A. (2013). Structured numerical problems in contemporary applications. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/21855

Chicago Manual of Style (16th Edition):

Sustik, Mátyás Attila. “Structured numerical problems in contemporary applications.” 2013. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/21855.

MLA Handbook (7th Edition):

Sustik, Mátyás Attila. “Structured numerical problems in contemporary applications.” 2013. Web. 16 Apr 2021.

Vancouver:

Sustik MA. Structured numerical problems in contemporary applications. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2013. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/21855.

Council of Science Editors:

Sustik MA. Structured numerical problems in contemporary applications. [Doctoral Dissertation]. University of Texas – Austin; 2013. Available from: http://hdl.handle.net/2152/21855

3. Kulis, Brian Joseph. Scalable kernel methods for machine learning.

Degree: PhD, Computer Sciences, 2008, University of Texas – Austin

 Machine learning techniques are now essential for a diverse set of applications in computer vision, natural language processing, software analysis, and many other domains. As… (more)

Subjects/Keywords: Machine learning; Kernel functions

APA (6th Edition):

Kulis, B. J. (2008). Scalable kernel methods for machine learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/18243

Chicago Manual of Style (16th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/18243.

MLA Handbook (7th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Web. 16 Apr 2021.

Vancouver:

Kulis BJ. Scalable kernel methods for machine learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2008. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/18243.

Council of Science Editors:

Kulis BJ. Scalable kernel methods for machine learning. [Doctoral Dissertation]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/18243

4. Cho, Hyuk. Co-clustering algorithms : extensions and applications.

Degree: PhD, Computer Sciences, 2008, University of Texas – Austin

 Co-clustering is a rather recent paradigm for unsupervised data analysis, but it has become increasingly popular because of its potential to discover latent local patterns,… (more)

Subjects/Keywords: Cluster analysis – Data processing; Algorithms

APA (6th Edition):

Cho, H. (2008). Co-clustering algorithms : extensions and applications. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/17809

Chicago Manual of Style (16th Edition):

Cho, Hyuk. “Co-clustering algorithms : extensions and applications.” 2008. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/17809.

MLA Handbook (7th Edition):

Cho, Hyuk. “Co-clustering algorithms : extensions and applications.” 2008. Web. 16 Apr 2021.

Vancouver:

Cho H. Co-clustering algorithms : extensions and applications. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2008. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/17809.

Council of Science Editors:

Cho H. Co-clustering algorithms : extensions and applications. [Doctoral Dissertation]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/17809


5. -7606-1366. Provable non-convex optimization for learning parametric models.

Degree: PhD, Computational Science, Engineering, and Mathematics, 2018, University of Texas – Austin

 Non-convex optimization plays an important role in recent advances of machine learning. A large number of machine learning tasks are performed by solving a non-convex… (more)

Subjects/Keywords: Numerical optimization; Machine learning; Statistics

APA (6th Edition):

-7606-1366. (2018). Provable non-convex optimization for learning parametric models. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/69011

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-7606-1366. “Provable non-convex optimization for learning parametric models.” 2018. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/69011.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-7606-1366. “Provable non-convex optimization for learning parametric models.” 2018. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-7606-1366. Provable non-convex optimization for learning parametric models. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2018. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/69011.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-7606-1366. Provable non-convex optimization for learning parametric models. [Doctoral Dissertation]. University of Texas – Austin; 2018. Available from: http://hdl.handle.net/2152/69011

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


6. -8687-0258. A multi-scale framework for graph based machine learning problems.

Degree: PhD, Computer Science, 2017, University of Texas – Austin

 Graph data have become essential in representing and modeling relationships between entities and complex network structures in various domains such as social networks and recommender… (more)

Subjects/Keywords: Machine learning; Data mining; Spectral decomposition; Low rank approximation; Link prediction; Social network analysis; Recommender systems; Collaborative filtering

APA (6th Edition):

-8687-0258. (2017). A multi-scale framework for graph based machine learning problems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/47407

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-8687-0258. “A multi-scale framework for graph based machine learning problems.” 2017. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/47407.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-8687-0258. “A multi-scale framework for graph based machine learning problems.” 2017. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-8687-0258. A multi-scale framework for graph based machine learning problems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2017. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/47407.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-8687-0258. A multi-scale framework for graph based machine learning problems. [Doctoral Dissertation]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/47407

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


7. Das, Shreepriya. Algorithms for next generation sequencing data analysis.

Degree: PhD, Electrical and Computer Engineering, 2015, University of Texas – Austin

 The field of genomics has witnessed tremendous achievements in the past two decades. The advances in sequencing technology have enabled acquisition of massive amounts of… (more)

Subjects/Keywords: Basecalling; Haplotyping; Bioinformatics; Computational biology

APA (6th Edition):

Das, S. (2015). Algorithms for next generation sequencing data analysis. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/33328

Chicago Manual of Style (16th Edition):

Das, Shreepriya. “Algorithms for next generation sequencing data analysis.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/33328.

MLA Handbook (7th Edition):

Das, Shreepriya. “Algorithms for next generation sequencing data analysis.” 2015. Web. 16 Apr 2021.

Vancouver:

Das S. Algorithms for next generation sequencing data analysis. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/33328.

Council of Science Editors:

Das S. Algorithms for next generation sequencing data analysis. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/33328


8. Chen, Mei-Yen, Ph. D. The development of bias in perceptual and financial decision-making.

Degree: PhD, Psychology, 2014, University of Texas – Austin

 Decisions are prone to bias. This can be seen in daily choices. For instance, when the markets are plunging, investors tend to sell stocks instead… (more)

Subjects/Keywords: fMRI; Reinforcement learning; Drift-diffusion model; Decision-making

APA (6th Edition):

Chen, Mei-Yen, P. D. (2014). The development of bias in perceptual and financial decision-making. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/44086

Chicago Manual of Style (16th Edition):

Chen, Mei-Yen, Ph D. “The development of bias in perceptual and financial decision-making.” 2014. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/44086.

MLA Handbook (7th Edition):

Chen, Mei-Yen, Ph D. “The development of bias in perceptual and financial decision-making.” 2014. Web. 16 Apr 2021.

Vancouver:

Chen, Mei-Yen PD. The development of bias in perceptual and financial decision-making. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2014. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/44086.

Council of Science Editors:

Chen, Mei-Yen PD. The development of bias in perceptual and financial decision-making. [Doctoral Dissertation]. University of Texas – Austin; 2014. Available from: http://hdl.handle.net/2152/44086


9. Stober, Jeremy Michael. Sensorimotor embedding : a developmental approach to learning geometry.

Degree: PhD, Computer Science, 2015, University of Texas – Austin

 A human infant facing the blooming, buzzing confusion of the senses grows up to be an adult with common-sense knowledge of geometry; this knowledge then… (more)

Subjects/Keywords: Sensorimotor; AI; Robotics; Development

APA (6th Edition):

Stober, J. M. (2015). Sensorimotor embedding : a developmental approach to learning geometry. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/30532

Chicago Manual of Style (16th Edition):

Stober, Jeremy Michael. “Sensorimotor embedding : a developmental approach to learning geometry.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/30532.

MLA Handbook (7th Edition):

Stober, Jeremy Michael. “Sensorimotor embedding : a developmental approach to learning geometry.” 2015. Web. 16 Apr 2021.

Vancouver:

Stober JM. Sensorimotor embedding : a developmental approach to learning geometry. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/30532.

Council of Science Editors:

Stober JM. Sensorimotor embedding : a developmental approach to learning geometry. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/30532


10. Whang, Joyce Jiyoung. Overlapping community detection in massive social networks.

Degree: PhD, Computer science, 2015, University of Texas – Austin

 Massive social networks have become increasingly popular in recent years. Community detection is one of the most important techniques for the analysis of such complex… (more)

Subjects/Keywords: Community detection; Clustering; Social networks; Overlapping communities; Overlapping clusters; Non-exhaustive clustering; Seed expansion; K-means; Semidefinite programming; Co-clustering; PageRank; Data-driven algorithm; Scalable computing

APA (6th Edition):

Whang, J. J. (2015). Overlapping community detection in massive social networks. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/33272

Chicago Manual of Style (16th Edition):

Whang, Joyce Jiyoung. “Overlapping community detection in massive social networks.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/33272.

MLA Handbook (7th Edition):

Whang, Joyce Jiyoung. “Overlapping community detection in massive social networks.” 2015. Web. 16 Apr 2021.

Vancouver:

Whang JJ. Overlapping community detection in massive social networks. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/33272.

Council of Science Editors:

Whang JJ. Overlapping community detection in massive social networks. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/33272


11. Sui, Xin, Ph.D. Principled control of approximate programs.

Degree: PhD, Computer science, 2015, University of Texas – Austin

 In conventional computing, most programs are treated as implementations of mathematical functions for which there is an exact output that must be computed from a given… (more)

Subjects/Keywords: Approximate computing; Error model; Cost model; Tunable approximate algorithms; Energy efficiency; Information efficiency; Systematic; Control; Tunable approximate programs

APA (6th Edition):

Sui, Xin, P. D. (2015). Principled control of approximate programs. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/43722

Chicago Manual of Style (16th Edition):

Sui, Xin, Ph D. “Principled control of approximate programs.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/43722.

MLA Handbook (7th Edition):

Sui, Xin, Ph D. “Principled control of approximate programs.” 2015. Web. 16 Apr 2021.

Vancouver:

Sui, Xin PD. Principled control of approximate programs. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/43722.

Council of Science Editors:

Sui, Xin PD. Principled control of approximate programs. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/43722


12. -1624-1733. Exploiting structure in large-scale optimization for machine learning.

Degree: PhD, Computer science, 2015, University of Texas – Austin

 With an immense growth of data, there is a great need for solving large-scale machine learning problems. Classical optimization algorithms usually cannot scale up due… (more)

Subjects/Keywords: Machine learning; Optimization

APA (6th Edition):

-1624-1733. (2015). Exploiting structure in large-scale optimization for machine learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/31381

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-1624-1733. “Exploiting structure in large-scale optimization for machine learning.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/31381.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-1624-1733. “Exploiting structure in large-scale optimization for machine learning.” 2015. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-1624-1733. Exploiting structure in large-scale optimization for machine learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/31381.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-1624-1733. Exploiting structure in large-scale optimization for machine learning. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/31381

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

13. Chiang, Kai-Yang. Statistical analysis for modeling dyadic interactions using machine learning methods.

Degree: PhD, Computer Science, 2017, University of Texas – Austin

 Modeling dyadic interactions between entities is one of the fundamental problems in machine learning with many real-world applications, including recommender systems, data clustering, social network… (more)

Subjects/Keywords: Dyadic interaction modeling; Statistical machine learning; Signed network analysis; Signed graph clustering; Dyadic rank aggregation; Matrix completion; Robust PCA; Side information

APA (6th Edition):

Chiang, K. (2017). Statistical analysis for modeling dyadic interactions using machine learning methods. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/47368

Chicago Manual of Style (16th Edition):

Chiang, Kai-Yang. “Statistical analysis for modeling dyadic interactions using machine learning methods.” 2017. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/47368.

MLA Handbook (7th Edition):

Chiang, Kai-Yang. “Statistical analysis for modeling dyadic interactions using machine learning methods.” 2017. Web. 16 Apr 2021.

Vancouver:

Chiang K. Statistical analysis for modeling dyadic interactions using machine learning methods. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2017. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/47368.

Council of Science Editors:

Chiang K. Statistical analysis for modeling dyadic interactions using machine learning methods. [Doctoral Dissertation]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/47368


14. Yu, Hsiang-Fu. Scalable algorithms for latent variable models in machine learning.

Degree: PhD, Computer science, 2016, University of Texas – Austin

 Latent variable modeling (LVM) is a popular approach in many machine learning applications, such as recommender systems and topic modeling, due to its ability to… (more)

Subjects/Keywords: Latent variable modeling; Matrix factorization; Algorithms; Data

APA (6th Edition):

Yu, H. (2016). Scalable algorithms for latent variable models in machine learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/41635

Chicago Manual of Style (16th Edition):

Yu, Hsiang-Fu. “Scalable algorithms for latent variable models in machine learning.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/41635.

MLA Handbook (7th Edition):

Yu, Hsiang-Fu. “Scalable algorithms for latent variable models in machine learning.” 2016. Web. 16 Apr 2021.

Vancouver:

Yu H. Scalable algorithms for latent variable models in machine learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/41635.

Council of Science Editors:

Yu H. Scalable algorithms for latent variable models in machine learning. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/41635


15. Si, Si, Ph.D. Large-scale non-linear prediction with applications.

Degree: PhD, Computer science, 2016, University of Texas – Austin

 With an immense growth in data, there is a great need for training and testing machine learning models on very large data sets. Several standard… (more)

Subjects/Keywords: Kernel methods; Classification; Decision trees

APA (6th Edition):

Si, Si, P. D. (2016). Large-scale non-linear prediction with applications. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/43583

Chicago Manual of Style (16th Edition):

Si, Si, Ph D. “Large-scale non-linear prediction with applications.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/43583.

MLA Handbook (7th Edition):

Si, Si, Ph D. “Large-scale non-linear prediction with applications.” 2016. Web. 16 Apr 2021.

Vancouver:

Si, Si PD. Large-scale non-linear prediction with applications. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/43583.

Council of Science Editors:

Si, Si PD. Large-scale non-linear prediction with applications. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/43583


16. -6435-245X. Learning with positive and unlabeled examples.

Degree: PhD, Computer Science, 2015, University of Texas – Austin

 Developing partially supervised models is becoming increasingly relevant in the context of modern machine learning applications, where supervision often comes at a cost. In particular,… (more)

Subjects/Keywords: PU learning; Learning theory

APA (6th Edition):

-6435-245X. (2015). Learning with positive and unlabeled examples. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/32826

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-6435-245X. “Learning with positive and unlabeled examples.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/32826.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-6435-245X. “Learning with positive and unlabeled examples.” 2015. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-6435-245X. Learning with positive and unlabeled examples. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/32826.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-6435-245X. Learning with positive and unlabeled examples. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/32826

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

17. -4493-3358. Appropriate, accessible and appealing probabilistic graphical models.

Degree: PhD, Computer Science, 2017, University of Texas – Austin

 Appropriate - Many multivariate probabilistic models either use independent distributions or dependent Gaussian distributions. Yet, many real-world datasets contain count-valued or non-negative skewed data, e.g.… (more)

Subjects/Keywords: Graphical models; Topic models; Poisson; Count data; Visualization; Human computer interaction

APA (6th Edition):

-4493-3358. (2017). Appropriate, accessible and appealing probabilistic graphical models. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/62986

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-4493-3358. “Appropriate, accessible and appealing probabilistic graphical models.” 2017. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/62986.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-4493-3358. “Appropriate, accessible and appealing probabilistic graphical models.” 2017. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-4493-3358. Appropriate, accessible and appealing probabilistic graphical models. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2017. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/62986.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-4493-3358. Appropriate, accessible and appealing probabilistic graphical models. [Doctoral Dissertation]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/62986

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


18. -3365-2702. Computational discovery of genetic targets and interactions : applications to lung cancer.

Degree: PhD, Computational science, engineering, and mathematics, 2016, University of Texas – Austin

 We present new modes of computational drug discovery in each of the three key themes of target identification, mechanism, and therapy regimen design. In identifying… (more)

Subjects/Keywords: Drug target; Data mining

APA (6th Edition):

-3365-2702. (2016). Computational discovery of genetic targets and interactions : applications to lung cancer. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/40285

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-3365-2702. “Computational discovery of genetic targets and interactions : applications to lung cancer.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/40285.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-3365-2702. “Computational discovery of genetic targets and interactions : applications to lung cancer.” 2016. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-3365-2702. Computational discovery of genetic targets and interactions : applications to lung cancer. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/40285.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-3365-2702. Computational discovery of genetic targets and interactions : applications to lung cancer. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/40285

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


19. -0634-6435. Provably effective algorithms for min-max optimization.

Degree: PhD, Computational Science, Engineering, and Mathematics, 2020, University of Texas – Austin

 Many fundamental machine learning tasks can be formulated as min-max optimization. This motivates us to design effective and efficient first-order methods that provably converge to… (more)

Subjects/Keywords: Min-max optimization; WGAN

APA (6th Edition):

-0634-6435. (2020). Provably effective algorithms for min-max optimization. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://dx.doi.org/10.26153/tsw/10153

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-0634-6435. “Provably effective algorithms for min-max optimization.” 2020. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://dx.doi.org/10.26153/tsw/10153.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-0634-6435. “Provably effective algorithms for min-max optimization.” 2020. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-0634-6435. Provably effective algorithms for min-max optimization. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2020. [cited 2021 Apr 16]. Available from: http://dx.doi.org/10.26153/tsw/10153.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-0634-6435. Provably effective algorithms for min-max optimization. [Doctoral Dissertation]. University of Texas – Austin; 2020. Available from: http://dx.doi.org/10.26153/tsw/10153

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

20. Wan, Shaohua. Learning to recognize egocentric activities using RGB-D data.

Degree: PhD, Electrical and Computer engineering, 2015, University of Texas – Austin

 There are two recent trends that are changing the landscape of vision-based activity recognition. On one hand, wearable cameras have become widely used for recording… (more)

Subjects/Keywords: Computer vision; Egocentric activity recognition; RGB-D camera

APA (6th Edition):

Wan, S. (2015). Learning to recognize egocentric activities using RGB-D data. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/33314

Chicago Manual of Style (16th Edition):

Wan, Shaohua. “Learning to recognize egocentric activities using RGB-D data.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/33314.

MLA Handbook (7th Edition):

Wan, Shaohua. “Learning to recognize egocentric activities using RGB-D data.” 2015. Web. 16 Apr 2021.

Vancouver:

Wan S. Learning to recognize egocentric activities using RGB-D data. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/33314.

Council of Science Editors:

Wan S. Learning to recognize egocentric activities using RGB-D data. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/33314


21. -6232-4946. Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems.

Degree: PhD, Computational Science, Engineering, and Mathematics, 2019, University of Texas – Austin

 Hessian preconditioners are the key to efficient numerical solution of large-scale distributed parameter PDE-constrained inverse problems with highly informative data. Such inverse problems arise in… (more)

Subjects/Keywords: Inverse problems; Hessian; KKT matrix; Preconditioning; Data scalability; Numerical optimization; Augmented Lagrangian; Product-convolution; Domain decomposition; Hierarchical matrix

APA (6th Edition):

-6232-4946. (2019). Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://dx.doi.org/10.26153/tsw/2663

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-6232-4946. “Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems.” 2019. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://dx.doi.org/10.26153/tsw/2663.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-6232-4946. “Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems.” 2019. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-6232-4946. Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2019. [cited 2021 Apr 16]. Available from: http://dx.doi.org/10.26153/tsw/2663.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-6232-4946. Data-scalable Hessian preconditioning for distributed parameter PDE-constrained inverse problems. [Doctoral Dissertation]. University of Texas – Austin; 2019. Available from: http://dx.doi.org/10.26153/tsw/2663

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


22. Vijayanarasimhan, Sudheendra. Active visual category learning.

Degree: PhD, Computer Science, 2011, University of Texas – Austin

 Visual recognition research develops algorithms and representations to autonomously recognize visual entities such as objects, actions, and attributes. The traditional protocol involves manually collecting training… (more)

Subjects/Keywords: Artificial intelligence; Active learning; Object recognition; Object detection; Cost-sensitive learning; Multi-level learning; Budgeted learning; Large-scale active learning; Live learning; Machine learning; Visual recognition system

APA (6th Edition):

Vijayanarasimhan, S. (2011). Active visual category learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2011-05-3014

Chicago Manual of Style (16th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

MLA Handbook (7th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Web. 16 Apr 2021.

Vancouver:

Vijayanarasimhan S. Active visual category learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2011. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

Council of Science Editors:

Vijayanarasimhan S. Active visual category learning. [Doctoral Dissertation]. University of Texas – Austin; 2011. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014


23. -3192-3281. Efficient deep learning for sequence data.

Degree: PhD, Computational Science, Engineering, and Mathematics, 2020, University of Texas – Austin

 Deep learning has achieved great success in many sequence learning tasks such as machine translation, speech recognition, and time series prediction. Powerful deep sequence learning… (more)

Subjects/Keywords: Machine learning; Deep neural networks

APA (6th Edition):

-3192-3281. (2020). Efficient deep learning for sequence data. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://dx.doi.org/10.26153/tsw/10140

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-3192-3281. “Efficient deep learning for sequence data.” 2020. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://dx.doi.org/10.26153/tsw/10140.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-3192-3281. “Efficient deep learning for sequence data.” 2020. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-3192-3281. Efficient deep learning for sequence data. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2020. [cited 2021 Apr 16]. Available from: http://dx.doi.org/10.26153/tsw/10140.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-3192-3281. Efficient deep learning for sequence data. [Doctoral Dissertation]. University of Texas – Austin; 2020. Available from: http://dx.doi.org/10.26153/tsw/10140

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

24. -5603-2533. Efficient algorithms for flow models coupled with geomechanics for porous media applications.

Degree: PhD, Computational Science, Engineering, and Mathematics, 2016, University of Texas – Austin

 The coupling between subsurface flow and reservoir geomechanics plays a critical role in obtaining accurate results for models involving reservoir deformation, surface subsidence, well stability,… (more)

Subjects/Keywords: Poroelasticity; Biot system; Fixed-stress split iterative coupling; Undrained split iterative coupling; Explicit coupling; Single rate scheme; Multirate scheme; Banach fixed-point contraction; Fractured poroelastic media; A priori error estimates; Global inexact Newton methods

APA (6th Edition):

-5603-2533. (2016). Efficient algorithms for flow models coupled with geomechanics for porous media applications. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/46503

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

-5603-2533. “Efficient algorithms for flow models coupled with geomechanics for porous media applications.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/46503.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

-5603-2533. “Efficient algorithms for flow models coupled with geomechanics for porous media applications.” 2016. Web. 16 Apr 2021.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-5603-2533. Efficient algorithms for flow models coupled with geomechanics for porous media applications. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/46503.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

-5603-2533. Efficient algorithms for flow models coupled with geomechanics for porous media applications. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/46503

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

25. Hussmann, Jeffrey Alan. Expanding the applications of high-throughput DNA sequencing.

Degree: PhD, Computational and applied mathematics, 2015, University of Texas – Austin

 DNA sequencing is the process of determining the identities of the nucleotides that make up a molecule of DNA. The rapid pace of advancements in… (more)

Subjects/Keywords: DNA sequencing; Translation dynamics

APA (6th Edition):

Hussmann, J. A. (2015). Expanding the applications of high-throughput DNA sequencing. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/31375

Chicago Manual of Style (16th Edition):

Hussmann, Jeffrey Alan. “Expanding the applications of high-throughput DNA sequencing.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/31375.

MLA Handbook (7th Edition):

Hussmann, Jeffrey Alan. “Expanding the applications of high-throughput DNA sequencing.” 2015. Web. 16 Apr 2021.

Vancouver:

Hussmann JA. Expanding the applications of high-throughput DNA sequencing. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/31375.

Council of Science Editors:

Hussmann JA. Expanding the applications of high-throughput DNA sequencing. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/31375


26. Bhojanapalli, Venkata Sesha Pavana Srinadh. Large scale matrix factorization with guarantees: sampling and bi-linearity.

Degree: PhD, Electrical and Computer Engineering, 2015, University of Texas – Austin

 Low rank matrix factorization is an important step in many high dimensional machine learning algorithms. Traditional algorithms for factorization do not scale well with the… (more)

Subjects/Keywords: Matrix completion; Non-convex optimization; Low rank approximation; Semi-definite optimization; Tensor factorization; Scalable algorithms

APA (6th Edition):

Bhojanapalli, V. S. P. S. (2015). Large scale matrix factorization with guarantees: sampling and bi-linearity. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/32832

Chicago Manual of Style (16th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/32832.

MLA Handbook (7th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Web. 16 Apr 2021.

Vancouver:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/32832.

Council of Science Editors:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/32832

27. Davis, Jason Victor. Mining statistical correlations with applications to software analysis.

Degree: PhD, Computer Sciences, 2008, University of Texas – Austin

 Machine learning, data mining, and statistical methods work by representing real-world objects in terms of feature sets that best describe them. This thesis addresses problems… (more)

Subjects/Keywords: Data mining; Machine learning; Computer algorithms

APA (6th Edition):

Davis, J. V. (2008). Mining statistical correlations with applications to software analysis. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/18340

Chicago Manual of Style (16th Edition):

Davis, Jason Victor. “Mining statistical correlations with applications to software analysis.” 2008. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/18340.

MLA Handbook (7th Edition):

Davis, Jason Victor. “Mining statistical correlations with applications to software analysis.” 2008. Web. 16 Apr 2021.

Vancouver:

Davis JV. Mining statistical correlations with applications to software analysis. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2008. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/18340.

Council of Science Editors:

Davis JV. Mining statistical correlations with applications to software analysis. [Doctoral Dissertation]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/18340


28. Guan, Yuqiang. Large-scale clustering: algorithms and applications.

Degree: PhD, Computer Sciences, 2006, University of Texas – Austin

 Clustering is a central problem in unsupervised learning for discovering interesting patterns in the underlying data. Though there have been numerous studies on clustering methods,… (more)

Subjects/Keywords: Computer algorithms; Cluster analysis

APA (6th Edition):

Guan, Y. (2006). Large-scale clustering: algorithms and applications. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/2497

Chicago Manual of Style (16th Edition):

Guan, Yuqiang. “Large-scale clustering: algorithms and applications.” 2006. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/2497.

MLA Handbook (7th Edition):

Guan, Yuqiang. “Large-scale clustering: algorithms and applications.” 2006. Web. 16 Apr 2021.

Vancouver:

Guan Y. Large-scale clustering: algorithms and applications. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2006. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/2497.

Council of Science Editors:

Guan Y. Large-scale clustering: algorithms and applications. [Doctoral Dissertation]. University of Texas – Austin; 2006. Available from: http://hdl.handle.net/2152/2497


29. Sra, Suvrit, 1976-. Matrix nearness problems in data mining.

Degree: PhD, Computer Sciences, 2007, University of Texas – Austin

 This thesis addresses some fundamental problems in data mining and machine learning that may be cast as matrix nearness problems. Some examples of well-known… (more)

Subjects/Keywords: Data mining; Machine learning; Matrices

APA (6th Edition):

Sra, Suvrit, 1. (2007). Matrix nearness problems in data mining. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/3313

Chicago Manual of Style (16th Edition):

Sra, Suvrit, 1976-. “Matrix nearness problems in data mining.” 2007. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/3313.

MLA Handbook (7th Edition):

Sra, Suvrit, 1976-. “Matrix nearness problems in data mining.” 2007. Web. 16 Apr 2021.

Vancouver:

Sra, Suvrit 1. Matrix nearness problems in data mining. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2007. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/3313.

Council of Science Editors:

Sra, Suvrit 1. Matrix nearness problems in data mining. [Doctoral Dissertation]. University of Texas – Austin; 2007. Available from: http://hdl.handle.net/2152/3313


30. Jain, Prateek. Large scale optimization methods for metric and kernel learning.

Degree: PhD, Computer Sciences, 2009, University of Texas – Austin

 A large number of machine learning algorithms are critically dependent on the underlying distance/metric/similarity function. Learning an appropriate distance function is therefore crucial to the… (more)

Subjects/Keywords: Rank minimization; Metric learning; Kernel learning; Fast similarity search; Locality sensitive hashing

APA (6th Edition):

Jain, P. (2009). Large scale optimization methods for metric and kernel learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/27132

Chicago Manual of Style (16th Edition):

Jain, Prateek. “Large scale optimization methods for metric and kernel learning.” 2009. Doctoral Dissertation, University of Texas – Austin. Accessed April 16, 2021. http://hdl.handle.net/2152/27132.

MLA Handbook (7th Edition):

Jain, Prateek. “Large scale optimization methods for metric and kernel learning.” 2009. Web. 16 Apr 2021.

Vancouver:

Jain P. Large scale optimization methods for metric and kernel learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2009. [cited 2021 Apr 16]. Available from: http://hdl.handle.net/2152/27132.

Council of Science Editors:

Jain P. Large scale optimization methods for metric and kernel learning. [Doctoral Dissertation]. University of Texas – Austin; 2009. Available from: http://hdl.handle.net/2152/27132
