You searched for +publisher:"University of Texas – Austin" +contributor:("Dhillon, Inderjit S."). Showing records 1 – 23 of 23 total matches.


No search limiters apply to these results.

1. Jain, Prateek. Large scale optimization methods for metric and kernel learning.

Degree: Computer Sciences, 2009, University of Texas – Austin

 A large number of machine learning algorithms are critically dependent on the underlying distance/metric/similarity function. Learning an appropriate distance function is therefore crucial to the… (more)

Subjects/Keywords: Rank minimization; Metric learning; Kernel learning; Fast similarity search; Locality sensitive hashing


APA (6th Edition):

Jain, P. (2009). Large scale optimization methods for metric and kernel learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/27132

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Jain, Prateek. “Large scale optimization methods for metric and kernel learning.” 2009. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/27132.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Jain, Prateek. “Large scale optimization methods for metric and kernel learning.” 2009. Web. 16 Feb 2019.

Vancouver:

Jain P. Large scale optimization methods for metric and kernel learning. [Internet] [Thesis]. University of Texas – Austin; 2009. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/27132.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Jain P. Large scale optimization methods for metric and kernel learning. [Thesis]. University of Texas – Austin; 2009. Available from: http://hdl.handle.net/2152/27132

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

2. Sustik, Mátyás Attila. Structured numerical problems in contemporary applications.

Degree: Computer Sciences, 2013, University of Texas – Austin

 The presence of structure in a computational problem can often be exploited and can lead to a more efficient numerical algorithm. In this dissertation, we… (more)

Subjects/Keywords: Matrix computation; Inverse eigenvalue problem; Equiangular frame; Bregman divergence; Zero-finding; Divide-and-conquer eigensolver


APA (6th Edition):

Sustik, M. A. (2013). Structured numerical problems in contemporary applications. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/21855

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sustik, Mátyás Attila. “Structured numerical problems in contemporary applications.” 2013. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/21855.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sustik, Mátyás Attila. “Structured numerical problems in contemporary applications.” 2013. Web. 16 Feb 2019.

Vancouver:

Sustik MA. Structured numerical problems in contemporary applications. [Internet] [Thesis]. University of Texas – Austin; 2013. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/21855.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sustik MA. Structured numerical problems in contemporary applications. [Thesis]. University of Texas – Austin; 2013. Available from: http://hdl.handle.net/2152/21855

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

3. Kulis, Brian Joseph. Scalable kernel methods for machine learning.

Degree: Computer Sciences, 2008, University of Texas – Austin

 Machine learning techniques are now essential for a diverse set of applications in computer vision, natural language processing, software analysis, and many other domains. As… (more)

Subjects/Keywords: Machine learning; Kernel functions


APA (6th Edition):

Kulis, B. J. (2008). Scalable kernel methods for machine learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/18243

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/18243.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Web. 16 Feb 2019.

Vancouver:

Kulis BJ. Scalable kernel methods for machine learning. [Internet] [Thesis]. University of Texas – Austin; 2008. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/18243.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Kulis BJ. Scalable kernel methods for machine learning. [Thesis]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/18243

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

4. Cho, Hyuk. Co-clustering algorithms : extensions and applications.

Degree: Computer Sciences, 2008, University of Texas – Austin

 Co-clustering is a rather recent paradigm for unsupervised data analysis, but it has become increasingly popular because of its potential to discover latent local patterns,… (more)

Subjects/Keywords: Cluster analysis – Data processing; Algorithms


APA (6th Edition):

Cho, H. (2008). Co-clustering algorithms : extensions and applications. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/17809

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cho, Hyuk. “Co-clustering algorithms : extensions and applications.” 2008. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/17809.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cho, Hyuk. “Co-clustering algorithms : extensions and applications.” 2008. Web. 16 Feb 2019.

Vancouver:

Cho H. Co-clustering algorithms : extensions and applications. [Internet] [Thesis]. University of Texas – Austin; 2008. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/17809.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cho H. Co-clustering algorithms : extensions and applications. [Thesis]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/17809

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

5. Davis, Jason Victor. Mining statistical correlations with applications to software analysis.

Degree: Computer Sciences, 2008, University of Texas – Austin

 Machine learning, data mining, and statistical methods work by representing real-world objects in terms of feature sets that best describe them. This thesis addresses problems… (more)

Subjects/Keywords: Data mining; Machine learning; Computer algorithms


APA (6th Edition):

Davis, J. V. (2008). Mining statistical correlations with applications to software analysis. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/18340

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Davis, Jason Victor. “Mining statistical correlations with applications to software analysis.” 2008. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/18340.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Davis, Jason Victor. “Mining statistical correlations with applications to software analysis.” 2008. Web. 16 Feb 2019.

Vancouver:

Davis JV. Mining statistical correlations with applications to software analysis. [Internet] [Thesis]. University of Texas – Austin; 2008. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/18340.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Davis JV. Mining statistical correlations with applications to software analysis. [Thesis]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/18340

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


6. -6435-245X. Learning with positive and unlabeled examples.

Degree: Computer Sciences, 2015, University of Texas – Austin

 Developing partially supervised models is becoming increasingly relevant in the context of modern machine learning applications, where supervision often comes at a cost. In particular,… (more)

Subjects/Keywords: PU learning; Learning theory


APA (6th Edition):

-6435-245X. (2015). Learning with positive and unlabeled examples. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/32826

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

-6435-245X. “Learning with positive and unlabeled examples.” 2015. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/32826.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

-6435-245X. “Learning with positive and unlabeled examples.” 2015. Web. 16 Feb 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-6435-245X. Learning with positive and unlabeled examples. [Internet] [Thesis]. University of Texas – Austin; 2015. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/32826.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

-6435-245X. Learning with positive and unlabeled examples. [Thesis]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/32826

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation


7. Whang, Joyce Jiyoung. Overlapping community detection in massive social networks.

Degree: Computer Sciences, 2015, University of Texas – Austin

 Massive social networks have become increasingly popular in recent years. Community detection is one of the most important techniques for the analysis of such complex… (more)

Subjects/Keywords: Community detection; Clustering; Social networks; Overlapping communities; Overlapping clusters; Non-exhaustive clustering; Seed expansion; K-means; Semidefinite programming; Co-clustering; PageRank; Data-driven algorithm; Scalable computing


APA (6th Edition):

Whang, J. J. (2015). Overlapping community detection in massive social networks. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/33272

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Whang, Joyce Jiyoung. “Overlapping community detection in massive social networks.” 2015. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/33272.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Whang, Joyce Jiyoung. “Overlapping community detection in massive social networks.” 2015. Web. 16 Feb 2019.

Vancouver:

Whang JJ. Overlapping community detection in massive social networks. [Internet] [Thesis]. University of Texas – Austin; 2015. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/33272.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Whang JJ. Overlapping community detection in massive social networks. [Thesis]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/33272

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


8. Das, Shreepriya. Algorithms for next generation sequencing data analysis.

Degree: Electrical and Computer Engineering, 2015, University of Texas – Austin

 The field of genomics has witnessed tremendous achievements in the past two decades. The advances in sequencing technology have enabled acquisition of massive amounts of… (more)

Subjects/Keywords: Basecalling; Haplotyping; Bioinformatics; Computational biology


APA (6th Edition):

Das, S. (2015). Algorithms for next generation sequencing data analysis. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/33328

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Das, Shreepriya. “Algorithms for next generation sequencing data analysis.” 2015. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/33328.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Das, Shreepriya. “Algorithms for next generation sequencing data analysis.” 2015. Web. 16 Feb 2019.

Vancouver:

Das S. Algorithms for next generation sequencing data analysis. [Internet] [Thesis]. University of Texas – Austin; 2015. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/33328.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Das S. Algorithms for next generation sequencing data analysis. [Thesis]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/33328

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


9. Yu, Hsiang-Fu. Scalable algorithms for latent variable models in machine learning.

Degree: Computer Sciences, 2016, University of Texas – Austin

 Latent variable modeling (LVM) is a popular approach in many machine learning applications, such as recommender systems and topic modeling, due to its ability to… (more)

Subjects/Keywords: Latent variable modeling; Matrix factorization; Algorithms; Data


APA (6th Edition):

Yu, H. (2016). Scalable algorithms for latent variable models in machine learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/41635

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Yu, Hsiang-Fu. “Scalable algorithms for latent variable models in machine learning.” 2016. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/41635.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Yu, Hsiang-Fu. “Scalable algorithms for latent variable models in machine learning.” 2016. Web. 16 Feb 2019.

Vancouver:

Yu H. Scalable algorithms for latent variable models in machine learning. [Internet] [Thesis]. University of Texas – Austin; 2016. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/41635.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Yu H. Scalable algorithms for latent variable models in machine learning. [Thesis]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/41635

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


10. -7606-1366. Provable non-convex optimization for learning parametric models.

Degree: Computational Science, Engineering, and Mathematics, 2018, University of Texas – Austin

 Non-convex optimization plays an important role in recent advances of machine learning. A large number of machine learning tasks are performed by solving a non-convex… (more)

Subjects/Keywords: Numerical optimization; Machine learning; Statistics


APA (6th Edition):

-7606-1366. (2018). Provable non-convex optimization for learning parametric models. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/69011

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

-7606-1366. “Provable non-convex optimization for learning parametric models.” 2018. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/69011.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

-7606-1366. “Provable non-convex optimization for learning parametric models.” 2018. Web. 16 Feb 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-7606-1366. Provable non-convex optimization for learning parametric models. [Internet] [Thesis]. University of Texas – Austin; 2018. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/69011.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

-7606-1366. Provable non-convex optimization for learning parametric models. [Thesis]. University of Texas – Austin; 2018. Available from: http://hdl.handle.net/2152/69011

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation


11. Si, Si, Ph.D. Large-scale non-linear prediction with applications.

Degree: Computer Sciences, 2016, University of Texas – Austin

 With an immense growth in data, there is a great need for training and testing machine learning models on very large data sets. Several standard… (more)

Subjects/Keywords: Kernel methods; Classification; Decision trees


APA (6th Edition):

Si, Si, P. D. (2016). Large-scale non-linear prediction with applications. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/43583

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Si, Si, Ph D. “Large-scale non-linear prediction with applications.” 2016. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/43583.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Si, Si, Ph D. “Large-scale non-linear prediction with applications.” 2016. Web. 16 Feb 2019.

Vancouver:

Si, Si PD. Large-scale non-linear prediction with applications. [Internet] [Thesis]. University of Texas – Austin; 2016. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/43583.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Si, Si PD. Large-scale non-linear prediction with applications. [Thesis]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/43583

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


12. Chen, Mei-Yen, Ph. D. The development of bias in perceptual and financial decision-making.

Degree: Psychology, 2014, University of Texas – Austin

 Decisions are prone to bias. This can be seen in daily choices. For instance, when the markets are plunging, investors tend to sell stocks instead… (more)

Subjects/Keywords: fMRI; Reinforcement learning; Drift-diffusion model; Decision-making


APA (6th Edition):

Chen, Mei-Yen, P. D. (2014). The development of bias in perceptual and financial decision-making. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/44086

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chen, Mei-Yen, Ph D. “The development of bias in perceptual and financial decision-making.” 2014. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/44086.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chen, Mei-Yen, Ph D. “The development of bias in perceptual and financial decision-making.” 2014. Web. 16 Feb 2019.

Vancouver:

Chen, Mei-Yen PD. The development of bias in perceptual and financial decision-making. [Internet] [Thesis]. University of Texas – Austin; 2014. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/44086.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chen, Mei-Yen PD. The development of bias in perceptual and financial decision-making. [Thesis]. University of Texas – Austin; 2014. Available from: http://hdl.handle.net/2152/44086

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

13. Chiang, Kai-Yang. Statistical analysis for modeling dyadic interactions using machine learning methods.

Degree: Computer Sciences, 2017, University of Texas – Austin

 Modeling dyadic interactions between entities is one of the fundamental problems in machine learning with many real-world applications, including recommender systems, data clustering, social network… (more)

Subjects/Keywords: Dyadic interaction modeling; Statistical machine learning; Signed network analysis; Signed graph clustering; Dyadic rank aggregation; Matrix completion; Robust PCA; Side information


APA (6th Edition):

Chiang, K. (2017). Statistical analysis for modeling dyadic interactions using machine learning methods. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/47368

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chiang, Kai-Yang. “Statistical analysis for modeling dyadic interactions using machine learning methods.” 2017. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/47368.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chiang, Kai-Yang. “Statistical analysis for modeling dyadic interactions using machine learning methods.” 2017. Web. 16 Feb 2019.

Vancouver:

Chiang K. Statistical analysis for modeling dyadic interactions using machine learning methods. [Internet] [Thesis]. University of Texas – Austin; 2017. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/47368.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chiang K. Statistical analysis for modeling dyadic interactions using machine learning methods. [Thesis]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/47368

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


14. -8687-0258. A multi-scale framework for graph based machine learning problems.

Degree: Computer Sciences, 2017, University of Texas – Austin

 Graph data have become essential in representing and modeling relationships between entities and complex network structures in various domains such as social networks and recommender… (more)

Subjects/Keywords: Machine learning; Data mining; Spectral decomposition; Low rank approximation; Link prediction; Social network analysis; Recommender systems; Collaborative filtering


APA (6th Edition):

-8687-0258. (2017). A multi-scale framework for graph based machine learning problems. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/47407

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

-8687-0258. “A multi-scale framework for graph based machine learning problems.” 2017. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/47407.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

-8687-0258. “A multi-scale framework for graph based machine learning problems.” 2017. Web. 16 Feb 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-8687-0258. A multi-scale framework for graph based machine learning problems. [Internet] [Thesis]. University of Texas – Austin; 2017. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/47407.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

-8687-0258. A multi-scale framework for graph based machine learning problems. [Thesis]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/47407

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation


15. -4493-3358. Appropriate, accessible and appealing probabilistic graphical models.

Degree: Computer Sciences, 2017, University of Texas – Austin

 Appropriate - Many multivariate probabilistic models either use independent distributions or dependent Gaussian distributions. Yet, many real-world datasets contain count-valued or non-negative skewed data, e.g.… (more)

Subjects/Keywords: Graphical models; Topic models; Poisson; Count data; Visualization; Human computer interaction


APA (6th Edition):

-4493-3358. (2017). Appropriate, accessible and appealing probabilistic graphical models. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/62986

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

-4493-3358. “Appropriate, accessible and appealing probabilistic graphical models.” 2017. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/62986.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

-4493-3358. “Appropriate, accessible and appealing probabilistic graphical models.” 2017. Web. 16 Feb 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-4493-3358. Appropriate, accessible and appealing probabilistic graphical models. [Internet] [Thesis]. University of Texas – Austin; 2017. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/62986.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

-4493-3358. Appropriate, accessible and appealing probabilistic graphical models. [Thesis]. University of Texas – Austin; 2017. Available from: http://hdl.handle.net/2152/62986

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation


16. -1624-1733. Exploiting structure in large-scale optimization for machine learning.

Degree: Computer Sciences, 2015, University of Texas – Austin

 With an immense growth of data, there is a great need for solving large-scale machine learning problems. Classical optimization algorithms usually cannot scale up due… (more)

Subjects/Keywords: Machine learning; Optimization


APA (6th Edition):

-1624-1733. (2015). Exploiting structure in large-scale optimization for machine learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/31381

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

-1624-1733. “Exploiting structure in large-scale optimization for machine learning.” 2015. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/31381.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

-1624-1733. “Exploiting structure in large-scale optimization for machine learning.” 2015. Web. 16 Feb 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

-1624-1733. Exploiting structure in large-scale optimization for machine learning. [Internet] [Thesis]. University of Texas – Austin; 2015. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/31381.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

-1624-1733. Exploiting structure in large-scale optimization for machine learning. [Thesis]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/31381

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete
Not specified: Masters Thesis or Doctoral Dissertation


17. Ramakrishnan, Smriti Rajan. A systems approach to computational protein identification.

Degree: Computer Sciences, 2010, University of Texas – Austin

 Proteomics is the science of understanding the dynamic protein content of an organism's cells (its proteome), which is one of the largest current challenges in… (more)

Subjects/Keywords: Computational biology; Bioinformatics; Integrative statistical data analysis; Computational proteomics; Systems biology; Database indexing


APA (6th Edition):

Ramakrishnan, S. R. (2010). A systems approach to computational protein identification. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2010-05-1036

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ramakrishnan, Smriti Rajan. “A systems approach to computational protein identification.” 2010. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/ETD-UT-2010-05-1036.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ramakrishnan, Smriti Rajan. “A systems approach to computational protein identification.” 2010. Web. 16 Feb 2019.

Vancouver:

Ramakrishnan SR. A systems approach to computational protein identification. [Internet] [Thesis]. University of Texas – Austin; 2010. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/ETD-UT-2010-05-1036.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ramakrishnan SR. A systems approach to computational protein identification. [Thesis]. University of Texas – Austin; 2010. Available from: http://hdl.handle.net/2152/ETD-UT-2010-05-1036

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


18. Vijayanarasimhan, Sudheendra. Active visual category learning.

Degree: Computer Sciences, 2011, University of Texas – Austin

 Visual recognition research develops algorithms and representations to autonomously recognize visual entities such as objects, actions, and attributes. The traditional protocol involves manually collecting training… (more)

Subjects/Keywords: Artificial intelligence; Active learning; Object recognition; Object detection; Cost-sensitive learning; Multi-level learning; Budgeted learning; Large-scale active learning; Live learning; Machine learning; Visual recognition system


APA (6th Edition):

Vijayanarasimhan, S. (2011). Active visual category learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2011-05-3014

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Web. 16 Feb 2019.

Vancouver:

Vijayanarasimhan S. Active visual category learning. [Internet] [Thesis]. University of Texas – Austin; 2011. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Vijayanarasimhan S. Active visual category learning. [Thesis]. University of Texas – Austin; 2011. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


19. Sra, Suvrit, 1976-. Matrix nearness problems in data mining.

Degree: Computer Sciences, 2007, University of Texas – Austin

 This thesis addresses some fundamental problems in data mining and machine learning that may be cast as matrix nearness problems. Some examples of well-known… (more)

Subjects/Keywords: Data mining; Machine learning; Matrices


APA (6th Edition):

Sra, Suvrit, 1. (2007). Matrix nearness problems in data mining. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/3313

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sra, Suvrit, 1976-. “Matrix nearness problems in data mining.” 2007. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/3313.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sra, Suvrit, 1976-. “Matrix nearness problems in data mining.” 2007. Web. 16 Feb 2019.

Vancouver:

Sra, Suvrit 1. Matrix nearness problems in data mining. [Internet] [Thesis]. University of Texas – Austin; 2007. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/3313.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sra, Suvrit 1. Matrix nearness problems in data mining. [Thesis]. University of Texas – Austin; 2007. Available from: http://hdl.handle.net/2152/3313

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


20. Guan, Yuqiang. Large-scale clustering: algorithms and applications.

Degree: Computer Sciences, 2006, University of Texas – Austin

 Clustering is a central problem in unsupervised learning for discovering interesting patterns in the underlying data. Though there have been numerous studies on clustering methods,… (more)

Subjects/Keywords: Computer algorithms; Cluster analysis


APA (6th Edition):

Guan, Y. (2006). Large-scale clustering: algorithms and applications. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/2497

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Guan, Yuqiang. “Large-scale clustering: algorithms and applications.” 2006. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/2497.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Guan, Yuqiang. “Large-scale clustering: algorithms and applications.” 2006. Web. 16 Feb 2019.

Vancouver:

Guan Y. Large-scale clustering: algorithms and applications. [Internet] [Thesis]. University of Texas – Austin; 2006. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/2497.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Guan Y. Large-scale clustering: algorithms and applications. [Thesis]. University of Texas – Austin; 2006. Available from: http://hdl.handle.net/2152/2497

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

21. Meka, Raghu Vardhan Reddy. Computational applications of invariance principles.

Degree: Computer Sciences, 2011, University of Texas – Austin

 This thesis focuses on applications of classical tools from probability theory and convex analysis such as limit theorems to problems in theoretical computer science, specifically… (more)

Subjects/Keywords: Invariance principles; Limit theorems; Pseudorandomness; PTFs; Learning theory; Noise sensitivity; Polytopes; Halfspaces


APA (6th Edition):

Meka, R. V. R. (2011). Computational applications of invariance principles. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/30359

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Meka, Raghu Vardhan Reddy. “Computational applications of invariance principles.” 2011. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/30359.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Meka, Raghu Vardhan Reddy. “Computational applications of invariance principles.” 2011. Web. 16 Feb 2019.

Vancouver:

Meka RVR. Computational applications of invariance principles. [Internet] [Thesis]. University of Texas – Austin; 2011. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/30359.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Meka RVR. Computational applications of invariance principles. [Thesis]. University of Texas – Austin; 2011. Available from: http://hdl.handle.net/2152/30359

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

22. Blom, Martin. Automated Prediction of Human Disease Genes.

Degree: Computational Science, Engineering, and Mathematics, 2012, University of Texas – Austin

 The completion of the human genome project has led to a flood of new genetic data, that has proved surprisingly hard to interpret. Network "guilt… (more)

Subjects/Keywords: Bioinformatics; Systems biology


APA (6th Edition):

Blom, M. (2012). Automated Prediction of Human Disease Genes. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/19529

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Blom, Martin. “Automated Prediction of Human Disease Genes.” 2012. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/19529.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Blom, Martin. “Automated Prediction of Human Disease Genes.” 2012. Web. 16 Feb 2019.

Vancouver:

Blom M. Automated Prediction of Human Disease Genes. [Internet] [Thesis]. University of Texas – Austin; 2012. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/19529.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Blom M. Automated Prediction of Human Disease Genes. [Thesis]. University of Texas – Austin; 2012. Available from: http://hdl.handle.net/2152/19529

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


23. Tropp, Joel Aaron. Topics in sparse approximation.

Degree: Computational Science, Engineering, and Mathematics, 2004, University of Texas – Austin

 Sparse approximation problems request a good approximation of an input signal as a linear combination of elementary signals, yet they stipulate that the approximation may… (more)

Subjects/Keywords: Algorithms; Matrices; Algebras, Linear


APA (6th Edition):

Tropp, J. A. (2004). Topics in sparse approximation. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/1272

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tropp, Joel Aaron. “Topics in sparse approximation.” 2004. Thesis, University of Texas – Austin. Accessed February 16, 2019. http://hdl.handle.net/2152/1272.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tropp, Joel Aaron. “Topics in sparse approximation.” 2004. Web. 16 Feb 2019.

Vancouver:

Tropp JA. Topics in sparse approximation. [Internet] [Thesis]. University of Texas – Austin; 2004. [cited 2019 Feb 16]. Available from: http://hdl.handle.net/2152/1272.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tropp JA. Topics in sparse approximation. [Thesis]. University of Texas – Austin; 2004. Available from: http://hdl.handle.net/2152/1272

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
