
You searched for subject:(Large scale active learning). Showing records 1 – 30 of 58884 total matches.



University of Texas – Austin

1. Vijayanarasimhan, Sudheendra. Active visual category learning.

Degree: Computer Sciences, 2011, University of Texas – Austin

 Visual recognition research develops algorithms and representations to autonomously recognize visual entities such as objects, actions, and attributes. The traditional protocol involves manually collecting training… (more)

Subjects/Keywords: Artificial intelligence; Active learning; Object recognition; Object detection; Cost-sensitive learning; Multi-level learning; Budgeted learning; Large-scale active learning; Live learning; Machine learning; Visual recognition system


APA (6th Edition):

Vijayanarasimhan, S. (2011). Active visual category learning. (Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/ETD-UT-2011-05-3014

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Thesis, University of Texas – Austin. Accessed April 20, 2019. http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Vijayanarasimhan, Sudheendra. “Active visual category learning.” 2011. Web. 20 Apr 2019.

Vancouver:

Vijayanarasimhan S. Active visual category learning. [Internet] [Thesis]. University of Texas – Austin; 2011. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Vijayanarasimhan S. Active visual category learning. [Thesis]. University of Texas – Austin; 2011. Available from: http://hdl.handle.net/2152/ETD-UT-2011-05-3014

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Edinburgh

2. Zhu, Zhanxing. Integrating local information for inference and optimization in machine learning.

Degree: PhD, 2016, University of Edinburgh

 In practice, machine learners often care about two key issues: one is how to obtain a more accurate answer with limited data, and the other… (more)

Subjects/Keywords: 006.3; machine learning; large-scale optimization; large-scale Bayesian sampling


APA (6th Edition):

Zhu, Z. (2016). Integrating local information for inference and optimization in machine learning. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/20980

Chicago Manual of Style (16th Edition):

Zhu, Zhanxing. “Integrating local information for inference and optimization in machine learning.” 2016. Doctoral Dissertation, University of Edinburgh. Accessed April 20, 2019. http://hdl.handle.net/1842/20980.

MLA Handbook (7th Edition):

Zhu, Zhanxing. “Integrating local information for inference and optimization in machine learning.” 2016. Web. 20 Apr 2019.

Vancouver:

Zhu Z. Integrating local information for inference and optimization in machine learning. [Internet] [Doctoral dissertation]. University of Edinburgh; 2016. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/1842/20980.

Council of Science Editors:

Zhu Z. Integrating local information for inference and optimization in machine learning. [Doctoral Dissertation]. University of Edinburgh; 2016. Available from: http://hdl.handle.net/1842/20980

3. Ben Slimene Ben Amor, Ines. Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie).

Degree: Docteur es, Sciences du sol, 2017, Montpellier; École Nationale des Sciences de l'Informatique (La Manouba, Tunisie)

 Human activities in the Mediterranean basin are changing rapidly. In agricultural areas, this growth is driving considerable changes in land cover.… (more)

Subjects/Keywords: Apprentissage actif; Image satellitaire haute résolution; Classification; Cartographie; Occupation du sol; Large étendue; Active learning; High spatial resolution satellite images; Classification; Mapping; Scale soil; Wide area


APA (6th Edition):

Ben Slimene Ben Amor, I. (2017). Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie). (Doctoral Dissertation). Montpellier; École Nationale des Sciences de l'Informatique (La Manouba, Tunisie). Retrieved from http://www.theses.fr/2017MONTT126

Chicago Manual of Style (16th Edition):

Ben Slimene Ben Amor, Ines. “Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie).” 2017. Doctoral Dissertation, Montpellier; École Nationale des Sciences de l'Informatique (La Manouba, Tunisie). Accessed April 20, 2019. http://www.theses.fr/2017MONTT126.

MLA Handbook (7th Edition):

Ben Slimene Ben Amor, Ines. “Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie).” 2017. Web. 20 Apr 2019.

Vancouver:

Ben Slimene Ben Amor I. Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie). [Internet] [Doctoral dissertation]. Montpellier; École Nationale des Sciences de l'Informatique (La Manouba, Tunisie); 2017. [cited 2019 Apr 20]. Available from: http://www.theses.fr/2017MONTT126.

Council of Science Editors:

Ben Slimene Ben Amor I. Apprentissage actif pour la classification des occupations du sol sur larges étendues à partir d'images multispectrales à haute résolution spatiale : application en milieu cultivé, Lebna (Cap-Bon Tunisie) : Active learning for Mapping land cover on wide area, from high spatial resolution satellite images : application in cultivated areas, Lebna (Cap-Bon Tunisie). [Doctoral Dissertation]. Montpellier; École Nationale des Sciences de l'Informatique (La Manouba, Tunisie); 2017. Available from: http://www.theses.fr/2017MONTT126


Georgia Tech

4. Berlind, Christopher. New insights on the power of active learning.

Degree: PhD, Computer Science, 2015, Georgia Tech

 Traditional supervised machine learning algorithms are expected to have access to a large corpus of labeled examples, but the massive amount of data available in… (more)

Subjects/Keywords: Machine learning; Learning theory; Active learning; Semi-supervised learning; Domain adaptation; Large margin learning


APA (6th Edition):

Berlind, C. (2015). New insights on the power of active learning. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/53948

Chicago Manual of Style (16th Edition):

Berlind, Christopher. “New insights on the power of active learning.” 2015. Doctoral Dissertation, Georgia Tech. Accessed April 20, 2019. http://hdl.handle.net/1853/53948.

MLA Handbook (7th Edition):

Berlind, Christopher. “New insights on the power of active learning.” 2015. Web. 20 Apr 2019.

Vancouver:

Berlind C. New insights on the power of active learning. [Internet] [Doctoral dissertation]. Georgia Tech; 2015. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/1853/53948.

Council of Science Editors:

Berlind C. New insights on the power of active learning. [Doctoral Dissertation]. Georgia Tech; 2015. Available from: http://hdl.handle.net/1853/53948


ETH Zürich

5. Lucic, Mario. Computational and Statistical Tradeoffs via Data Summarization.

Degree: 2017, ETH Zürich

 The massive growth of modern datasets from different sources such as videos, social networks, and sensor data, coupled with limited resources in terms of time… (more)

Subjects/Keywords: Machine Learning; Coresets; Large-scale Machine Learning; Outlier Detection; Mixture Models


APA (6th Edition):

Lucic, M. (2017). Computational and Statistical Tradeoffs via Data Summarization. (Doctoral Dissertation). ETH Zürich. Retrieved from http://hdl.handle.net/20.500.11850/220255

Chicago Manual of Style (16th Edition):

Lucic, Mario. “Computational and Statistical Tradeoffs via Data Summarization.” 2017. Doctoral Dissertation, ETH Zürich. Accessed April 20, 2019. http://hdl.handle.net/20.500.11850/220255.

MLA Handbook (7th Edition):

Lucic, Mario. “Computational and Statistical Tradeoffs via Data Summarization.” 2017. Web. 20 Apr 2019.

Vancouver:

Lucic M. Computational and Statistical Tradeoffs via Data Summarization. [Internet] [Doctoral dissertation]. ETH Zürich; 2017. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/20.500.11850/220255.

Council of Science Editors:

Lucic M. Computational and Statistical Tradeoffs via Data Summarization. [Doctoral Dissertation]. ETH Zürich; 2017. Available from: http://hdl.handle.net/20.500.11850/220255

6. Lin, Hongzhou. Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning.

Degree: Docteur es, Mathématiques et informatique, 2017, Grenoble Alpes

 Optimization problems arise naturally during the training of supervised learning models. A typical example is the empirical risk minimization (ERM) problem, which aims to… (more)

Subjects/Keywords: Apprentissage statistique; Large échelle; Optimization; Accélération; Machine learning; Large-Scale; Optimization; Acceleration; 004; 510


APA (6th Edition):

Lin, H. (2017). Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning. (Doctoral Dissertation). Grenoble Alpes. Retrieved from http://www.theses.fr/2017GREAM069

Chicago Manual of Style (16th Edition):

Lin, Hongzhou. “Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning.” 2017. Doctoral Dissertation, Grenoble Alpes. Accessed April 20, 2019. http://www.theses.fr/2017GREAM069.

MLA Handbook (7th Edition):

Lin, Hongzhou. “Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning.” 2017. Web. 20 Apr 2019.

Vancouver:

Lin H. Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning. [Internet] [Doctoral dissertation]. Grenoble Alpes; 2017. [cited 2019 Apr 20]. Available from: http://www.theses.fr/2017GREAM069.

Council of Science Editors:

Lin H. Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique : Generic acceleration schemes for gradient-based optimization in machine learning. [Doctoral Dissertation]. Grenoble Alpes; 2017. Available from: http://www.theses.fr/2017GREAM069


University of Ontario Institute of Technology

7. Esmailzadeh, Ali. Novel opposition-based sampling methods for efficiently solving challenging optimization problems.

Degree: 2011, University of Ontario Institute of Technology

 In solving noise-free and noisy optimization problems, candidate initialization and sampling play a key role, but are not deeply investigated. It is of interest to… (more)

Subjects/Keywords: Opposition-based learning; Large-scale; Noisy; Sampling methods; Evolutionary algorithm


APA (6th Edition):

Esmailzadeh, A. (2011). Novel opposition-based sampling methods for efficiently solving challenging optimization problems. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/150

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Esmailzadeh, Ali. “Novel opposition-based sampling methods for efficiently solving challenging optimization problems.” 2011. Thesis, University of Ontario Institute of Technology. Accessed April 20, 2019. http://hdl.handle.net/10155/150.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Esmailzadeh, Ali. “Novel opposition-based sampling methods for efficiently solving challenging optimization problems.” 2011. Web. 20 Apr 2019.

Vancouver:

Esmailzadeh A. Novel opposition-based sampling methods for efficiently solving challenging optimization problems. [Internet] [Thesis]. University of Ontario Institute of Technology; 2011. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/10155/150.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Esmailzadeh A. Novel opposition-based sampling methods for efficiently solving challenging optimization problems. [Thesis]. University of Ontario Institute of Technology; 2011. Available from: http://hdl.handle.net/10155/150

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Alberta

8. Latifi, Syed Muhammad Fahad. Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language.

Degree: PhD, Department of Educational Psychology, 2016, University of Alberta

 Automated scoring methods have become an important topic for the assessments of 21st century skills. Recent development in computational linguistics and natural language processing has… (more)

Subjects/Keywords: automated scoring; feature extraction; essay evaluation; machine learning; large-scale assessment


APA (6th Edition):

Latifi, S. M. F. (2016). Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language. (Doctoral Dissertation). University of Alberta. Retrieved from https://era.library.ualberta.ca/files/cr207tp60r

Chicago Manual of Style (16th Edition):

Latifi, Syed Muhammad Fahad. “Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language.” 2016. Doctoral Dissertation, University of Alberta. Accessed April 20, 2019. https://era.library.ualberta.ca/files/cr207tp60r.

MLA Handbook (7th Edition):

Latifi, Syed Muhammad Fahad. “Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language.” 2016. Web. 20 Apr 2019.

Vancouver:

Latifi SMF. Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language. [Internet] [Doctoral dissertation]. University of Alberta; 2016. [cited 2019 Apr 20]. Available from: https://era.library.ualberta.ca/files/cr207tp60r.

Council of Science Editors:

Latifi SMF. Development and Validation of an Automated Essay Scoring Framework by Integrating Deep Features of English Language. [Doctoral Dissertation]. University of Alberta; 2016. Available from: https://era.library.ualberta.ca/files/cr207tp60r


Carnegie Mellon University

9. Reddi, Sashank Jakkam. New Optimization Methods for Modern Machine Learning.

Degree: 2017, Carnegie Mellon University

 Modern machine learning systems pose several new statistical, scalability, privacy and ethical challenges. With the advent of massive datasets and increasingly complex tasks, scalability has… (more)

Subjects/Keywords: Machine Learning; Optimization; Large-scale; Distributed optimization; Communication-efficient; Finite-sum


APA (6th Edition):

Reddi, S. J. (2017). New Optimization Methods for Modern Machine Learning. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/1116

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Reddi, Sashank Jakkam. “New Optimization Methods for Modern Machine Learning.” 2017. Thesis, Carnegie Mellon University. Accessed April 20, 2019. http://repository.cmu.edu/dissertations/1116.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Reddi, Sashank Jakkam. “New Optimization Methods for Modern Machine Learning.” 2017. Web. 20 Apr 2019.

Vancouver:

Reddi SJ. New Optimization Methods for Modern Machine Learning. [Internet] [Thesis]. Carnegie Mellon University; 2017. [cited 2019 Apr 20]. Available from: http://repository.cmu.edu/dissertations/1116.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Reddi SJ. New Optimization Methods for Modern Machine Learning. [Thesis]. Carnegie Mellon University; 2017. Available from: http://repository.cmu.edu/dissertations/1116

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Carnegie Mellon University

10. Dai, Wei. Learning with Staleness.

Degree: 2018, Carnegie Mellon University

 A fundamental assumption behind most machine learning (ML) algorithms and analyses is the sequential execution. That is, any update to the ML model can be… (more)

Subjects/Keywords: Large Scale Machine Learning; Distributed Optimization Method; Distributed System; Parameter Server


APA (6th Edition):

Dai, W. (2018). Learning with Staleness. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/1209

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Dai, Wei. “Learning with Staleness.” 2018. Thesis, Carnegie Mellon University. Accessed April 20, 2019. http://repository.cmu.edu/dissertations/1209.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Dai, Wei. “Learning with Staleness.” 2018. Web. 20 Apr 2019.

Vancouver:

Dai W. Learning with Staleness. [Internet] [Thesis]. Carnegie Mellon University; 2018. [cited 2019 Apr 20]. Available from: http://repository.cmu.edu/dissertations/1209.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Dai W. Learning with Staleness. [Thesis]. Carnegie Mellon University; 2018. Available from: http://repository.cmu.edu/dissertations/1209

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of California – Berkeley

11. Rabbani, Tarek. Topics in Large-Scale Sparse Estimation and Control.

Degree: Mechanical Engineering, 2013, University of California – Berkeley

 In this thesis, we study two topics related to large-scale sparse estimation and control. In the first topic, we describe a method to eliminate features (variables) in… (more)

Subjects/Keywords: Engineering; Computer science; Classification; Large-Scale Optimization; Machine Learning


APA (6th Edition):

Rabbani, T. (2013). Topics in Large-Scale Sparse Estimation and Control. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/9bv5600v

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Rabbani, Tarek. “Topics in Large-Scale Sparse Estimation and Control.” 2013. Thesis, University of California – Berkeley. Accessed April 20, 2019. http://www.escholarship.org/uc/item/9bv5600v.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Rabbani, Tarek. “Topics in Large-Scale Sparse Estimation and Control.” 2013. Web. 20 Apr 2019.

Vancouver:

Rabbani T. Topics in Large-Scale Sparse Estimation and Control. [Internet] [Thesis]. University of California – Berkeley; 2013. [cited 2019 Apr 20]. Available from: http://www.escholarship.org/uc/item/9bv5600v.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Rabbani T. Topics in Large-Scale Sparse Estimation and Control. [Thesis]. University of California – Berkeley; 2013. Available from: http://www.escholarship.org/uc/item/9bv5600v

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Cornell University

12. Shrivastava, Anshumali. Probabilistic Hashing Techniques For Big Data.

Degree: 2015, Cornell University

 We investigate probabilistic hashing techniques for addressing computational and memory challenges in large scale machine learning and data mining systems. In this thesis, we show… (more)

Subjects/Keywords: Large Scale Machine Learning; Randomized Algorithms for Big-Data; Hashing, Sketching


APA (6th Edition):

Shrivastava, A. (2015). Probabilistic Hashing Techniques For Big Data . (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/40886

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Shrivastava, Anshumali. “Probabilistic Hashing Techniques For Big Data .” 2015. Thesis, Cornell University. Accessed April 20, 2019. http://hdl.handle.net/1813/40886.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Shrivastava, Anshumali. “Probabilistic Hashing Techniques For Big Data .” 2015. Web. 20 Apr 2019.

Vancouver:

Shrivastava A. Probabilistic Hashing Techniques For Big Data . [Internet] [Thesis]. Cornell University; 2015. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/1813/40886.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Shrivastava A. Probabilistic Hashing Techniques For Big Data . [Thesis]. Cornell University; 2015. Available from: http://hdl.handle.net/1813/40886

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of New South Wales

13. Maghrebi, Mojtaba. Using Machine Learning to Automatically Plan Concrete Delivery Dispatching.

Degree: Civil & Environmental Engineering, 2014, University of New South Wales

 Demand for concrete, regardless of the geographical location, is increasing globally. While the Ready Mixed Concrete (RMC) industry is facing an ever-increasing demand for concrete,… (more)

Subjects/Keywords: Large scale optimization; Concrete delivery; Machine learning; Experts’ decisions


APA (6th Edition):

Maghrebi, M. (2014). Using Machine Learning to Automatically Plan Concrete Delivery Dispatching. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/54231 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:13522/SOURCE02?view=true

Chicago Manual of Style (16th Edition):

Maghrebi, Mojtaba. “Using Machine Learning to Automatically Plan Concrete Delivery Dispatching.” 2014. Doctoral Dissertation, University of New South Wales. Accessed April 20, 2019. http://handle.unsw.edu.au/1959.4/54231 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:13522/SOURCE02?view=true.

MLA Handbook (7th Edition):

Maghrebi, Mojtaba. “Using Machine Learning to Automatically Plan Concrete Delivery Dispatching.” 2014. Web. 20 Apr 2019.

Vancouver:

Maghrebi M. Using Machine Learning to Automatically Plan Concrete Delivery Dispatching. [Internet] [Doctoral dissertation]. University of New South Wales; 2014. [cited 2019 Apr 20]. Available from: http://handle.unsw.edu.au/1959.4/54231 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:13522/SOURCE02?view=true.

Council of Science Editors:

Maghrebi M. Using Machine Learning to Automatically Plan Concrete Delivery Dispatching. [Doctoral Dissertation]. University of New South Wales; 2014. Available from: http://handle.unsw.edu.au/1959.4/54231 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:13522/SOURCE02?view=true


University of Minnesota

14. Mardani, Morteza. Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science.

Degree: PhD, Electrical/Computer Engineering, 2015, University of Minnesota

 We live in an era of "data deluge," with pervasive sensors collecting massive amounts of information on every bit of our lives, churning out enormous… (more)

Subjects/Keywords: Big data; Large-scale networks; learning; Low rank; Sparsity


APA (6th Edition):

Mardani, M. (2015). Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/174873

Chicago Manual of Style (16th Edition):

Mardani, Morteza. “Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science.” 2015. Doctoral Dissertation, University of Minnesota. Accessed April 20, 2019. http://hdl.handle.net/11299/174873.

MLA Handbook (7th Edition):

Mardani, Morteza. “Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science.” 2015. Web. 20 Apr 2019.

Vancouver:

Mardani M. Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science. [Internet] [Doctoral dissertation]. University of Minnesota; 2015. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/11299/174873.

Council of Science Editors:

Mardani M. Leveraging Sparsity and Low Rank for Large-Scale Networks and Data Science. [Doctoral Dissertation]. University of Minnesota; 2015. Available from: http://hdl.handle.net/11299/174873


Australian National University

15. Tan, Kang Yong. Essays on learning in international macroeconomics.

Degree: 2011, Australian National University

 The objective of this thesis is to explore the implications of learning as an alternative expectations formation mechanism in international macroeconomics. The first part… (more)

Subjects/Keywords: adaptive learning; international macroeconomic; large-scale macroeconomic models; transmission of shocks


APA (6th Edition):

Tan, K. Y. (2011). Essays on learning in international macroeconomics . (Thesis). Australian National University. Retrieved from http://hdl.handle.net/1885/6965

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tan, Kang Yong. “Essays on learning in international macroeconomics .” 2011. Thesis, Australian National University. Accessed April 20, 2019. http://hdl.handle.net/1885/6965.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tan, Kang Yong. “Essays on learning in international macroeconomics .” 2011. Web. 20 Apr 2019.

Vancouver:

Tan KY. Essays on learning in international macroeconomics . [Internet] [Thesis]. Australian National University; 2011. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/1885/6965.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tan KY. Essays on learning in international macroeconomics. [Thesis]. Australian National University; 2011. Available from: http://hdl.handle.net/1885/6965

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Penn State University

16. Montasser, Omar. Predicting Demographics of High-Resolution Geographies.

Degree: 2017, Penn State University

 In this thesis, we consider the problem of predicting demographics of geographic units given geotagged Tweets that are composed within these units. Traditional survey methods… (more)

Subjects/Keywords: geotagged social media; geography-based demographics prediction; large-scale supervised learning

APA (6th Edition):

Montasser, O. (2017). Predicting Demographics of High-Resolution Geographies. (Thesis). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/13882ovm5033

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Montasser, Omar. “Predicting Demographics of High-Resolution Geographies.” 2017. Thesis, Penn State University. Accessed April 20, 2019. https://etda.libraries.psu.edu/catalog/13882ovm5033.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Montasser, Omar. “Predicting Demographics of High-Resolution Geographies.” 2017. Web. 20 Apr 2019.

Vancouver:

Montasser O. Predicting Demographics of High-Resolution Geographies. [Internet] [Thesis]. Penn State University; 2017. [cited 2019 Apr 20]. Available from: https://etda.libraries.psu.edu/catalog/13882ovm5033.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Montasser O. Predicting Demographics of High-Resolution Geographies. [Thesis]. Penn State University; 2017. Available from: https://etda.libraries.psu.edu/catalog/13882ovm5033

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Temple University

17. Wang, Zhuang. Budgeted Online Kernel Classifiers for Large Scale Learning.

Degree: PhD, 2010, Temple University

Computer and Information Science

In the environment where new large scale problems are emerging in various disciplines and pervasive computing applications are becoming more common,… (more)

Subjects/Keywords: Computer Science; kernel method; large scale learning; online learning; perceptron; support vector machine

APA (6th Edition):

Wang, Z. (2010). Budgeted Online Kernel Classifiers for Large Scale Learning. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,89554

Chicago Manual of Style (16th Edition):

Wang, Zhuang. “Budgeted Online Kernel Classifiers for Large Scale Learning.” 2010. Doctoral Dissertation, Temple University. Accessed April 20, 2019. http://digital.library.temple.edu/u?/p245801coll10,89554.

MLA Handbook (7th Edition):

Wang, Zhuang. “Budgeted Online Kernel Classifiers for Large Scale Learning.” 2010. Web. 20 Apr 2019.

Vancouver:

Wang Z. Budgeted Online Kernel Classifiers for Large Scale Learning. [Internet] [Doctoral dissertation]. Temple University; 2010. [cited 2019 Apr 20]. Available from: http://digital.library.temple.edu/u?/p245801coll10,89554.

Council of Science Editors:

Wang Z. Budgeted Online Kernel Classifiers for Large Scale Learning. [Doctoral Dissertation]. Temple University; 2010. Available from: http://digital.library.temple.edu/u?/p245801coll10,89554


Temple University

18. Djuric, Nemanja. Big Data Algorithms for Visualization and Supervised Learning.

Degree: PhD, 2013, Temple University

Computer and Information Science

Explosive growth in data size, data complexity, and data rates, triggered by emergence of high-throughput technologies such as remote sensing, crowd-sourcing,… (more)

Subjects/Keywords: Computer science;

APA (6th Edition):

Djuric, N. (2013). Big Data Algorithms for Visualization and Supervised Learning. (Doctoral Dissertation). Temple University. Retrieved from http://digital.library.temple.edu/u?/p245801coll10,239445

Chicago Manual of Style (16th Edition):

Djuric, Nemanja. “Big Data Algorithms for Visualization and Supervised Learning.” 2013. Doctoral Dissertation, Temple University. Accessed April 20, 2019. http://digital.library.temple.edu/u?/p245801coll10,239445.

MLA Handbook (7th Edition):

Djuric, Nemanja. “Big Data Algorithms for Visualization and Supervised Learning.” 2013. Web. 20 Apr 2019.

Vancouver:

Djuric N. Big Data Algorithms for Visualization and Supervised Learning. [Internet] [Doctoral dissertation]. Temple University; 2013. [cited 2019 Apr 20]. Available from: http://digital.library.temple.edu/u?/p245801coll10,239445.

Council of Science Editors:

Djuric N. Big Data Algorithms for Visualization and Supervised Learning. [Doctoral Dissertation]. Temple University; 2013. Available from: http://digital.library.temple.edu/u?/p245801coll10,239445


University of California – Berkeley

19. Sparks, Evan Randall. End-to-End Large Scale Machine Learning with KeystoneML.

Degree: Computer Science, 2016, University of California – Berkeley

 The rise of data center computing and Internet-connected devices has led to an unparalleled explosion in the volumes of data collected across a multitude of… (more)

Subjects/Keywords: Computer science; advanced analytics; artificial intelligence; big data; distributed machine learning; large scale; machine learning

APA (6th Edition):

Sparks, E. R. (2016). End-to-End Large Scale Machine Learning with KeystoneML. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/4r73d9rh

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sparks, Evan Randall. “End-to-End Large Scale Machine Learning with KeystoneML.” 2016. Thesis, University of California – Berkeley. Accessed April 20, 2019. http://www.escholarship.org/uc/item/4r73d9rh.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sparks, Evan Randall. “End-to-End Large Scale Machine Learning with KeystoneML.” 2016. Web. 20 Apr 2019.

Vancouver:

Sparks ER. End-to-End Large Scale Machine Learning with KeystoneML. [Internet] [Thesis]. University of California – Berkeley; 2016. [cited 2019 Apr 20]. Available from: http://www.escholarship.org/uc/item/4r73d9rh.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sparks ER. End-to-End Large Scale Machine Learning with KeystoneML. [Thesis]. University of California – Berkeley; 2016. Available from: http://www.escholarship.org/uc/item/4r73d9rh

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

20. Gebremeskel, Ermias. Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment.

Degree: Electrical Engineering and Computer Science (EECS), 2018, KTH

Deep learning models' prediction accuracy tends to improve with the size of the model. The implication is that the amount of computational power needed… (more)

Subjects/Keywords: deep learning; large scale distributed deep learning; data parallelism; Computer Sciences; Datavetenskap (datalogi)

APA (6th Edition):

Gebremeskel, E. (2018). Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231350

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gebremeskel, Ermias. “Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment.” 2018. Thesis, KTH. Accessed April 20, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231350.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gebremeskel, Ermias. “Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment.” 2018. Web. 20 Apr 2019.

Vancouver:

Gebremeskel E. Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment. [Internet] [Thesis]. KTH; 2018. [cited 2019 Apr 20]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231350.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gebremeskel E. Analysis and Comparison of Distributed Training Techniques for Deep Neural Networks in a Dynamic Environment. [Thesis]. KTH; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231350

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Université de Grenoble

21. Babbar, Rohit. Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies.

Degree: Docteur es, Informatique, 2014, Université de Grenoble

 In the era of Big Data, developing efficient and scalable machine learning models that operate on terabytes of data is a necessity. In this… (more)

Subjects/Keywords: Apprentissage automatique; Classification à large échelle; Classification hiérarchique; Automatic Learning; Large-scale Classification; Hierarchical classification; 004

APA (6th Edition):

Babbar, R. (2014). Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies. (Doctoral Dissertation). Université de Grenoble. Retrieved from http://www.theses.fr/2014GRENM064

Chicago Manual of Style (16th Edition):

Babbar, Rohit. “Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies.” 2014. Doctoral Dissertation, Université de Grenoble. Accessed April 20, 2019. http://www.theses.fr/2014GRENM064.

MLA Handbook (7th Edition):

Babbar, Rohit. “Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies.” 2014. Web. 20 Apr 2019.

Vancouver:

Babbar R. Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies. [Internet] [Doctoral dissertation]. Université de Grenoble; 2014. [cited 2019 Apr 20]. Available from: http://www.theses.fr/2014GRENM064.

Council of Science Editors:

Babbar R. Machine Learning Strategies for Large-scale Taxonomies : Strategies d'apprentissage pour la classification dans les grandes taxonomies. [Doctoral Dissertation]. Université de Grenoble; 2014. Available from: http://www.theses.fr/2014GRENM064


UCLA

22. Grujic, Olivera. Computational Methods for Processing and Analyzing Large Scale Genomics Datasets.

Degree: Computer Science, 2016, UCLA

 This dissertation develops computational methods for analyzing large-scale genomic and epigenomic datasets. We developed a supervised machine learning approach to predict non-exonic evolutionarily conserved regions… (more)

Subjects/Keywords: Computer science; Bioinformatics; Computational; Genetics; Genomics; Large-scale Data; Machine Learning; Method

APA (6th Edition):

Grujic, O. (2016). Computational Methods for Processing and Analyzing Large Scale Genomics Datasets. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/5dp6x29f

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Grujic, Olivera. “Computational Methods for Processing and Analyzing Large Scale Genomics Datasets.” 2016. Thesis, UCLA. Accessed April 20, 2019. http://www.escholarship.org/uc/item/5dp6x29f.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Grujic, Olivera. “Computational Methods for Processing and Analyzing Large Scale Genomics Datasets.” 2016. Web. 20 Apr 2019.

Vancouver:

Grujic O. Computational Methods for Processing and Analyzing Large Scale Genomics Datasets. [Internet] [Thesis]. UCLA; 2016. [cited 2019 Apr 20]. Available from: http://www.escholarship.org/uc/item/5dp6x29f.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Grujic O. Computational Methods for Processing and Analyzing Large Scale Genomics Datasets. [Thesis]. UCLA; 2016. Available from: http://www.escholarship.org/uc/item/5dp6x29f

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


UCLA

23. Zhang, Qiang. Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data.

Degree: Statistics, 2015, UCLA

 Click-through rate (CTR) and conversion rate estimation are two core prediction tasks in online advertising. However, four major challenges emerged as data scientists trying… (more)

Subjects/Keywords: Statistics; Marketing; Data Mining; High Cardinality; Imbalanced Data; Large-scale Classification; Machine Learning; Online Advertising

APA (6th Edition):

Zhang, Q. (2015). Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/7mc0k1v8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Qiang. “Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data.” 2015. Thesis, UCLA. Accessed April 20, 2019. http://www.escholarship.org/uc/item/7mc0k1v8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Qiang. “Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data.” 2015. Web. 20 Apr 2019.

Vancouver:

Zhang Q. Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data. [Internet] [Thesis]. UCLA; 2015. [cited 2019 Apr 20]. Available from: http://www.escholarship.org/uc/item/7mc0k1v8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang Q. Modern Models for Learning Large-Scale Highly Skewed Online Advertising Data. [Thesis]. UCLA; 2015. Available from: http://www.escholarship.org/uc/item/7mc0k1v8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Columbia University

24. Jou, Brendan Wesley. Large-scale Affective Computing for Visual Multimedia.

Degree: 2016, Columbia University

 In recent years, Affective Computing has arisen as a prolific interdisciplinary field for engineering systems that integrate human affections. While human-computer relationships have long revolved… (more)

Subjects/Keywords: Computer vision; Affect (Psychology) – Computer simulation; Large scale systems; Machine learning; Electrical engineering; Computer science

APA (6th Edition):

Jou, B. W. (2016). Large-scale Affective Computing for Visual Multimedia. (Doctoral Dissertation). Columbia University. Retrieved from https://doi.org/10.7916/D8474B0B

Chicago Manual of Style (16th Edition):

Jou, Brendan Wesley. “Large-scale Affective Computing for Visual Multimedia.” 2016. Doctoral Dissertation, Columbia University. Accessed April 20, 2019. https://doi.org/10.7916/D8474B0B.

MLA Handbook (7th Edition):

Jou, Brendan Wesley. “Large-scale Affective Computing for Visual Multimedia.” 2016. Web. 20 Apr 2019.

Vancouver:

Jou BW. Large-scale Affective Computing for Visual Multimedia. [Internet] [Doctoral dissertation]. Columbia University; 2016. [cited 2019 Apr 20]. Available from: https://doi.org/10.7916/D8474B0B.

Council of Science Editors:

Jou BW. Large-scale Affective Computing for Visual Multimedia. [Doctoral Dissertation]. Columbia University; 2016. Available from: https://doi.org/10.7916/D8474B0B


University of Minnesota

25. Hu, Bin. A Robust Control Perspective on Optimization of Strongly-Convex Functions.

Degree: PhD, Aerospace Engineering and Mechanics, 2016, University of Minnesota

Large-scale optimization is a central topic in big data science. First-order black-box optimization methods have been widely applied in machine learning problems, since the oracle… (more)

Subjects/Keywords: Integral Quadratic Constraints; Large-Scale Optimization; Machine Learning; Robust Control; Stochastic Average Gradient; Stochastic Gradient

APA (6th Edition):

Hu, B. (2016). A Robust Control Perspective on Optimization of Strongly-Convex Functions. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/182283

Chicago Manual of Style (16th Edition):

Hu, Bin. “A Robust Control Perspective on Optimization of Strongly-Convex Functions.” 2016. Doctoral Dissertation, University of Minnesota. Accessed April 20, 2019. http://hdl.handle.net/11299/182283.

MLA Handbook (7th Edition):

Hu, Bin. “A Robust Control Perspective on Optimization of Strongly-Convex Functions.” 2016. Web. 20 Apr 2019.

Vancouver:

Hu B. A Robust Control Perspective on Optimization of Strongly-Convex Functions. [Internet] [Doctoral dissertation]. University of Minnesota; 2016. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/11299/182283.

Council of Science Editors:

Hu B. A Robust Control Perspective on Optimization of Strongly-Convex Functions. [Doctoral Dissertation]. University of Minnesota; 2016. Available from: http://hdl.handle.net/11299/182283


University of Illinois – Urbana-Champaign

26. Yu, Honghai. Learning compact hashing codes for large-scale similarity search.

Degree: PhD, Electrical & Computer Engr, 2015, University of Illinois – Urbana-Champaign

 Retrieval of similar objects is a key component in many applications. As databases grow larger, learning compact representations for efficient storage and fast search becomes… (more)

Subjects/Keywords: hashing; learning; similarity search; fingerprinting; content identification; robust hashing; large-scale; binary codes; multimedia retrieval

APA (6th Edition):

Yu, H. (2015). Learning compact hashing codes for large-scale similarity search. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/78338

Chicago Manual of Style (16th Edition):

Yu, Honghai. “Learning compact hashing codes for large-scale similarity search.” 2015. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed April 20, 2019. http://hdl.handle.net/2142/78338.

MLA Handbook (7th Edition):

Yu, Honghai. “Learning compact hashing codes for large-scale similarity search.” 2015. Web. 20 Apr 2019.

Vancouver:

Yu H. Learning compact hashing codes for large-scale similarity search. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2015. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/2142/78338.

Council of Science Editors:

Yu H. Learning compact hashing codes for large-scale similarity search. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2015. Available from: http://hdl.handle.net/2142/78338


University of Illinois – Urbana-Champaign

27. Huang, Po-Sen. Shallow and deep learning for audio and natural language processing.

Degree: PhD, Electrical & Computer Engr, 2015, University of Illinois – Urbana-Champaign

 Many machine learning algorithms can be viewed as optimization problems that seek the optimum hypothesis in a hypothesis space. To model the complex dependencies in… (more)

Subjects/Keywords: deep learning; large-scale kernel machines; monaural source separation; speech recognition; information retrieval

APA (6th Edition):

Huang, P. (2015). Shallow and deep learning for audio and natural language processing. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/78466

Chicago Manual of Style (16th Edition):

Huang, Po-Sen. “Shallow and deep learning for audio and natural language processing.” 2015. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed April 20, 2019. http://hdl.handle.net/2142/78466.

MLA Handbook (7th Edition):

Huang, Po-Sen. “Shallow and deep learning for audio and natural language processing.” 2015. Web. 20 Apr 2019.

Vancouver:

Huang P. Shallow and deep learning for audio and natural language processing. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2015. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/2142/78466.

Council of Science Editors:

Huang P. Shallow and deep learning for audio and natural language processing. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2015. Available from: http://hdl.handle.net/2142/78466


University of Arizona

28. Li, Qike. New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine.

Degree: 2017, University of Arizona

 Precision medicine provides targeted treatment for an individual patient based on disease mechanisms, promoting health care. Matched transcriptomes derived from a single subject enable uncovering… (more)

Subjects/Keywords: large scale inference; precision medicine; RNA-Seq; single-subject analysis; statistical learning; transcriptomics

APA (6th Edition):

Li, Q. (2017). New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine. (Doctoral Dissertation). University of Arizona. Retrieved from http://hdl.handle.net/10150/626305

Chicago Manual of Style (16th Edition):

Li, Qike. “New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine.” 2017. Doctoral Dissertation, University of Arizona. Accessed April 20, 2019. http://hdl.handle.net/10150/626305.

MLA Handbook (7th Edition):

Li, Qike. “New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine.” 2017. Web. 20 Apr 2019.

Vancouver:

Li Q. New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine. [Internet] [Doctoral dissertation]. University of Arizona; 2017. [cited 2019 Apr 20]. Available from: http://hdl.handle.net/10150/626305.

Council of Science Editors:

Li Q. New Statistical Methods of Single-subject Transcriptome Analysis for Precision Medicine. [Doctoral Dissertation]. University of Arizona; 2017. Available from: http://hdl.handle.net/10150/626305


University of Pennsylvania

29. Raja Ahmad, Raja Mohd Hafiz Affandi. Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes.

Degree: 2014, University of Pennsylvania

 Determinantal Point Processes (DPPs) are random point processes well-suited for modelling repulsion. In discrete settings, DPPs are a natural model for subset selection problems where… (more)

Subjects/Keywords: Determinantal Point Processes; diversity; large-scale; learning; point processes; repulsion; Computer Sciences; Statistics and Probability

APA (6th Edition):

Raja Ahmad, R. M. H. A. (2014). Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes. (Thesis). University of Pennsylvania. Retrieved from https://repository.upenn.edu/edissertations/1411

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Raja Ahmad, Raja Mohd Hafiz Affandi. “Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes.” 2014. Thesis, University of Pennsylvania. Accessed April 20, 2019. https://repository.upenn.edu/edissertations/1411.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Raja Ahmad, Raja Mohd Hafiz Affandi. “Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes.” 2014. Web. 20 Apr 2019.

Vancouver:

Raja Ahmad RMHA. Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes. [Internet] [Thesis]. University of Pennsylvania; 2014. [cited 2019 Apr 20]. Available from: https://repository.upenn.edu/edissertations/1411.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Raja Ahmad RMHA. Learning, Large Scale Inference, and Temporal Modeling of Determinantal Point Processes. [Thesis]. University of Pennsylvania; 2014. Available from: https://repository.upenn.edu/edissertations/1411

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Tennessee – Knoxville

30. Badara, Ioana Alexandra. Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses.

Degree: 2011, University of Tennessee – Knoxville

 Previous studies have reported high attrition rates in large-enrollment science courses where teacher-centered instruction was prevalent. The scientific literature provides strong evidence that student-centered teaching,… (more)

Subjects/Keywords: transformative learning; large-enrollment science courses; higher education; professional development; active learning; Higher Education and Teaching

APA (6th Edition):

Badara, I. A. (2011). Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses. (Doctoral Dissertation). University of Tennessee – Knoxville. Retrieved from https://trace.tennessee.edu/utk_graddiss/945

Chicago Manual of Style (16th Edition):

Badara, Ioana Alexandra. “Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses.” 2011. Doctoral Dissertation, University of Tennessee – Knoxville. Accessed April 20, 2019. https://trace.tennessee.edu/utk_graddiss/945.

MLA Handbook (7th Edition):

Badara, Ioana Alexandra. “Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses.” 2011. Web. 20 Apr 2019.

Vancouver:

Badara IA. Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses. [Internet] [Doctoral dissertation]. University of Tennessee – Knoxville; 2011. [cited 2019 Apr 20]. Available from: https://trace.tennessee.edu/utk_graddiss/945.

Council of Science Editors:

Badara IA. Using Transformative Learning Theory to Investigate Ways to Enrich University Teaching: Focus on the Implementation of Student-Centered Teaching in Large Introductory Science Courses. [Doctoral Dissertation]. University of Tennessee – Knoxville; 2011. Available from: https://trace.tennessee.edu/utk_graddiss/945
