
You searched for subject:(continual learning). Showing records 1 – 20 of 20 total matches.

No search limiters apply to these results.


University of Waterloo

1. Gaurav, Ashish. Safety-Oriented Stability Biases for Continual Learning.

Degree: 2020, University of Waterloo

Continual learning is often confounded by “catastrophic forgetting,” which prevents neural networks from learning tasks sequentially. In the case of real-world classification systems that…
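
Catastrophic forgetting is easy to reproduce. The following minimal sketch (an editorial illustration on synthetic data, assuming PyTorch; it is not code from this thesis) trains a small classifier on task A, then on task B, and prints how task-A accuracy collapses:

```python
# Minimal catastrophic-forgetting demo on synthetic data (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(offset):
    # Two Gaussian blobs; the offset shifts the decision boundary per task.
    x = torch.randn(512, 2) + offset
    y = (x[:, 0] + x[:, 1] > 2 * offset).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

task_a, task_b = make_task(0.0), make_task(3.0)
for x, y in (task_a, task_b):  # train on A, then on B, with no replay
    for _ in range(200):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    print("task A accuracy:", accuracy(model, *task_a))  # drops after task B
```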

Subjects/Keywords: deep learning; continual learning; classification; reinforcement learning


APA (6th Edition):

Gaurav, A. (2020). Safety-Oriented Stability Biases for Continual Learning. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/15579

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gaurav, Ashish. “Safety-Oriented Stability Biases for Continual Learning.” 2020. Thesis, University of Waterloo. Accessed February 27, 2021. http://hdl.handle.net/10012/15579.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gaurav, Ashish. “Safety-Oriented Stability Biases for Continual Learning.” 2020. Web. 27 Feb 2021.

Vancouver:

Gaurav A. Safety-Oriented Stability Biases for Continual Learning. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/10012/15579.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gaurav A. Safety-Oriented Stability Biases for Continual Learning. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/15579

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Waterloo

2. El Khatib, Alaa. Continual Learning and Forgetting in Deep Learning Models.

Degree: 2020, University of Waterloo

Continual learning is a framework of learning in which we aim to move beyond the limitations of standard isolated optimization of deep learning models toward…

Subjects/Keywords: deep learning; continual learning; catastrophic forgetting


APA (6th Edition):

El Khatib, A. (2020). Continual Learning and Forgetting in Deep Learning Models. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/16544

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

El Khatib, Alaa. “Continual Learning and Forgetting in Deep Learning Models.” 2020. Thesis, University of Waterloo. Accessed February 27, 2021. http://hdl.handle.net/10012/16544.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

El Khatib, Alaa. “Continual Learning and Forgetting in Deep Learning Models.” 2020. Web. 27 Feb 2021.

Vancouver:

El Khatib A. Continual Learning and Forgetting in Deep Learning Models. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/10012/16544.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

El Khatib A. Continual Learning and Forgetting in Deep Learning Models. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/16544

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rochester Institute of Technology

3. Pandit, Tej. Relational Neurogenesis for Lifelong Learning Agents.

Degree: MS, Computer Engineering, 2019, Rochester Institute of Technology

Reinforcement learning systems have shown tremendous potential in being able to model meritorious behavior in virtual agents and robots. The ability to learn through…

Subjects/Keywords: Artificial intelligence; Continual learning; Lifelong learning; Neural networks; Neurogenesis; Reinforcement learning


APA (6th Edition):

Pandit, T. (2019). Relational Neurogenesis for Lifelong Learning Agents. (Masters Thesis). Rochester Institute of Technology. Retrieved from https://scholarworks.rit.edu/theses/10096

Chicago Manual of Style (16th Edition):

Pandit, Tej. “Relational Neurogenesis for Lifelong Learning Agents.” 2019. Masters Thesis, Rochester Institute of Technology. Accessed February 27, 2021. https://scholarworks.rit.edu/theses/10096.

MLA Handbook (7th Edition):

Pandit, Tej. “Relational Neurogenesis for Lifelong Learning Agents.” 2019. Web. 27 Feb 2021.

Vancouver:

Pandit T. Relational Neurogenesis for Lifelong Learning Agents. [Internet] [Masters thesis]. Rochester Institute of Technology; 2019. [cited 2021 Feb 27]. Available from: https://scholarworks.rit.edu/theses/10096.

Council of Science Editors:

Pandit T. Relational Neurogenesis for Lifelong Learning Agents. [Masters Thesis]. Rochester Institute of Technology; 2019. Available from: https://scholarworks.rit.edu/theses/10096


Tampere University

4. Khan, Amna. Comparison of machine learning approaches for classification of invoices.

Degree: 2020, Tampere University

Machine learning has become one of the leading sciences governing the modern world. Various disciplines, specifically neural networks, have recently gained a lot of attention due…

Subjects/Keywords: Machine Learning ; Invoice prediction ; Neural Networks ; Multi-task learning ; Continual Learning ; Deep Learning in Finance


APA (6th Edition):

Khan, A. (2020). Comparison of machine learning approaches for classification of invoices. (Masters Thesis). Tampere University. Retrieved from https://trepo.tuni.fi/handle/10024/120493

Chicago Manual of Style (16th Edition):

Khan, Amna. “Comparison of machine learning approaches for classification of invoices.” 2020. Masters Thesis, Tampere University. Accessed February 27, 2021. https://trepo.tuni.fi/handle/10024/120493.

MLA Handbook (7th Edition):

Khan, Amna. “Comparison of machine learning approaches for classification of invoices.” 2020. Web. 27 Feb 2021.

Vancouver:

Khan A. Comparison of machine learning approaches for classification of invoices. [Internet] [Masters thesis]. Tampere University; 2020. [cited 2021 Feb 27]. Available from: https://trepo.tuni.fi/handle/10024/120493.

Council of Science Editors:

Khan A. Comparison of machine learning approaches for classification of invoices. [Masters Thesis]. Tampere University; 2020. Available from: https://trepo.tuni.fi/handle/10024/120493


University of Guelph

5. Thangarasa, Vithursan. Differentiable Hebbian Consolidation for Continual Lifelong Learning.

Degree: Master of Applied Science, School of Engineering, 2019, University of Guelph

Catastrophic forgetting poses a grand challenge for continual learning systems. It prevents neural network models from protecting previously learned knowledge while learning new tasks in…
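
For context, “Hebbian” methods in this area typically build on differentiable plasticity (Miconi et al., 2018), in which each connection combines a slow, gradient-trained weight with a fast Hebbian trace. Below is a minimal sketch of such a layer, assuming PyTorch and simplified by detaching the trace; the thesis's exact consolidation scheme may differ:

```python
# Hedged sketch of a plastic (Hebbian) linear layer; illustrative only.
import torch
import torch.nn as nn

class HebbianLinear(nn.Module):
    def __init__(self, n_in, n_out, eta=0.1):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_out, n_in) * 0.01)  # slow weights
        self.alpha = nn.Parameter(torch.zeros(n_out, n_in))     # per-weight plasticity
        self.eta = eta                                          # trace update rate
        self.register_buffer("hebb", torch.zeros(n_out, n_in))  # fast Hebbian trace

    def forward(self, x):
        # Effective weight = slow component + gated fast trace.
        y = x @ (self.w + self.alpha * self.hebb).t()
        # Hebbian update: decay the trace toward the batch-mean outer product.
        # (Detached here for simplicity; the original formulation keeps the
        # trace inside the computation graph.)
        outer = torch.einsum("bo,bi->oi", y, x) / x.shape[0]
        self.hebb = (1 - self.eta) * self.hebb + self.eta * outer.detach()
        return y
```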

Subjects/Keywords: Deep Learning; Machine Learning; Neural Networks; Lifelong Learning; Continual Learning; Computer Vision; Neuroplasticity


APA (6th Edition):

Thangarasa, V. (2019). Differentiable Hebbian Consolidation for Continual Lifelong Learning. (Masters Thesis). University of Guelph. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/17440

Chicago Manual of Style (16th Edition):

Thangarasa, Vithursan. “Differentiable Hebbian Consolidation for Continual Lifelong Learning.” 2019. Masters Thesis, University of Guelph. Accessed February 27, 2021. https://atrium.lib.uoguelph.ca/xmlui/handle/10214/17440.

MLA Handbook (7th Edition):

Thangarasa, Vithursan. “Differentiable Hebbian Consolidation for Continual Lifelong Learning.” 2019. Web. 27 Feb 2021.

Vancouver:

Thangarasa V. Differentiable Hebbian Consolidation for Continual Lifelong Learning. [Internet] [Masters thesis]. University of Guelph; 2019. [cited 2021 Feb 27]. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/17440.

Council of Science Editors:

Thangarasa V. Differentiable Hebbian Consolidation for Continual Lifelong Learning. [Masters Thesis]. University of Guelph; 2019. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/17440


University of Michigan

6. Lee, Kibok. Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning.

Degree: PhD, Computer Science & Engineering, 2020, University of Michigan

Deep neural networks have shown superior performance in many learning problems by learning hierarchical latent representations from a large amount of labeled data. However,…

Subjects/Keywords: Machine Learning; Deep Learning; Novelty Detection; Continual Learning; Domain Generalization; Representation Learning; Computer Science; Engineering


APA (6th Edition):

Lee, K. (2020). Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning. (Doctoral Dissertation). University of Michigan. Retrieved from http://hdl.handle.net/2027.42/162981

Chicago Manual of Style (16th Edition):

Lee, Kibok. “Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning.” 2020. Doctoral Dissertation, University of Michigan. Accessed February 27, 2021. http://hdl.handle.net/2027.42/162981.

MLA Handbook (7th Edition):

Lee, Kibok. “Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning.” 2020. Web. 27 Feb 2021.

Vancouver:

Lee K. Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning. [Internet] [Doctoral dissertation]. University of Michigan; 2020. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/2027.42/162981.

Council of Science Editors:

Lee K. Robust Deep Learning in the Open World with Lifelong Learning and Representation Learning. [Doctoral Dissertation]. University of Michigan; 2020. Available from: http://hdl.handle.net/2027.42/162981


University of Illinois – Urbana-Champaign

7. Li, Zhizhong. Knowledge transfer in vision tasks with incomplete data.

Degree: PhD, Computer Science, 2020, University of Illinois – Urbana-Champaign

In many machine learning applications, some assumptions are so prevalent as to be left unwritten: all necessary data are available throughout the training process, the…

Subjects/Keywords: Knowledge transfer; incomplete data; transfer learning; continual learning; deep learning; computer vision


APA (6th Edition):

Li, Z. (2020). Knowledge transfer in vision tasks with incomplete data. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/107956

Chicago Manual of Style (16th Edition):

Li, Zhizhong. “Knowledge transfer in vision tasks with incomplete data.” 2020. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed February 27, 2021. http://hdl.handle.net/2142/107956.

MLA Handbook (7th Edition):

Li, Zhizhong. “Knowledge transfer in vision tasks with incomplete data.” 2020. Web. 27 Feb 2021.

Vancouver:

Li Z. Knowledge transfer in vision tasks with incomplete data. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2020. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/2142/107956.

Council of Science Editors:

Li Z. Knowledge transfer in vision tasks with incomplete data. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2020. Available from: http://hdl.handle.net/2142/107956


University of Pretoria

8. Boshoff, Annette. Professional development of academic staff in private higher education: Living theory: Mentorship.

Degree: PhD, Humanities Education, 2014, University of Pretoria

A common phenomenon in the private higher education environment is that lecturers are highly qualified subject specialists and conduct research mainly in areas in their…

Subjects/Keywords: Continual professional development; Herrmann Whole Brain®; Innovative facilitating learning; Learning style; Lifelong learning; UCTD; Posters; Sing-along learning


APA (6th Edition):

Boshoff, A. (2014). Professional development of academic staff in private higher education: Living theory: Mentorship. (Doctoral Dissertation). University of Pretoria. Retrieved from http://hdl.handle.net/2263/43272

Chicago Manual of Style (16th Edition):

Boshoff, Annette. “Professional development of academic staff in private higher education: Living theory: Mentorship.” 2014. Doctoral Dissertation, University of Pretoria. Accessed February 27, 2021. http://hdl.handle.net/2263/43272.

MLA Handbook (7th Edition):

Boshoff, Annette. “Professional development of academic staff in private higher education: Living theory: Mentorship.” 2014. Web. 27 Feb 2021.

Vancouver:

Boshoff A. Professional development of academic staff in private higher education: Living theory: Mentorship. [Internet] [Doctoral dissertation]. University of Pretoria; 2014. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/2263/43272.

Council of Science Editors:

Boshoff A. Professional development of academic staff in private higher education: Living theory: Mentorship. [Doctoral Dissertation]. University of Pretoria; 2014. Available from: http://hdl.handle.net/2263/43272


RMIT University

9. Fayek, H. Continual deep learning via progressive learning.

Degree: 2018, RMIT University

Machine learning is one of several approaches to artificial intelligence. It allows us to build machines that can learn from experience as opposed to being…

Subjects/Keywords: Fields of Research; Artificial Intelligence; Machine Learning; Deep Learning; Neural Networks; Continual Learning; Machine Perception; Computer Vision; Speech Recognition


APA (6th Edition):

Fayek, H. (2018). Continual deep learning via progressive learning. (Thesis). RMIT University. Retrieved from http://researchbank.rmit.edu.au/view/rmit:162646

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Fayek, H. “Continual deep learning via progressive learning.” 2018. Thesis, RMIT University. Accessed February 27, 2021. http://researchbank.rmit.edu.au/view/rmit:162646.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Fayek, H. “Continual deep learning via progressive learning.” 2018. Web. 27 Feb 2021.

Vancouver:

Fayek H. Continual deep learning via progressive learning. [Internet] [Thesis]. RMIT University; 2018. [cited 2021 Feb 27]. Available from: http://researchbank.rmit.edu.au/view/rmit:162646.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Fayek H. Continual deep learning via progressive learning. [Thesis]. RMIT University; 2018. Available from: http://researchbank.rmit.edu.au/view/rmit:162646

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

10. Lesort, Timothée. Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données.

Degree: Docteur es, Informatique, données, IA, 2020, Institut polytechnique de Paris

Humans learn throughout their lives. They accumulate knowledge from a succession of learning experiences and memorize the essential aspects without forgetting them.…

Subjects/Keywords: Apprentissage profond; Apprentissage Continu; Régénération; Méthodes de Rejeu; Robotique; Deep Learning; Continual Learning; Generative Replay.; Replay Processes; Robotics


APA (6th Edition):

Lesort, T. (2020). Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données. (Doctoral Dissertation). Institut polytechnique de Paris. Retrieved from http://www.theses.fr/2020IPPAE003

Chicago Manual of Style (16th Edition):

Lesort, Timothée. “Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données.” 2020. Doctoral Dissertation, Institut polytechnique de Paris. Accessed February 27, 2021. http://www.theses.fr/2020IPPAE003.

MLA Handbook (7th Edition):

Lesort, Timothée. “Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données.” 2020. Web. 27 Feb 2021.

Vancouver:

Lesort T. Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données. [Internet] [Doctoral dissertation]. Institut polytechnique de Paris; 2020. [cited 2021 Feb 27]. Available from: http://www.theses.fr/2020IPPAE003.

Council of Science Editors:

Lesort T. Continual Learning : Tackling Catastrophic Forgetting in Deep Neural Networks with Replay Processes : Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données. [Doctoral Dissertation]. Institut polytechnique de Paris; 2020. Available from: http://www.theses.fr/2020IPPAE003


Linnaeus University

11. Strutynskiy, Maksym. A concept of an intent-based contextual chat-bot with capabilities for continual learning.

Degree: computer science and media technology (CM), 2020, Linnaeus University

Chat-bots are computer programs designed to conduct textual or audible conversations with a single user. The job of a chat-bot is to be able…

Subjects/Keywords: Machine learning; intent based; chat-bot; dialogue systems; rule based; Python; TensorFlow; TFLearn; continual learning; online learning; supervised learning; unsupervised learning; IBM Watson; Watson Assistant; Computer Sciences; Datavetenskap (datalogi)


APA (6th Edition):

Strutynskiy, M. (2020). A concept of an intent-based contextual chat-bot with capabilities for continual learning. (Thesis). Linnaeus University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-99102

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Strutynskiy, Maksym. “A concept of an intent-based contextual chat-bot with capabilities for continual learning.” 2020. Thesis, Linnaeus University. Accessed February 27, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-99102.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Strutynskiy, Maksym. “A concept of an intent-based contextual chat-bot with capabilities for continual learning.” 2020. Web. 27 Feb 2021.

Vancouver:

Strutynskiy M. A concept of an intent-based contextual chat-bot with capabilities for continual learning. [Internet] [Thesis]. Linnaeus University; 2020. [cited 2021 Feb 27]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-99102.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Strutynskiy M. A concept of an intent-based contextual chat-bot with capabilities for continual learning. [Thesis]. Linnaeus University; 2020. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-99102

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Otago

12. Atkinson, Craig Robert. Achieving continual learning in deep neural networks through pseudo-rehearsal.

Degree: University of Otago

Neural networks are very powerful computational models, capable of outperforming humans on a variety of tasks. However, unlike humans, these networks tend to catastrophically forget…
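
Pseudo-rehearsal in its classic form (Robins, 1995) avoids storing old data: a frozen copy of the network labels random probe inputs, and these pseudo-items are mixed into new-task batches. A hedged sketch of that baseline, assuming PyTorch (the keywords above suggest the thesis pairs the idea with generative adversarial networks, which this sketch does not implement):

```python
# Baseline pseudo-rehearsal sketch; illustrative only.
import copy
import torch
import torch.nn as nn

def pseudo_items(frozen_model, n, input_dim):
    x = torch.randn(n, input_dim)          # random probe inputs, no stored data
    with torch.no_grad():
        y = frozen_model(x).argmax(1)      # labels from the old network itself
    return x, y

def train_with_rehearsal(model, new_x, new_y, steps=200, lr=1e-2):
    frozen = copy.deepcopy(model).eval()   # snapshot taken before the new task
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        px, py = pseudo_items(frozen, len(new_x), new_x.shape[1])
        x, y = torch.cat([new_x, px]), torch.cat([new_y, py])
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
```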

Subjects/Keywords: Deep Reinforcement Learning; Pseudo-Rehearsal; Catastrophic Forgetting; Generative Adversarial Network; Continual Learning


APA (6th Edition):

Atkinson, C. R. (n.d.). Achieving continual learning in deep neural networks through pseudo-rehearsal. (Doctoral Dissertation). University of Otago. Retrieved from http://hdl.handle.net/10523/10385

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Chicago Manual of Style (16th Edition):

Atkinson, Craig Robert. “Achieving continual learning in deep neural networks through pseudo-rehearsal.” Doctoral Dissertation, University of Otago. Accessed February 27, 2021. http://hdl.handle.net/10523/10385.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

MLA Handbook (7th Edition):

Atkinson, Craig Robert. “Achieving continual learning in deep neural networks through pseudo-rehearsal.” Web. 27 Feb 2021.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Vancouver:

Atkinson CR. Achieving continual learning in deep neural networks through pseudo-rehearsal. [Internet] [Doctoral dissertation]. University of Otago; [cited 2021 Feb 27]. Available from: http://hdl.handle.net/10523/10385.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Council of Science Editors:

Atkinson CR. Achieving continual learning in deep neural networks through pseudo-rehearsal. [Doctoral Dissertation]. University of Otago; Available from: http://hdl.handle.net/10523/10385

Note: this citation may be lacking information needed for this citation format:
No year of publication.


Universitat Autònoma de Barcelona

13. Liu, Xialei. Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains.

Degree: Departament de Ciències de la Computació, 2019, Universitat Autònoma de Barcelona

Deep convolutional neural networks (CNNs) have achieved superior performance in many visual recognition applications, such as image classification, detection and segmentation. In this thesis we…

Subjects/Keywords: Reconeixement visual; Reconocimiento visual; Visual recognition; Aprenentatge auto-supervisat; Aprendizaje auto-supervisado; Self-supervised learning; Aprenentatge continu; Aprendizaje continuo; Continual learning; Ciències Experimentals; 004


APA (6th Edition):

Liu, X. (2019). Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains. (Thesis). Universitat Autònoma de Barcelona. Retrieved from http://hdl.handle.net/10803/670154

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Xialei. “Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains.” 2019. Thesis, Universitat Autònoma de Barcelona. Accessed February 27, 2021. http://hdl.handle.net/10803/670154.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Xialei. “Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains.” 2019. Web. 27 Feb 2021.

Vancouver:

Liu X. Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains. [Internet] [Thesis]. Universitat Autònoma de Barcelona; 2019. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/10803/670154.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu X. Visual recognition in the wild: learning from rankings in small domains and continual learning in new domains. [Thesis]. Universitat Autònoma de Barcelona; 2019. Available from: http://hdl.handle.net/10803/670154

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

14. Elers, Andreas. Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation.

Degree: Electrical Engineering and Computer Science (EECS), 2019, KTH

The field of machine learning currently draws massive attention due to advancements and successful applications announced in the last few years. One of…
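
Elastic weight consolidation, named in this title, usually refers to the Kirkpatrick et al. (2017) formulation: after finishing a task, store each parameter's value and a diagonal Fisher-information estimate, then penalize later training for moving important weights. A generic sketch under those assumptions (not this thesis's implementation; the λ value is arbitrary):

```python
# Generic EWC penalty sketch; illustrative only.
import torch

def fisher_diagonal(model, batches, loss_fn):
    """Diagonal Fisher estimate: average squared gradient over (x, y) batches."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in batches:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            fisher[n] += p.grad.detach() ** 2 / len(batches)
    return fisher

def ewc_penalty(model, old_params, fisher, lam=100.0):
    """Quadratic pull toward the parameters that solved the previous task."""
    return lam / 2 * sum(
        (fisher[n] * (p - old_params[n]) ** 2).sum()
        for n, p in model.named_parameters()
    )

# After task A:  old = {n: p.detach().clone() for n, p in model.named_parameters()}
#                fish = fisher_diagonal(model, task_a_batches, loss_fn)
# On task B:     loss = loss_fn(model(x), y) + ewc_penalty(model, old, fish)
```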

Subjects/Keywords: Elasticweight consolidation; SafeDAGGER; DAGGER; Rehearsal buffer; Self-driving vehicle; Continual learning; Elastisk viktkonsolidering; SafeDAGGER; DAGGER; Repeteringsbuffert; Självkörande fordon; Stegvis inlärning; Computer and Information Sciences; Data- och informationsvetenskap


APA (6th Edition):

Elers, A. (2019). Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-256074

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Elers, Andreas. “Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation.” 2019. Thesis, KTH. Accessed February 27, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-256074.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Elers, Andreas. “Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation.” 2019. Web. 27 Feb 2021.

Vancouver:

Elers A. Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation. [Internet] [Thesis]. KTH; 2019. [cited 2021 Feb 27]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-256074.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Elers A. Continual imitation learning: Enhancing safe data set aggregation with elastic weight consolidation. [Thesis]. KTH; 2019. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-256074

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

15. Lokegaonkar, Sanket Avinash. Continual Learning for Deep Dense Prediction.

Degree: MS, Computer Science and Applications, 2018, Virginia Tech

Transferring a deep learning model from old tasks to a new one is known to suffer from catastrophic forgetting effects. Such a forgetting mechanism is…

Subjects/Keywords: Computer Vision; Continual Learning; Image Segmentation; Dense Prediction


APA (6th Edition):

Lokegaonkar, S. A. (2018). Continual Learning for Deep Dense Prediction. (Masters Thesis). Virginia Tech. Retrieved from http://hdl.handle.net/10919/83513

Chicago Manual of Style (16th Edition):

Lokegaonkar, Sanket Avinash. “Continual Learning for Deep Dense Prediction.” 2018. Masters Thesis, Virginia Tech. Accessed February 27, 2021. http://hdl.handle.net/10919/83513.

MLA Handbook (7th Edition):

Lokegaonkar, Sanket Avinash. “Continual Learning for Deep Dense Prediction.” 2018. Web. 27 Feb 2021.

Vancouver:

Lokegaonkar SA. Continual Learning for Deep Dense Prediction. [Internet] [Masters thesis]. Virginia Tech; 2018. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/10919/83513.

Council of Science Editors:

Lokegaonkar SA. Continual Learning for Deep Dense Prediction. [Masters Thesis]. Virginia Tech; 2018. Available from: http://hdl.handle.net/10919/83513

16. Besedin, Andrey. Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension.

Degree: Docteur es, Informatique, 2019, Paris, CNAM

In this thesis, we propose a new deep learning approach for the classification of high-dimensional data streams. Over the last few…

Subjects/Keywords: Apprentissage profond; Apprentissage continu; Flux de données; Oubli catastrophique; Modèles génératifs; Classification de données; Reseaux de Neurones; Apprentissage Incrémental; Deep learning; Continual learning; Data streams; Catastrophic forgetting; Generative models; Classification; Neural Networks; Incremental Learning; 006.32


APA (6th Edition):

Besedin, A. (2019). Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension. (Doctoral Dissertation). Paris, CNAM. Retrieved from http://www.theses.fr/2019CNAM1263

Chicago Manual of Style (16th Edition):

Besedin, Andrey. “Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension.” 2019. Doctoral Dissertation, Paris, CNAM. Accessed February 27, 2021. http://www.theses.fr/2019CNAM1263.

MLA Handbook (7th Edition):

Besedin, Andrey. “Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension.” 2019. Web. 27 Feb 2021.

Vancouver:

Besedin A. Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension. [Internet] [Doctoral dissertation]. Paris, CNAM; 2019. [cited 2021 Feb 27]. Available from: http://www.theses.fr/2019CNAM1263.

Council of Science Editors:

Besedin A. Continual forgetting-free deep learning from high-dimensional data streams : L'apprentissage profond continu sans oubli sur les flux de données de haute dimension. [Doctoral Dissertation]. Paris, CNAM; 2019. Available from: http://www.theses.fr/2019CNAM1263

17. Lightheart, Toby Asher. Constructive spiking neural networks for simulations of neuroplasticity.

Degree: 2018, University of Adelaide

Artificial neural networks are important tools in machine learning and neuroscience; however, a difficult step in their implementation is the selection of the neural network…

Subjects/Keywords: constructive neural networks; spiking neurons; neural simulation; neuroplasticity; STDP; pattern detection; one-shot learning; continual learning


APA (6th Edition):

Lightheart, T. A. (2018). Constructive spiking neural networks for simulations of neuroplasticity. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/115481

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lightheart, Toby Asher. “Constructive spiking neural networks for simulations of neuroplasticity.” 2018. Thesis, University of Adelaide. Accessed February 27, 2021. http://hdl.handle.net/2440/115481.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lightheart, Toby Asher. “Constructive spiking neural networks for simulations of neuroplasticity.” 2018. Web. 27 Feb 2021.

Vancouver:

Lightheart TA. Constructive spiking neural networks for simulations of neuroplasticity. [Internet] [Thesis]. University of Adelaide; 2018. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/2440/115481.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lightheart TA. Constructive spiking neural networks for simulations of neuroplasticity. [Thesis]. University of Adelaide; 2018. Available from: http://hdl.handle.net/2440/115481

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Arizona State University

18. Zhang, Jie. Towards Robust Machine Learning Models for Data Scarcity.

Degree: Computer Science, 2020, Arizona State University

Subjects/Keywords: Computer science; Computer Vision; Continual Learning; Data Scarcity; Medical Image Analysis; Modeling Disease Progression; Sparse Coding


APA (6th Edition):

Zhang, J. (2020). Towards Robust Machine Learning Models for Data Scarcity. (Doctoral Dissertation). Arizona State University. Retrieved from http://repository.asu.edu/items/57014

Chicago Manual of Style (16th Edition):

Zhang, Jie. “Towards Robust Machine Learning Models for Data Scarcity.” 2020. Doctoral Dissertation, Arizona State University. Accessed February 27, 2021. http://repository.asu.edu/items/57014.

MLA Handbook (7th Edition):

Zhang, Jie. “Towards Robust Machine Learning Models for Data Scarcity.” 2020. Web. 27 Feb 2021.

Vancouver:

Zhang J. Towards Robust Machine Learning Models for Data Scarcity. [Internet] [Doctoral dissertation]. Arizona State University; 2020. [cited 2021 Feb 27]. Available from: http://repository.asu.edu/items/57014.

Council of Science Editors:

Zhang J. Towards Robust Machine Learning Models for Data Scarcity. [Doctoral Dissertation]. Arizona State University; 2020. Available from: http://repository.asu.edu/items/57014

19. Söderström, Peter. Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn.

Degree: Faculty of Arts and Sciences, 2016, Linköping University

Background: A customer does not buy goods and services that create no value, which is something companies in the market must take into account. Beyond…

Subjects/Keywords: Value creation; continual learning; resources; maturity assessment; Organizational IQ-test; Värdeskapande; kontinuerligt lärande; resurser; mognadsgrad; Organizational IQtest; Business Administration; Företagsekonomi


APA (6th Edition):

Söderström, P. (2016). Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn. (Thesis). Linköping University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-131999

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Söderström, Peter. “Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn.” 2016. Thesis, Linköping University. Accessed February 27, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-131999.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Söderström, Peter. “Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn.” 2016. Web. 27 Feb 2021.

Vancouver:

Söderström P. Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn. [Internet] [Thesis]. Linköping University; 2016. [cited 2021 Feb 27]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-131999.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Söderström P. Mognadsgraden för värdeskapande och kontinuerligt lärande : En studie om internt utvecklingsarbete inom den privata tjänstesektorn. [Thesis]. Linköping University; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-131999

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Université de Montréal

20. Gupta, Gunshi. Look-ahead meta-learning for continual learning.

Degree: 2020, Université de Montréal

Subjects/Keywords: Meta Learning; Continual Learning; Machine Learning; Apprentissage tout au long de la vie; E-learning; Méta-apprentissage; Modulation du taux d’apprentissage; Online learning; Learning rate modulation; Applied Sciences - Artificial Intelligence / Sciences appliqués et technologie - Intelligence artificielle (UMI : 0800)


APA (6th Edition):

Gupta, G. (2020). Look-ahead meta-learning for continual learning. (Thesis). Université de Montréal. Retrieved from http://hdl.handle.net/1866/24315

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gupta, Gunshi. “Look-ahead meta-learning for continual learning.” 2020. Thesis, Université de Montréal. Accessed February 27, 2021. http://hdl.handle.net/1866/24315.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gupta, Gunshi. “Look-ahead meta-learning for continual learning.” 2020. Web. 27 Feb 2021.

Vancouver:

Gupta G. Look-ahead meta-learning for continual learning. [Internet] [Thesis]. Université de Montréal; 2020. [cited 2021 Feb 27]. Available from: http://hdl.handle.net/1866/24315.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gupta G. Look-ahead meta-learning for continual learning. [Thesis]. Université de Montréal; 2020. Available from: http://hdl.handle.net/1866/24315

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
