You searched for subject:(Deep learning). Showing records 1 – 30 of 1793 total matches.


Oregon State University

1. Ghaeini, Mohammad Reza. Event Detection with Forward-Backward Recurrent Neural Networks.

Degree: MS, 2017, Oregon State University

 Automatic event extraction from natural text is an important and challenging task for natural language understanding. Traditional event detection methods heavily rely on manually engineered… (more)

Subjects/Keywords: Deep Learning

APA (6th Edition):

Ghaeini, M. R. (2017). Event Detection with Forward-Backward Recurrent Neural Networks. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/61576

Chicago Manual of Style (16th Edition):

Ghaeini, Mohammad Reza. “Event Detection with Forward-Backward Recurrent Neural Networks.” 2017. Masters Thesis, Oregon State University. Accessed February 27, 2020. http://hdl.handle.net/1957/61576.

MLA Handbook (7th Edition):

Ghaeini, Mohammad Reza. “Event Detection with Forward-Backward Recurrent Neural Networks.” 2017. Web. 27 Feb 2020.

Vancouver:

Ghaeini MR. Event Detection with Forward-Backward Recurrent Neural Networks. [Internet] [Masters thesis]. Oregon State University; 2017. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/1957/61576.

Council of Science Editors:

Ghaeini MR. Event Detection with Forward-Backward Recurrent Neural Networks. [Masters Thesis]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/61576
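
Record 1 above concerns event detection with forward-backward (bidirectional) recurrent networks. As general orientation only, and not the architecture actually proposed in the thesis, a bidirectional LSTM that tags each token of a sentence with an event label might be sketched as follows; the vocabulary size, layer sizes, and label count are all hypothetical.

```python
# Hypothetical sketch of a forward-backward (bidirectional) RNN tagger for
# event detection; sizes and label set are illustrative, not from the thesis.
import torch
import torch.nn as nn

class BiRNNEventTagger(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128, num_labels=34):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs a forward and a backward pass over the
        # sentence and concatenates the two hidden states per token.
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        states, _ = self.rnn(self.embed(token_ids))
        return self.classifier(states)       # (batch, seq_len, num_labels)

# Toy usage: score a batch of two 7-token sentences.
model = BiRNNEventTagger()
logits = model(torch.randint(0, 10000, (2, 7)))
print(logits.shape)  # torch.Size([2, 7, 34])
```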


University of Sydney

2. Windrim, Lloyd. Illumination Invariant Deep Learning for Hyperspectral Data .

Degree: 2018, University of Sydney

 Motivated by the variability in hyperspectral images due to illumination and the difficulty in acquiring labelled data, this thesis proposes different approaches for learning illumination… (more)

Subjects/Keywords: Deep learning

APA (6th Edition):

Windrim, L. (2018). Illumination Invariant Deep Learning for Hyperspectral Data . (Thesis). University of Sydney. Retrieved from http://hdl.handle.net/2123/18734

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Windrim, Lloyd. “Illumination Invariant Deep Learning for Hyperspectral Data .” 2018. Thesis, University of Sydney. Accessed February 27, 2020. http://hdl.handle.net/2123/18734.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Windrim, Lloyd. “Illumination Invariant Deep Learning for Hyperspectral Data .” 2018. Web. 27 Feb 2020.

Vancouver:

Windrim L. Illumination Invariant Deep Learning for Hyperspectral Data . [Internet] [Thesis]. University of Sydney; 2018. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2123/18734.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Windrim L. Illumination Invariant Deep Learning for Hyperspectral Data . [Thesis]. University of Sydney; 2018. Available from: http://hdl.handle.net/2123/18734

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Universidad de Cantabria

3. Noriega Puente, Andrea. Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging.

Degree: Máster en Ciencia de Datos, 2019, Universidad de Cantabria

 ABSTRACT: Glioma is the most common type of brain tumor, presenting varying degrees of malignancy and aggressiveness, as well as a variable prognosis. The great… (more)

Subjects/Keywords: Deep Learning

APA (6th Edition):

Noriega Puente, A. (2019). Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging. (Masters Thesis). Universidad de Cantabria. Retrieved from http://hdl.handle.net/10902/17859

Chicago Manual of Style (16th Edition):

Noriega Puente, Andrea. “Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging.” 2019. Masters Thesis, Universidad de Cantabria. Accessed February 27, 2020. http://hdl.handle.net/10902/17859.

MLA Handbook (7th Edition):

Noriega Puente, Andrea. “Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging.” 2019. Web. 27 Feb 2020.

Vancouver:

Noriega Puente A. Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging. [Internet] [Masters thesis]. Universidad de Cantabria; 2019. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10902/17859.

Council of Science Editors:

Noriega Puente A. Segmentación de gliomas en imagen de resonancia magnética multimodal: Glioma segmentation in multimodal magnetic resonance imaging. [Masters Thesis]. Universidad de Cantabria; 2019. Available from: http://hdl.handle.net/10902/17859


Cornell University

4. Sedra, Daniel. Training Paradigms For Deep Residual Networks .

Degree: 2016, Cornell University

 Convolutional networks are the current state of the art for image tasks. It has long been known that depth is key for increasing their expressive… (more)

Subjects/Keywords: deep learning; machine learning; deep residual networks

APA (6th Edition):

Sedra, D. (2016). Training Paradigms For Deep Residual Networks . (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/44294

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sedra, Daniel. “Training Paradigms For Deep Residual Networks .” 2016. Thesis, Cornell University. Accessed February 27, 2020. http://hdl.handle.net/1813/44294.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sedra, Daniel. “Training Paradigms For Deep Residual Networks .” 2016. Web. 27 Feb 2020.

Vancouver:

Sedra D. Training Paradigms For Deep Residual Networks . [Internet] [Thesis]. Cornell University; 2016. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/1813/44294.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sedra D. Training Paradigms For Deep Residual Networks . [Thesis]. Cornell University; 2016. Available from: http://hdl.handle.net/1813/44294

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
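
Record 4 (Sedra) studies training paradigms for deep residual networks. For context, a minimal residual block of the kind such networks stack, with an identity skip connection added before the final activation, is sketched below; it is a generic illustration, not code from the thesis.

```python
# Generic residual block with an identity skip connection; illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # The skip connection is what makes very deep stacks of these blocks trainable.
        return F.relu(out + x)

block = ResidualBlock(16)
print(block(torch.randn(1, 16, 32, 32)).shape)  # torch.Size([1, 16, 32, 32])
```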


Georgia Tech

5. Choi, Edward. Doctor AI: Interpretable deep learning for modeling electronic health records.

Degree: PhD, Computational Science and Engineering, 2018, Georgia Tech

Deep learning recently has been showing superior performance in complex domains such as computer vision, audio processing and natural language processing compared to traditional statistical… (more)

Subjects/Keywords: Deep learning; Healthcare

APA (6th Edition):

Choi, E. (2018). Doctor AI: Interpretable deep learning for modeling electronic health records. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/60226

Chicago Manual of Style (16th Edition):

Choi, Edward. “Doctor AI: Interpretable deep learning for modeling electronic health records.” 2018. Doctoral Dissertation, Georgia Tech. Accessed February 27, 2020. http://hdl.handle.net/1853/60226.

MLA Handbook (7th Edition):

Choi, Edward. “Doctor AI: Interpretable deep learning for modeling electronic health records.” 2018. Web. 27 Feb 2020.

Vancouver:

Choi E. Doctor AI: Interpretable deep learning for modeling electronic health records. [Internet] [Doctoral dissertation]. Georgia Tech; 2018. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/1853/60226.

Council of Science Editors:

Choi E. Doctor AI: Interpretable deep learning for modeling electronic health records. [Doctoral Dissertation]. Georgia Tech; 2018. Available from: http://hdl.handle.net/1853/60226


University of KwaZulu-Natal

6. Govender, Lishen. Determination of quantum entanglement concurrence using multilayer perceptron neural networks.

Degree: 2017, University of KwaZulu-Natal

 Artificial Neural Networks, inspired by biological neural networks, have seen widespread implementations across all research areas in the past few years. This is partly due to… (more)

Subjects/Keywords: Deep learning.; Machine learning.

APA (6th Edition):

Govender, L. (2017). Determination of quantum entanglement concurrence using multilayer perceptron neural networks. (Thesis). University of KwaZulu-Natal. Retrieved from http://hdl.handle.net/10413/15713

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Govender, Lishen. “Determination of quantum entanglement concurrence using multilayer perceptron neural networks.” 2017. Thesis, University of KwaZulu-Natal. Accessed February 27, 2020. http://hdl.handle.net/10413/15713.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Govender, Lishen. “Determination of quantum entanglement concurrence using multilayer perceptron neural networks.” 2017. Web. 27 Feb 2020.

Vancouver:

Govender L. Determination of quantum entanglement concurrence using multilayer perceptron neural networks. [Internet] [Thesis]. University of KwaZulu-Natal; 2017. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10413/15713.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Govender L. Determination of quantum entanglement concurrence using multilayer perceptron neural networks. [Thesis]. University of KwaZulu-Natal; 2017. Available from: http://hdl.handle.net/10413/15713

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
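
Record 6 trains multilayer perceptrons to estimate entanglement concurrence. Purely as a generic example of a multilayer perceptron regressor for a scalar target in [0, 1] (the range of concurrence), such a model could be set up as below; the input dimensionality and layer sizes are hypothetical, not taken from the thesis.

```python
# Generic multilayer perceptron for scalar regression; sizes are hypothetical.
import torch
import torch.nn as nn

mlp = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),   # concurrence lies in [0, 1]
)
features = torch.randn(8, 16)          # a batch of 8 hypothetical feature vectors
print(mlp(features).shape)             # torch.Size([8, 1])
```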


University of Illinois – Urbana-Champaign

7. Deshpande, Ishan. Generative modeling using the sliced Wasserstein distance.

Degree: MS, Electrical & Computer Engr, 2018, University of Illinois – Urbana-Champaign

 Generative adversarial nets (GANs) are very successful at modeling distributions from given samples, even in the high-dimensional case. However, their formulation is also known to… (more)

Subjects/Keywords: Machine Learning; Deep Learning

APA (6th Edition):

Deshpande, I. (2018). Generative modeling using the sliced Wasserstein distance. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/100951

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Deshpande, Ishan. “Generative modeling using the sliced Wasserstein distance.” 2018. Thesis, University of Illinois – Urbana-Champaign. Accessed February 27, 2020. http://hdl.handle.net/2142/100951.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Deshpande, Ishan. “Generative modeling using the sliced Wasserstein distance.” 2018. Web. 27 Feb 2020.

Vancouver:

Deshpande I. Generative modeling using the sliced Wasserstein distance. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2018. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2142/100951.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Deshpande I. Generative modeling using the sliced Wasserstein distance. [Thesis]. University of Illinois – Urbana-Champaign; 2018. Available from: http://hdl.handle.net/2142/100951

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
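
Record 7 builds generative models on the sliced Wasserstein distance. The distance itself is commonly estimated by projecting both sample sets onto random directions and averaging the one-dimensional Wasserstein distances of the projections; the NumPy sketch below illustrates that estimator and is not the thesis's code.

```python
# Monte-Carlo estimate of the sliced Wasserstein distance between two
# equal-size d-dimensional sample sets; illustrative sketch only.
import numpy as np

def sliced_wasserstein(x, y, num_projections=200, p=2, rng=None):
    rng = np.random.default_rng(rng)
    d = x.shape[1]
    # Random unit directions on the sphere.
    dirs = rng.normal(size=(num_projections, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Project both sample sets onto each direction: shape (n, num_projections).
    px, py = x @ dirs.T, y @ dirs.T
    # In 1-D, the p-Wasserstein distance between equal-size empirical measures
    # reduces to comparing the sorted projections.
    px, py = np.sort(px, axis=0), np.sort(py, axis=0)
    return (np.mean(np.abs(px - py) ** p)) ** (1.0 / p)

x = np.random.randn(512, 2)
y = np.random.randn(512, 2) + 3.0
print(sliced_wasserstein(x, y))   # grows as the two sample clouds move apart
```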


California State University – Sacramento

8. Poosarla, Akshay. Bone age prediction with convolutional neural networks.

Degree: MS, Computer Science, 2019, California State University – Sacramento

 Skeletal bone age assessment is a common clinical practice to analyze and assess the biological maturity of pediatric patients. This process generally involves taking X-ray… (more)

Subjects/Keywords: Machine learning; Deep learning; Boneage

APA (6th Edition):

Poosarla, A. (2019). Bone age prediction with convolutional neural networks. (Masters Thesis). California State University – Sacramento. Retrieved from http://hdl.handle.net/10211.3/207660

Chicago Manual of Style (16th Edition):

Poosarla, Akshay. “Bone age prediction with convolutional neural networks.” 2019. Masters Thesis, California State University – Sacramento. Accessed February 27, 2020. http://hdl.handle.net/10211.3/207660.

MLA Handbook (7th Edition):

Poosarla, Akshay. “Bone age prediction with convolutional neural networks.” 2019. Web. 27 Feb 2020.

Vancouver:

Poosarla A. Bone age prediction with convolutional neural networks. [Internet] [Masters thesis]. California State University – Sacramento; 2019. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10211.3/207660.

Council of Science Editors:

Poosarla A. Bone age prediction with convolutional neural networks. [Masters Thesis]. California State University – Sacramento; 2019. Available from: http://hdl.handle.net/10211.3/207660
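
Record 8 predicts bone age from hand X-rays with convolutional neural networks. Purely as an illustration of CNN-based scalar regression, not the network from the thesis and far smaller than anything practical, such a model can be set up as follows.

```python
# Hypothetical, deliberately tiny CNN that regresses a single value (e.g. bone
# age in months) from a grayscale image; not the architecture from the thesis.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),            # scalar regression output
)
loss_fn = nn.L1Loss()            # mean absolute error, common for age regression

images = torch.randn(4, 1, 256, 256)                 # a fake batch of X-rays
ages = torch.tensor([[120.0], [84.0], [150.0], [36.0]])
loss = loss_fn(model(images), ages)
loss.backward()                  # gradients flow; an optimizer step would follow
print(float(loss))
```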


Cornell University

9. Lenz, Ian. Deep Learning For Robotics .

Degree: 2016, Cornell University

 Robotics faces many unique challenges as robotic platforms move out of the lab and into the real world. In particular, the huge amount of variety… (more)

Subjects/Keywords: Robotics; Machine learning; Deep learning

APA (6th Edition):

Lenz, I. (2016). Deep Learning For Robotics . (Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/44317

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lenz, Ian. “Deep Learning For Robotics .” 2016. Thesis, Cornell University. Accessed February 27, 2020. http://hdl.handle.net/1813/44317.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lenz, Ian. “Deep Learning For Robotics .” 2016. Web. 27 Feb 2020.

Vancouver:

Lenz I. Deep Learning For Robotics . [Internet] [Thesis]. Cornell University; 2016. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/1813/44317.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lenz I. Deep Learning For Robotics . [Thesis]. Cornell University; 2016. Available from: http://hdl.handle.net/1813/44317

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Illinois – Urbana-Champaign

10. Liu, Jialin. Machine learning workflow optimization via automatic discovery of resource reuse opportunities.

Degree: MS, Computer Science, 2019, University of Illinois – Urbana-Champaign

 Many state-of-the-art deep learning models rely on dynamic computation logic, making them difficult to optimize. In this thesis, we present a hashing based algorithm that… (more)

Subjects/Keywords: Machine Learning; Deep Learning; System

APA (6th Edition):

Liu, J. (2019). Machine learning workflow optimization via automatic discovery of resource reuse opportunities. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/104894

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Jialin. “Machine learning workflow optimization via automatic discovery of resource reuse opportunities.” 2019. Thesis, University of Illinois – Urbana-Champaign. Accessed February 27, 2020. http://hdl.handle.net/2142/104894.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Jialin. “Machine learning workflow optimization via automatic discovery of resource reuse opportunities.” 2019. Web. 27 Feb 2020.

Vancouver:

Liu J. Machine learning workflow optimization via automatic discovery of resource reuse opportunities. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2019. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2142/104894.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu J. Machine learning workflow optimization via automatic discovery of resource reuse opportunities. [Thesis]. University of Illinois – Urbana-Champaign; 2019. Available from: http://hdl.handle.net/2142/104894

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Princeton University

11. Ravi, Sachin. Meta-Learning for Data and Processing Efficiency .

Degree: PhD, 2019, Princeton University

Deep learning models have shown great success in a variety of machine learning benchmarks; however, these models still lack the efficiency and flexibility of humans.… (more)

Subjects/Keywords: Deep Learning; Meta-Learning

APA (6th Edition):

Ravi, S. (2019). Meta-Learning for Data and Processing Efficiency . (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp013j333513x

Chicago Manual of Style (16th Edition):

Ravi, Sachin. “Meta-Learning for Data and Processing Efficiency .” 2019. Doctoral Dissertation, Princeton University. Accessed February 27, 2020. http://arks.princeton.edu/ark:/88435/dsp013j333513x.

MLA Handbook (7th Edition):

Ravi, Sachin. “Meta-Learning for Data and Processing Efficiency .” 2019. Web. 27 Feb 2020.

Vancouver:

Ravi S. Meta-Learning for Data and Processing Efficiency . [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2020 Feb 27]. Available from: http://arks.princeton.edu/ark:/88435/dsp013j333513x.

Council of Science Editors:

Ravi S. Meta-Learning for Data and Processing Efficiency . [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp013j333513x


Australian National University

12. Dong, Cong. Spatial Deep Networks for Outdoor Scene Classification .

Degree: 2015, Australian National University

 Scene classification has become an increasingly popular topic in computer vision. The techniques for scene classification can be widely used in many other aspects, such… (more)

Subjects/Keywords: Deep Learning; Scene Classification; Spatial Deep Networks

APA (6th Edition):

Dong, C. (2015). Spatial Deep Networks for Outdoor Scene Classification . (Thesis). Australian National University. Retrieved from http://hdl.handle.net/1885/101712

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Dong, Cong. “Spatial Deep Networks for Outdoor Scene Classification .” 2015. Thesis, Australian National University. Accessed February 27, 2020. http://hdl.handle.net/1885/101712.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Dong, Cong. “Spatial Deep Networks for Outdoor Scene Classification .” 2015. Web. 27 Feb 2020.

Vancouver:

Dong C. Spatial Deep Networks for Outdoor Scene Classification . [Internet] [Thesis]. Australian National University; 2015. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/1885/101712.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Dong C. Spatial Deep Networks for Outdoor Scene Classification . [Thesis]. Australian National University; 2015. Available from: http://hdl.handle.net/1885/101712

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Guelph

13. Im, Jiwoong. Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems .

Degree: 2015, University of Guelph

 The objective of this thesis is to take the dynamical systems approach to understand the unsupervised learning models and learning algorithms. Gated auto-encoders (GAEs) are… (more)

Subjects/Keywords: Machine learning; Deep Learning; unsupervised learning

APA (6th Edition):

Im, J. (2015). Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems . (Thesis). University of Guelph. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/8809

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Im, Jiwoong. “Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems .” 2015. Thesis, University of Guelph. Accessed February 27, 2020. https://atrium.lib.uoguelph.ca/xmlui/handle/10214/8809.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Im, Jiwoong. “Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems .” 2015. Web. 27 Feb 2020.

Vancouver:

Im J. Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems . [Internet] [Thesis]. University of Guelph; 2015. [cited 2020 Feb 27]. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/8809.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Im J. Analyzing Unsupervised Representation Learning Models Under the View of Dynamical Systems . [Thesis]. University of Guelph; 2015. Available from: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/8809

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Waterloo

14. Gaurav, Ashish. Safety-Oriented Stability Biases for Continual Learning.

Degree: 2020, University of Waterloo

 Continual learning is often confounded by “catastrophic forgetting” that prevents neural networks from learning tasks sequentially. In the case of real world classification systems that… (more)

Subjects/Keywords: deep learning; continual learning; classification; reinforcement learning

APA (6th Edition):

Gaurav, A. (2020). Safety-Oriented Stability Biases for Continual Learning. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/15579

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gaurav, Ashish. “Safety-Oriented Stability Biases for Continual Learning.” 2020. Thesis, University of Waterloo. Accessed February 27, 2020. http://hdl.handle.net/10012/15579.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gaurav, Ashish. “Safety-Oriented Stability Biases for Continual Learning.” 2020. Web. 27 Feb 2020.

Vancouver:

Gaurav A. Safety-Oriented Stability Biases for Continual Learning. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10012/15579.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gaurav A. Safety-Oriented Stability Biases for Continual Learning. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/15579

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
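
Record 14 studies stability biases against catastrophic forgetting in continual learning. One of the simplest stability biases in the literature, shown here only as a generic example rather than as the thesis's method, is a quadratic penalty that keeps parameters close to a snapshot taken after the previous task.

```python
# Generic quadratic "stay close to the old weights" stability bias against
# catastrophic forgetting; illustrative only, not the thesis's method.
import torch
import torch.nn as nn

def stability_penalty(model, old_params, strength=1.0):
    """Quadratic pull toward a snapshot of the previous task's parameters."""
    penalty = torch.zeros(())
    for p, p_old in zip(model.parameters(), old_params):
        penalty = penalty + ((p - p_old) ** 2).sum()
    return strength * penalty

model = nn.Linear(4, 2)
old_params = [p.detach().clone() for p in model.parameters()]  # snapshot after task 1
# While training on task 2, add the bias to the ordinary task loss:
task_loss = model(torch.randn(8, 4)).pow(2).mean()             # stand-in for a real loss
loss = task_loss + 0.1 * stability_penalty(model, old_params)
loss.backward()
print(float(loss))
```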


NSYSU

15. Lin, Kun-da. Deep Reinforcement Learning with a Gating Network.

Degree: Master, Electrical Engineering, 2017, NSYSU

 Reinforcement Learning (RL) is a good way to train a robot since it doesn't need an exact model of the environment. All that is needed is to… (more)

Subjects/Keywords: Reinforcement Learning; Deep Reinforcement Learning; Deep Learning; Gating network; Neural network

APA (6th Edition):

Lin, K. (2017). Deep Reinforcement Learning with a Gating Network. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0223117-131536

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lin, Kun-da. “Deep Reinforcement Learning with a Gating Network.” 2017. Thesis, NSYSU. Accessed February 27, 2020. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0223117-131536.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lin, Kun-da. “Deep Reinforcement Learning with a Gating Network.” 2017. Web. 27 Feb 2020.

Vancouver:

Lin K. Deep Reinforcement Learning with a Gating Network. [Internet] [Thesis]. NSYSU; 2017. [cited 2020 Feb 27]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0223117-131536.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lin K. Deep Reinforcement Learning with a Gating Network. [Thesis]. NSYSU; 2017. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0223117-131536

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
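
Record 15 applies deep reinforcement learning with a gating network. As background only (the gating mechanism is not shown), the one-step temporal-difference target that a deep Q-network is regressed onto can be written in a few lines.

```python
# One-step temporal-difference target used to train a deep Q-network;
# background illustration only, not the gated architecture from the thesis.
import torch

def td_target(reward, next_q_values, done, gamma=0.99):
    """r + gamma * max_a' Q(s', a'), with bootstrapping cut off at episode end."""
    return reward + gamma * next_q_values.max(dim=1).values * (1.0 - done)

reward = torch.tensor([1.0, 0.0])
next_q = torch.tensor([[0.2, 0.5, 0.1], [0.0, -0.3, 0.4]])
done = torch.tensor([0.0, 1.0])                  # second transition ends the episode
print(td_target(reward, next_q, done))           # tensor([1.4950, 0.0000])
```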


University of Illinois – Urbana-Champaign

16. Liang, Shiyu. Why deep neural networks for function approximation.

Degree: MS, Electrical & Computer Engr, 2017, University of Illinois – Urbana-Champaign

 Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that, for a large class of… (more)

Subjects/Keywords: Neural networks; Deep learning

APA (6th Edition):

Liang, S. (2017). Why deep neural networks for function approximation. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/99417

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liang, Shiyu. “Why deep neural networks for function approximation.” 2017. Thesis, University of Illinois – Urbana-Champaign. Accessed February 27, 2020. http://hdl.handle.net/2142/99417.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liang, Shiyu. “Why deep neural networks for function approximation.” 2017. Web. 27 Feb 2020.

Vancouver:

Liang S. Why deep neural networks for function approximation. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2017. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2142/99417.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liang S. Why deep neural networks for function approximation. [Thesis]. University of Illinois – Urbana-Champaign; 2017. Available from: http://hdl.handle.net/2142/99417

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Illinois – Urbana-Champaign

17. Yeh, Raymond Alexander. Stable and symmetric convolutional neural network.

Degree: MS, Electrical & Computer Engr, 2016, University of Illinois – Urbana-Champaign

 First we present a proof that convolutional neural networks (CNNs) with max-norm regularization, max-pooling, and Relu non-linearity are stable to additive noise. Second, we explore… (more)

Subjects/Keywords: convolutional neural network; deep learning

APA (6th Edition):

Yeh, R. A. (2016). Stable and symmetric convolutional neural network. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/92687

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Yeh, Raymond Alexander. “Stable and symmetric convolutional neural network.” 2016. Thesis, University of Illinois – Urbana-Champaign. Accessed February 27, 2020. http://hdl.handle.net/2142/92687.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Yeh, Raymond Alexander. “Stable and symmetric convolutional neural network.” 2016. Web. 27 Feb 2020.

Vancouver:

Yeh RA. Stable and symmetric convolutional neural network. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2016. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2142/92687.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Yeh RA. Stable and symmetric convolutional neural network. [Thesis]. University of Illinois – Urbana-Champaign; 2016. Available from: http://hdl.handle.net/2142/92687

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rochester Institute of Technology

18. Petroski Such, Felipe. Deep Learning Architectures for Novel Problems.

Degree: MS, Computer Engineering, 2017, Rochester Institute of Technology

  With convolutional neural networks revolutionizing the computer vision field it is important to extend the capabilities of neural-based systems to dynamic and unrestricted data… (more)

Subjects/Keywords: Deep learning; ICR; Machine intelligence

APA (6th Edition):

Petroski Such, F. (2017). Deep Learning Architectures for Novel Problems. (Masters Thesis). Rochester Institute of Technology. Retrieved from https://scholarworks.rit.edu/theses/9611

Chicago Manual of Style (16th Edition):

Petroski Such, Felipe. “Deep Learning Architectures for Novel Problems.” 2017. Masters Thesis, Rochester Institute of Technology. Accessed February 27, 2020. https://scholarworks.rit.edu/theses/9611.

MLA Handbook (7th Edition):

Petroski Such, Felipe. “Deep Learning Architectures for Novel Problems.” 2017. Web. 27 Feb 2020.

Vancouver:

Petroski Such F. Deep Learning Architectures for Novel Problems. [Internet] [Masters thesis]. Rochester Institute of Technology; 2017. [cited 2020 Feb 27]. Available from: https://scholarworks.rit.edu/theses/9611.

Council of Science Editors:

Petroski Such F. Deep Learning Architectures for Novel Problems. [Masters Thesis]. Rochester Institute of Technology; 2017. Available from: https://scholarworks.rit.edu/theses/9611


Rochester Institute of Technology

19. Lamos-Sweeney, Joshua. Deep learning using genetic algorithms.

Degree: Computer Science (GCCIS), 2012, Rochester Institute of Technology

Deep Learning networks are a new type of neural network that discovers important object features. These networks determine features without supervision, and are adept at… (more)

Subjects/Keywords: Deep learning; Genetic algorithms

APA (6th Edition):

Lamos-Sweeney, J. (2012). Deep learning using genetic algorithms. (Thesis). Rochester Institute of Technology. Retrieved from https://scholarworks.rit.edu/theses/254

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lamos-Sweeney, Joshua. “Deep learning using genetic algorithms.” 2012. Thesis, Rochester Institute of Technology. Accessed February 27, 2020. https://scholarworks.rit.edu/theses/254.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lamos-Sweeney, Joshua. “Deep learning using genetic algorithms.” 2012. Web. 27 Feb 2020.

Vancouver:

Lamos-Sweeney J. Deep learning using genetic algorithms. [Internet] [Thesis]. Rochester Institute of Technology; 2012. [cited 2020 Feb 27]. Available from: https://scholarworks.rit.edu/theses/254.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lamos-Sweeney J. Deep learning using genetic algorithms. [Thesis]. Rochester Institute of Technology; 2012. Available from: https://scholarworks.rit.edu/theses/254

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Penn State University

20. Lageman, Nathaniel John. BinDNN: Resilient Function Matching Using Deep Learning.

Degree: MS, Computer Science and Engineering, 2016, Penn State University

 Determining if two functions taken from different compiled binaries originate from the same function in the source code has many applications to malware reverse engineering.… (more)

Subjects/Keywords: reverse engineering; malware; deep learning

APA (6th Edition):

Lageman, N. J. (2016). BinDNN: Resilient Function Matching Using Deep Learning. (Masters Thesis). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/12477njl5114

Chicago Manual of Style (16th Edition):

Lageman, Nathaniel John. “BinDNN: Resilient Function Matching Using Deep Learning.” 2016. Masters Thesis, Penn State University. Accessed February 27, 2020. https://etda.libraries.psu.edu/catalog/12477njl5114.

MLA Handbook (7th Edition):

Lageman, Nathaniel John. “BinDNN: Resilient Function Matching Using Deep Learning.” 2016. Web. 27 Feb 2020.

Vancouver:

Lageman NJ. BinDNN: Resilient Function Matching Using Deep Learning. [Internet] [Masters thesis]. Penn State University; 2016. [cited 2020 Feb 27]. Available from: https://etda.libraries.psu.edu/catalog/12477njl5114.

Council of Science Editors:

Lageman NJ. BinDNN: Resilient Function Matching Using Deep Learning. [Masters Thesis]. Penn State University; 2016. Available from: https://etda.libraries.psu.edu/catalog/12477njl5114


Penn State University

21. Papernot, Nicolas. On The Integrity Of Deep Learning Systems In Adversarial Settings.

Degree: MS, Computer Science and Engineering, 2016, Penn State University

Deep learning takes advantage of large datasets and computationally efficient training algorithms to outperform other approaches at various machine learning tasks. However, imperfections in the… (more)

Subjects/Keywords: computer security; deep learning

APA (6th Edition):

Papernot, N. (2016). On The Integrity Of Deep Learning Systems In Adversarial Settings. (Masters Thesis). Penn State University. Retrieved from https://etda.libraries.psu.edu/catalog/28680

Chicago Manual of Style (16th Edition):

Papernot, Nicolas. “On The Integrity Of Deep Learning Systems In Adversarial Settings.” 2016. Masters Thesis, Penn State University. Accessed February 27, 2020. https://etda.libraries.psu.edu/catalog/28680.

MLA Handbook (7th Edition):

Papernot, Nicolas. “On The Integrity Of Deep Learning Systems In Adversarial Settings.” 2016. Web. 27 Feb 2020.

Vancouver:

Papernot N. On The Integrity Of Deep Learning Systems In Adversarial Settings. [Internet] [Masters thesis]. Penn State University; 2016. [cited 2020 Feb 27]. Available from: https://etda.libraries.psu.edu/catalog/28680.

Council of Science Editors:

Papernot N. On The Integrity Of Deep Learning Systems In Adversarial Settings. [Masters Thesis]. Penn State University; 2016. Available from: https://etda.libraries.psu.edu/catalog/28680
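
Record 21 examines the integrity of deep learning systems under adversarial inputs. For context, the classic fast-gradient-sign way of crafting an adversarial perturbation, a standard baseline in this literature and not necessarily the attack developed in the thesis, takes only a few lines.

```python
# Fast-gradient-sign adversarial perturbation; a standard baseline attack,
# shown for context rather than as the thesis's method.
import torch
import torch.nn as nn
import torch.nn.functional as F

def fgsm(model, x, label, eps=0.03):
    """Perturb x by eps in the sign of the loss gradient to increase the loss."""
    x = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x), label).backward()
    return (x + eps * x.grad.sign()).detach()

model = nn.Linear(10, 3)                 # stand-in classifier
x = torch.randn(5, 10)
label = torch.randint(0, 3, (5,))
x_adv = fgsm(model, x, label)
print((x_adv - x).abs().max())           # every input moved by at most eps
```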


University of Manchester

22. Salman, Ahmad. Learning speaker-specific characteristics with deep neural architecture.

Degree: PhD, 2012, University of Manchester

 Robust Speaker Recognition (SR) has been a focus of attention for researchers since long. The advancement in speech-aided technologies especially biometrics highlights the necessity of… (more)

Subjects/Keywords: 006.4; Speaker Recognition; Deep Learning

APA (6th Edition):

Salman, A. (2012). Learning speaker-specific characteristics with deep neural architecture. (Doctoral Dissertation). University of Manchester. Retrieved from https://www.research.manchester.ac.uk/portal/en/theses/learning-speakerspecific-characteristics-with-deep-neural-architecture(24acb31d-2106-4e52-80ab-6c649838026a).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.558060

Chicago Manual of Style (16th Edition):

Salman, Ahmad. “Learning speaker-specific characteristics with deep neural architecture.” 2012. Doctoral Dissertation, University of Manchester. Accessed February 27, 2020. https://www.research.manchester.ac.uk/portal/en/theses/learning-speakerspecific-characteristics-with-deep-neural-architecture(24acb31d-2106-4e52-80ab-6c649838026a).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.558060.

MLA Handbook (7th Edition):

Salman, Ahmad. “Learning speaker-specific characteristics with deep neural architecture.” 2012. Web. 27 Feb 2020.

Vancouver:

Salman A. Learning speaker-specific characteristics with deep neural architecture. [Internet] [Doctoral dissertation]. University of Manchester; 2012. [cited 2020 Feb 27]. Available from: https://www.research.manchester.ac.uk/portal/en/theses/learning-speakerspecific-characteristics-with-deep-neural-architecture(24acb31d-2106-4e52-80ab-6c649838026a).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.558060.

Council of Science Editors:

Salman A. Learning speaker-specific characteristics with deep neural architecture. [Doctoral Dissertation]. University of Manchester; 2012. Available from: https://www.research.manchester.ac.uk/portal/en/theses/learning-speakerspecific-characteristics-with-deep-neural-architecture(24acb31d-2106-4e52-80ab-6c649838026a).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.558060


University of Otago

23. Szymanski, Lech. Deep architectures and classification by intermediary transformations .

Degree: 2012, University of Otago

 With the development of deep belief nets, the empirical evidence supporting a link between deep architecture neural networks and generalisation with respect to classification has… (more)

Subjects/Keywords: Machine learning; Classification; Deep architectures

APA (6th Edition):

Szymanski, L. (2012). Deep architectures and classification by intermediary transformations . (Doctoral Dissertation). University of Otago. Retrieved from http://hdl.handle.net/10523/2129

Chicago Manual of Style (16th Edition):

Szymanski, Lech. “Deep architectures and classification by intermediary transformations .” 2012. Doctoral Dissertation, University of Otago. Accessed February 27, 2020. http://hdl.handle.net/10523/2129.

MLA Handbook (7th Edition):

Szymanski, Lech. “Deep architectures and classification by intermediary transformations .” 2012. Web. 27 Feb 2020.

Vancouver:

Szymanski L. Deep architectures and classification by intermediary transformations . [Internet] [Doctoral dissertation]. University of Otago; 2012. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10523/2129.

Council of Science Editors:

Szymanski L. Deep architectures and classification by intermediary transformations . [Doctoral Dissertation]. University of Otago; 2012. Available from: http://hdl.handle.net/10523/2129


University of Illinois – Urbana-Champaign

24. Wang, Zhangyang. Task-specific and interpretable feature learning.

Degree: PhD, Electrical & Computer Engr, 2016, University of Illinois – Urbana-Champaign

Deep learning models have had tremendous impacts in recent years, while a question has been raised by many: Is deep learning just a triumph of… (more)

Subjects/Keywords: deep learning; sparse representation

APA (6th Edition):

Wang, Z. (2016). Task-specific and interpretable feature learning. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/95560

Chicago Manual of Style (16th Edition):

Wang, Zhangyang. “Task-specific and interpretable feature learning.” 2016. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed February 27, 2020. http://hdl.handle.net/2142/95560.

MLA Handbook (7th Edition):

Wang, Zhangyang. “Task-specific and interpretable feature learning.” 2016. Web. 27 Feb 2020.

Vancouver:

Wang Z. Task-specific and interpretable feature learning. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2016. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2142/95560.

Council of Science Editors:

Wang Z. Task-specific and interpretable feature learning. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2016. Available from: http://hdl.handle.net/2142/95560

25. Furusho, Yasutaka. Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ.

Degree: Nara Institute of Science and Technology / 奈良先端科学技術大学院大学

Subjects/Keywords: deep learning

APA (6th Edition):

Furusho, Y. (n.d.). Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ. (Thesis). Nara Institute of Science and Technology / 奈良先端科学技術大学院大学. Retrieved from http://hdl.handle.net/10061/11622

Note: this citation may be lacking information needed for this citation format:
No year of publication.
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Furusho, Yasutaka. “Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ.” Thesis, Nara Institute of Science and Technology / 奈良先端科学技術大学院大学. Accessed February 27, 2020. http://hdl.handle.net/10061/11622.

Note: this citation may be lacking information needed for this citation format:
No year of publication.
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Furusho, Yasutaka. “Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ.” Web. 27 Feb 2020.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Vancouver:

Furusho Y. Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ. [Internet] [Thesis]. Nara Institute of Science and Technology / 奈良先端科学技術大学院大学; [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10061/11622.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
No year of publication.

Council of Science Editors:

Furusho Y. Roles of Pre-training in Deep Neural Networks from Information Theoretical Perspective : Pre-trainingがニューラルネットワークに与える影響の情報理論的解析; Pre-training ガ ニューラル ネットワーク ニ アタエル エイキョウ ノ ジョウホウ リロンテキ カイセキ. [Thesis]. Nara Institute of Science and Technology / 奈良先端科学技術大学院大学; Available from: http://hdl.handle.net/10061/11622

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
No year of publication.


McMaster University

26. Liu, Zheng. Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing.

Degree: MASc, 2018, McMaster University

Haze and smog are among the most common environmental factors impacting image quality and, therefore, image analysis. In this paper, I propose an end-to-end generative… (more)

Subjects/Keywords: image dehazing; deep learning

APA (6th Edition):

Liu, Z. (2018). Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/23979

Chicago Manual of Style (16th Edition):

Liu, Zheng. “Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing.” 2018. Masters Thesis, McMaster University. Accessed February 27, 2020. http://hdl.handle.net/11375/23979.

MLA Handbook (7th Edition):

Liu, Zheng. “Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing.” 2018. Web. 27 Feb 2020.

Vancouver:

Liu Z. Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing. [Internet] [Masters thesis]. McMaster University; 2018. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/11375/23979.

Council of Science Editors:

Liu Z. Generic Model-Agnostic Convolutional Neural Networks for Single Image Dehazing. [Masters Thesis]. McMaster University; 2018. Available from: http://hdl.handle.net/11375/23979
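
Record 26 addresses single-image dehazing. Work in this area generally builds on the atmospheric scattering model, in which a hazy pixel is a blend of the true scene radiance and a global airlight weighted by the transmission; the thesis abstract does not spell out its formulation, so the sketch below simply applies and inverts that standard model.

```python
# Standard atmospheric scattering model used throughout the dehazing literature:
#   I(x) = J(x) * t(x) + A * (1 - t(x))
# Given estimates of the transmission t and airlight A, the scene radiance J
# can be recovered by inverting the model.
import numpy as np

def add_haze(J, t, A):
    """Synthesize a hazy image from clear radiance J, transmission t, airlight A."""
    return J * t + A * (1.0 - t)

def dehaze(I, t, A, t_min=0.1):
    """Invert the scattering model; t is clipped to avoid amplifying noise."""
    return (I - A) / np.maximum(t, t_min) + A

J = np.random.rand(64, 64, 3)              # a fake clear image in [0, 1]
t = np.full((64, 64, 1), 0.6)              # uniform transmission map
A = 0.9                                     # bright airlight
I = add_haze(J, t, A)
print(np.allclose(dehaze(I, t, A), J))      # True: exact inversion with known t, A
```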


University of Adelaide

27. Li, Teng. Deep learning for fine-grained visual recognition.

Degree: 2017, University of Adelaide

 Fine-grained object recognition is an important task in computer vision. The cross-convolutional-layer pooling method is one of the significant milestones in the development of this… (more)

Subjects/Keywords: deep learning; fine-grained; recognition

APA (6th Edition):

Li, T. (2017). Deep learning for fine-grained visual recognition. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/106421

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Teng. “Deep learning for fine-grained visual recognition.” 2017. Thesis, University of Adelaide. Accessed February 27, 2020. http://hdl.handle.net/2440/106421.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Teng. “Deep learning for fine-grained visual recognition.” 2017. Web. 27 Feb 2020.

Vancouver:

Li T. Deep learning for fine-grained visual recognition. [Internet] [Thesis]. University of Adelaide; 2017. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/2440/106421.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li T. Deep learning for fine-grained visual recognition. [Thesis]. University of Adelaide; 2017. Available from: http://hdl.handle.net/2440/106421

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


McMaster University

28. Chi, Zhixiang. IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES.

Degree: MASc, 2018, McMaster University

Conventional methods for solving image restoration problems are typically built on an image degradation model and on some priors of the latent image. The model… (more)

Subjects/Keywords: Image restoration; Deep learning

APA (6th Edition):

Chi, Z. (2018). IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/24290

Chicago Manual of Style (16th Edition):

Chi, Zhixiang. “IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES.” 2018. Masters Thesis, McMaster University. Accessed February 27, 2020. http://hdl.handle.net/11375/24290.

MLA Handbook (7th Edition):

Chi, Zhixiang. “IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES.” 2018. Web. 27 Feb 2020.

Vancouver:

Chi Z. IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES. [Internet] [Masters thesis]. McMaster University; 2018. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/11375/24290.

Council of Science Editors:

Chi Z. IMAGE RESTORATIONS USING DEEP LEARNING TECHNIQUES. [Masters Thesis]. McMaster University; 2018. Available from: http://hdl.handle.net/11375/24290


University of Waterloo

29. Tang, Yichuan. Robust Visual Recognition Using Multilayer Generative Neural Networks.

Degree: 2010, University of Waterloo

Deep generative neural networks such as the Deep Belief Network and Deep Boltzmann Machines have been used successfully to model high dimensional visual data. However,… (more)

Subjects/Keywords: Neural Networks; Deep Learning

APA (6th Edition):

Tang, Y. (2010). Robust Visual Recognition Using Multilayer Generative Neural Networks. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/5376

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tang, Yichuan. “Robust Visual Recognition Using Multilayer Generative Neural Networks.” 2010. Thesis, University of Waterloo. Accessed February 27, 2020. http://hdl.handle.net/10012/5376.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tang, Yichuan. “Robust Visual Recognition Using Multilayer Generative Neural Networks.” 2010. Web. 27 Feb 2020.

Vancouver:

Tang Y. Robust Visual Recognition Using Multilayer Generative Neural Networks. [Internet] [Thesis]. University of Waterloo; 2010. [cited 2020 Feb 27]. Available from: http://hdl.handle.net/10012/5376.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tang Y. Robust Visual Recognition Using Multilayer Generative Neural Networks. [Thesis]. University of Waterloo; 2010. Available from: http://hdl.handle.net/10012/5376

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


UCLA

30. JI, NAN. Incomplete Image Filling by Popular Deep Learning Methods.

Degree: Statistics, 2019, UCLA

 The incomplete image filling task, often known as the image inpainting task, is a popular topic in the applied deep learning field. This thesis paper considers… (more)

Subjects/Keywords: Statistics; Deep Learning; Image Inpainting

APA (6th Edition):

JI, N. (2019). Incomplete Image Filling by Popular Deep Learning Methods. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/1tz4d61x

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

JI, NAN. “Incomplete Image Filling by Popular Deep Learning Methods.” 2019. Thesis, UCLA. Accessed February 27, 2020. http://www.escholarship.org/uc/item/1tz4d61x.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

JI, NAN. “Incomplete Image Filling by Popular Deep Learning Methods.” 2019. Web. 27 Feb 2020.

Vancouver:

JI N. Incomplete Image Filling by Popular Deep Learning Methods. [Internet] [Thesis]. UCLA; 2019. [cited 2020 Feb 27]. Available from: http://www.escholarship.org/uc/item/1tz4d61x.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

JI N. Incomplete Image Filling by Popular Deep Learning Methods. [Thesis]. UCLA; 2019. Available from: http://www.escholarship.org/uc/item/1tz4d61x

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
