Advanced search options


You searched for subject:(Recurrent Neural Networks). Showing records 1 – 30 of 230 total matches.



The Ohio State University

1. Shao, Yuanlong. Learning Sparse Recurrent Neural Networks in Language Modeling.

Degree: MS, Computer Science and Engineering, 2014, The Ohio State University

In the context of statistical language modeling, we explored the task of learning an Elman network with sparse weight matrices, as a pilot study towards…

Subjects/Keywords: Computer Science; Artificial Intelligence; language modeling; recurrent neural networks; sparse recurrent neural networks


APA (6th Edition):

Shao, Y. (2014). Learning Sparse Recurrent Neural Networks in Language Modeling. (Masters Thesis). The Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373

Chicago Manual of Style (16th Edition):

Shao, Yuanlong. “Learning Sparse Recurrent Neural Networks in Language Modeling.” 2014. Masters Thesis, The Ohio State University. Accessed January 27, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373.

MLA Handbook (7th Edition):

Shao, Yuanlong. “Learning Sparse Recurrent Neural Networks in Language Modeling.” 2014. Web. 27 Jan 2021.

Vancouver:

Shao Y. Learning Sparse Recurrent Neural Networks in Language Modeling. [Internet] [Masters thesis]. The Ohio State University; 2014. [cited 2021 Jan 27]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373.

Council of Science Editors:

Shao Y. Learning Sparse Recurrent Neural Networks in Language Modeling. [Masters Thesis]. The Ohio State University; 2014. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373


NSYSU

2. Wang, Hao-Yi. The Impacts of Image Contexts on Dialogue Systems.

Degree: Master, Information Management, 2018, NSYSU

Chatting with machines is not only possible but also more and more common in our lives these days. With this approach, we can execute commands…

Subjects/Keywords: Dialogue; Convolutional neural networks; Recurrent neural networks; Image recognition; Natural language; Neural networks; Machine learning


APA (6th Edition):

Wang, H. (2018). The Impacts of Image Contexts on Dialogue Systems. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Hao-Yi. “The Impacts of Image Contexts on Dialogue Systems.” 2018. Thesis, NSYSU. Accessed January 27, 2021. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Hao-Yi. “The Impacts of Image Contexts on Dialogue Systems.” 2018. Web. 27 Jan 2021.

Vancouver:

Wang H. The Impacts of Image Contexts on Dialogue Systems. [Internet] [Thesis]. NSYSU; 2018. [cited 2021 Jan 27]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang H. The Impacts of Image Contexts on Dialogue Systems. [Thesis]. NSYSU; 2018. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Carnegie Mellon University

3. Le, Ngan Thi Hoang. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.

Degree: 2018, Carnegie Mellon University

Semantic labeling is becoming more and more popular among researchers in computer vision and machine learning. Many applications, such as autonomous driving, tracking, indoor navigation,…

Subjects/Keywords: Gated Recurrent Unit; Level Set; Recurrent Neural Networks; Residual Network; Scene Labeling; Semantic Instance Segmentation


APA (6th Edition):

Le, N. T. H. (2018). Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Thesis, Carnegie Mellon University. Accessed January 27, 2021. http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Web. 27 Jan 2021.

Vancouver:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Internet] [Thesis]. Carnegie Mellon University; 2018. [cited 2021 Jan 27]. Available from: http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Thesis]. Carnegie Mellon University; 2018. Available from: http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

4. Li, Yunpeng. Word Representation Using A Deep Neural Network.

Degree: 2016, University of Toronto

Word representation or word embedding is an important step in understanding languages. It maps similar words to vectors that are close in space. In this…

Subjects/Keywords: Morphology; Natural Language Processing; Recurrent Neural Networks; Recursive Neural Networks; Word Embedding; 0800


APA (6th Edition):

Li, Y. (2016). Word Representation Using A Deep Neural Network. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/75155

Chicago Manual of Style (16th Edition):

Li, Yunpeng. “Word Representation Using A Deep Neural Network.” 2016. Masters Thesis, University of Toronto. Accessed January 27, 2021. http://hdl.handle.net/1807/75155.

MLA Handbook (7th Edition):

Li, Yunpeng. “Word Representation Using A Deep Neural Network.” 2016. Web. 27 Jan 2021.

Vancouver:

Li Y. Word Representation Using A Deep Neural Network. [Internet] [Masters thesis]. University of Toronto; 2016. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1807/75155.

Council of Science Editors:

Li Y. Word Representation Using A Deep Neural Network. [Masters Thesis]. University of Toronto; 2016. Available from: http://hdl.handle.net/1807/75155


University of California – San Diego

5. Tripathi, Subarna. Improving Object Detection and Segmentation by Utilizing Context.

Degree: Electrical Engineering (Signal and Image Proc), 2018, University of California – San Diego

Object detection and segmentation are important computer vision problems that have applications in several domains such as autonomous driving, virtual and augmented reality systems, human-computer…

Subjects/Keywords: Computer science; Convolutional Neural Networks; Deep Learning; Object Detection; Recurrent Neural Networks; Segmentation; Video Processing


APA (6th Edition):

Tripathi, S. (2018). Improving Object Detection and Segmentation by Utilizing Context. (Thesis). University of California – San Diego. Retrieved from http://www.escholarship.org/uc/item/5955t4nq

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tripathi, Subarna. “Improving Object Detection and Segmentation by Utilizing Context.” 2018. Thesis, University of California – San Diego. Accessed January 27, 2021. http://www.escholarship.org/uc/item/5955t4nq.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tripathi, Subarna. “Improving Object Detection and Segmentation by Utilizing Context.” 2018. Web. 27 Jan 2021.

Vancouver:

Tripathi S. Improving Object Detection and Segmentation by Utilizing Context. [Internet] [Thesis]. University of California – San Diego; 2018. [cited 2021 Jan 27]. Available from: http://www.escholarship.org/uc/item/5955t4nq.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tripathi S. Improving Object Detection and Segmentation by Utilizing Context. [Thesis]. University of California – San Diego; 2018. Available from: http://www.escholarship.org/uc/item/5955t4nq

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Colorado

6. Maierhofer, Jeffrey Matthew. Lifetime Limited Memory Neural Networks.

Degree: MS, 2019, University of Colorado

In the modern digital environment, many data sources can be characterized as event sequences. These event sequences describe a series of events and an…

Subjects/Keywords: events; neural networks; prediction; recurrent neural networks; Applied Mathematics; Computer Sciences; Statistics and Probability


APA (6th Edition):

Maierhofer, J. M. (2019). Lifetime Limited Memory Neural Networks. (Masters Thesis). University of Colorado. Retrieved from https://scholar.colorado.edu/appm_gradetds/149

Chicago Manual of Style (16th Edition):

Maierhofer, Jeffrey Matthew. “Lifetime Limited Memory Neural Networks.” 2019. Masters Thesis, University of Colorado. Accessed January 27, 2021. https://scholar.colorado.edu/appm_gradetds/149.

MLA Handbook (7th Edition):

Maierhofer, Jeffrey Matthew. “Lifetime Limited Memory Neural Networks.” 2019. Web. 27 Jan 2021.

Vancouver:

Maierhofer JM. Lifetime Limited Memory Neural Networks. [Internet] [Masters thesis]. University of Colorado; 2019. [cited 2021 Jan 27]. Available from: https://scholar.colorado.edu/appm_gradetds/149.

Council of Science Editors:

Maierhofer JM. Lifetime Limited Memory Neural Networks. [Masters Thesis]. University of Colorado; 2019. Available from: https://scholar.colorado.edu/appm_gradetds/149


KTH

7. Lustig, Joakim. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.

Degree: Computer Science and Communication (CSC), 2016, KTH

Dyslexia affects between 5-17% of all school children, making it the most common learning disability. It has been found to severely affect learning ability in…

Subjects/Keywords: dyslexia; machine learning; neural networks; recurrent neural networks; Computer Sciences; Datavetenskap (datalogi)


APA (6th Edition):

Lustig, J. (2016). Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lustig, Joakim. “Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.” 2016. Thesis, KTH. Accessed January 27, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lustig, Joakim. “Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.” 2016. Web. 27 Jan 2021.

Vancouver:

Lustig J. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. [Internet] [Thesis]. KTH; 2016. [cited 2021 Jan 27]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lustig J. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. [Thesis]. KTH; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


IUPUI

8. Raptis, Konstantinos. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.

Degree: 2016, IUPUI

Indiana University-Purdue University Indianapolis (IUPUI)

Action recognition has been an active research topic for over three decades. There are various applications of action recognition, such…

Subjects/Keywords: Action Recognition; Dense Trajectories; R-CNN; LSTM RNN; Convolution Neural Networks; Recurrent Neural Networks


APA (6th Edition):

Raptis, K. (2016). The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. (Thesis). IUPUI. Retrieved from http://hdl.handle.net/1805/11827

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Raptis, Konstantinos. “The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.” 2016. Thesis, IUPUI. Accessed January 27, 2021. http://hdl.handle.net/1805/11827.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Raptis, Konstantinos. “The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.” 2016. Web. 27 Jan 2021.

Vancouver:

Raptis K. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. [Internet] [Thesis]. IUPUI; 2016. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1805/11827.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Raptis K. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. [Thesis]. IUPUI; 2016. Available from: http://hdl.handle.net/1805/11827

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Ontario Institute of Technology

9. Baeenh, Mohmmed. Multi-character prediction using attention.

Degree: 2020, University of Ontario Institute of Technology

We propose a computational attention approach to localize and classify characters in a sequence in a given image. Our approach combines spatial soft-attention with attention…

Subjects/Keywords: Computational Attention; Convolutional Neural Networks; Recurrent Neural Networks; Multi-Digit Classification; CAPTCHA


APA (6th Edition):

Baeenh, M. (2020). Multi-character prediction using attention. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/1132

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Baeenh, Mohmmed. “Multi-character prediction using attention.” 2020. Thesis, University of Ontario Institute of Technology. Accessed January 27, 2021. http://hdl.handle.net/10155/1132.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Baeenh, Mohmmed. “Multi-character prediction using attention.” 2020. Web. 27 Jan 2021.

Vancouver:

Baeenh M. Multi-character prediction using attention. [Internet] [Thesis]. University of Ontario Institute of Technology; 2020. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/10155/1132.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Baeenh M. Multi-character prediction using attention. [Thesis]. University of Ontario Institute of Technology; 2020. Available from: http://hdl.handle.net/10155/1132

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Tampere University

10. Zhou, Yi. Sentiment classification with deep neural networks.

Degree: 2019, Tampere University

Sentiment classification is an important task in the Natural Language Processing (NLP) area. Deep neural networks have become the mainstream method for text sentiment classification…

Subjects/Keywords: deep neural networks; convolutional neural network; recurrent neural network; sentiment classification; hotel reviews; TripAdvisor


APA (6th Edition):

Zhou, Y. (2019). Sentiment classification with deep neural networks. (Masters Thesis). Tampere University. Retrieved from https://trepo.tuni.fi//handle/10024/116148

Chicago Manual of Style (16th Edition):

Zhou, Yi. “Sentiment classification with deep neural networks.” 2019. Masters Thesis, Tampere University. Accessed January 27, 2021. https://trepo.tuni.fi//handle/10024/116148.

MLA Handbook (7th Edition):

Zhou, Yi. “Sentiment classification with deep neural networks.” 2019. Web. 27 Jan 2021.

Vancouver:

Zhou Y. Sentiment classification with deep neural networks. [Internet] [Masters thesis]. Tampere University; 2019. [cited 2021 Jan 27]. Available from: https://trepo.tuni.fi//handle/10024/116148.

Council of Science Editors:

Zhou Y. Sentiment classification with deep neural networks. [Masters Thesis]. Tampere University; 2019. Available from: https://trepo.tuni.fi//handle/10024/116148


Georgia Tech

11. Youmans, Michael Thomas. Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks.

Degree: PhD, Biomedical Engineering (Joint GT/Emory Department), 2019, Georgia Tech

There is a growing need to deal with increasing rates of resistance to antibiotics among pathogenic bacteria. The development of resistance in bacteria to current…

Subjects/Keywords: Recurrent neural networks; Antibacterial peptides; Antimicrobial peptides; Long short-term memory


APA (6th Edition):

Youmans, M. T. (2019). Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/61210

Chicago Manual of Style (16th Edition):

Youmans, Michael Thomas. “Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks.” 2019. Doctoral Dissertation, Georgia Tech. Accessed January 27, 2021. http://hdl.handle.net/1853/61210.

MLA Handbook (7th Edition):

Youmans, Michael Thomas. “Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks.” 2019. Web. 27 Jan 2021.

Vancouver:

Youmans MT. Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks. [Internet] [Doctoral dissertation]. Georgia Tech; 2019. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1853/61210.

Council of Science Editors:

Youmans MT. Identifying and generating candidate antibacterial peptides with long short-term memory recurrent neural networks. [Doctoral Dissertation]. Georgia Tech; 2019. Available from: http://hdl.handle.net/1853/61210


UCLA

12. Hardy, Nicholas. Neural network dynamics of temporal processing.

Degree: Neuroscience, 2018, UCLA

Time is centrally involved in most tasks the brain performs. However, the neurobiological mechanisms of timing remain a mystery. Signatures of temporal processing related to…

Subjects/Keywords: Neurosciences; Calcium imaging; Cortex; Recurrent neural networks; Theory; Timing


APA (6th Edition):

Hardy, N. (2018). Neural network dynamics of temporal processing. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/9kd0p4h8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hardy, Nicholas. “Neural network dynamics of temporal processing.” 2018. Thesis, UCLA. Accessed January 27, 2021. http://www.escholarship.org/uc/item/9kd0p4h8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hardy, Nicholas. “Neural network dynamics of temporal processing.” 2018. Web. 27 Jan 2021.

Vancouver:

Hardy N. Neural network dynamics of temporal processing. [Internet] [Thesis]. UCLA; 2018. [cited 2021 Jan 27]. Available from: http://www.escholarship.org/uc/item/9kd0p4h8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hardy N. Neural network dynamics of temporal processing. [Thesis]. UCLA; 2018. Available from: http://www.escholarship.org/uc/item/9kd0p4h8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Victoria University of Wellington

13. Chandra, Rohitash. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.

Degree: 2012, Victoria University of Wellington

One way to train neural networks is to use evolutionary algorithms such as cooperative coevolution - a method that decomposes the network's learnable parameters into…

Subjects/Keywords: Neural networks; Cooperative coevolution; Recurrent network; Co-operative co-evolution


APA (6th Edition):

Chandra, R. (2012). Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. (Doctoral Dissertation). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/2110

Chicago Manual of Style (16th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Doctoral Dissertation, Victoria University of Wellington. Accessed January 27, 2021. http://hdl.handle.net/10063/2110.

MLA Handbook (7th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Web. 27 Jan 2021.

Vancouver:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Internet] [Doctoral dissertation]. Victoria University of Wellington; 2012. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/10063/2110.

Council of Science Editors:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Doctoral Dissertation]. Victoria University of Wellington; 2012. Available from: http://hdl.handle.net/10063/2110


Delft University of Technology

14. Joosse, Corniël. Absence seizure prediction using recurrent neural networks.

Degree: 2020, Delft University of Technology

Absence seizures have a real-life impact on epileptic subjects, as day-to-day tasks can be suddenly interrupted, making for dangerous situations. Though a lot of work…

Subjects/Keywords: Recurrent Neural Networks; Absence seizure; Seizure prediction; Machine learning


APA (6th Edition):

Joosse, C. (2020). Absence seizure prediction using recurrent neural networks. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:e752232f-cd6b-47c4-ba70-899a020b7e6d

Chicago Manual of Style (16th Edition):

Joosse, Corniël. “Absence seizure prediction using recurrent neural networks.” 2020. Masters Thesis, Delft University of Technology. Accessed January 27, 2021. http://resolver.tudelft.nl/uuid:e752232f-cd6b-47c4-ba70-899a020b7e6d.

MLA Handbook (7th Edition):

Joosse, Corniël. “Absence seizure prediction using recurrent neural networks.” 2020. Web. 27 Jan 2021.

Vancouver:

Joosse C. Absence seizure prediction using recurrent neural networks. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2021 Jan 27]. Available from: http://resolver.tudelft.nl/uuid:e752232f-cd6b-47c4-ba70-899a020b7e6d.

Council of Science Editors:

Joosse C. Absence seizure prediction using recurrent neural networks. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:e752232f-cd6b-47c4-ba70-899a020b7e6d


Tampere University

15. Liukkonen, Mikko. Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic.

Degree: 2017, Tampere University

Data roaming in the LTE network has become popular because people want to continue using the same services abroad as in their home network. Hence,…

Subjects/Keywords: recurrent neural networks; LTE roaming; Diameter signalling; pattern recognition


APA (6th Edition):

Liukkonen, M. (2017). Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic. (Masters Thesis). Tampere University. Retrieved from https://trepo.tuni.fi/handle/10024/100951

Chicago Manual of Style (16th Edition):

Liukkonen, Mikko. “Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic.” 2017. Masters Thesis, Tampere University. Accessed January 27, 2021. https://trepo.tuni.fi/handle/10024/100951.

MLA Handbook (7th Edition):

Liukkonen, Mikko. “Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic.” 2017. Web. 27 Jan 2021.

Vancouver:

Liukkonen M. Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic. [Internet] [Masters thesis]. Tampere University; 2017. [cited 2021 Jan 27]. Available from: https://trepo.tuni.fi/handle/10024/100951.

Council of Science Editors:

Liukkonen M. Recurrent neural network model for detecting Diameter signalling patterns in LTE control-plane traffic. [Masters Thesis]. Tampere University; 2017. Available from: https://trepo.tuni.fi/handle/10024/100951


Rice University

16. Michalenko, Joshua James. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.

Degree: MS, Electrical & Computer Eng., 2019, Rice University

We investigate the internal representations that a recurrent neural network (RNN) uses while learning to recognize a regular formal language. Specifically, we train an RNN…

Subjects/Keywords: Language recognition; Recurrent Neural Networks; Representation Learning; deterministic finite automaton; automaton


APA (6th Edition):

Michalenko, J. J. (2019). Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/105422

Chicago Manual of Style (16th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Masters Thesis, Rice University. Accessed January 27, 2021. http://hdl.handle.net/1911/105422.

MLA Handbook (7th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Web. 27 Jan 2021.

Vancouver:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Internet] [Masters thesis]. Rice University; 2019. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1911/105422.

Council of Science Editors:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Masters Thesis]. Rice University; 2019. Available from: http://hdl.handle.net/1911/105422


Delft University of Technology

17. Samad, Azlaan Mustafa. Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.

Degree: 2020, Delft University of Technology

 In today’s scenario, due to rapid urbanisation, there has been a shift of population from rural to urban areas, especially in developing countries, in search… (more)

Subjects/Keywords: Deep Reinforcement Learning; Deep Q-Network; Recurrent Neural Networks

APA (6th Edition):

Samad, A. M. (2020). Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe

Chicago Manual of Style (16th Edition):

Samad, Azlaan Mustafa. “Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.” 2020. Masters Thesis, Delft University of Technology. Accessed January 27, 2021. http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe.

MLA Handbook (7th Edition):

Samad, Azlaan Mustafa. “Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.” 2020. Web. 27 Jan 2021.

Vancouver:

Samad AM. Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2021 Jan 27]. Available from: http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe.

Council of Science Editors:

Samad AM. Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe


Georgia Tech

18. Chatterjee, Anirban. A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data.

Degree: PhD, Civil and Environmental Engineering, 2019, Georgia Tech

 This thesis aims to combine data from accurate but expensive 3D laser scanners and inexpensive smartphones for improving pavement condition estimation and forecasting using both… (more)

Subjects/Keywords: Pavement condition; Smartphone; 3D pavement data; Recurrent neural networks

APA (6th Edition):

Chatterjee, A. (2019). A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/63550

Chicago Manual of Style (16th Edition):

Chatterjee, Anirban. “A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data.” 2019. Doctoral Dissertation, Georgia Tech. Accessed January 27, 2021. http://hdl.handle.net/1853/63550.

MLA Handbook (7th Edition):

Chatterjee, Anirban. “A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data.” 2019. Web. 27 Jan 2021.

Vancouver:

Chatterjee A. A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data. [Internet] [Doctoral dissertation]. Georgia Tech; 2019. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1853/63550.

Council of Science Editors:

Chatterjee A. A methodology for quantifying and improving pavement condition estimation and forecasting by integrating smartphone and 3D laser data. [Doctoral Dissertation]. Georgia Tech; 2019. Available from: http://hdl.handle.net/1853/63550


University of Ottawa

19. Parthiban, Dwarak Govind. On the Softmax Bottleneck of Word-Level Recurrent Language Models.

Degree: MCS, Génie / Engineering, 2020, University of Ottawa

 For different input contexts (sequence of previous words), to predict the next word, a neural word-level language model outputs a probability distribution over all the… (more)

Subjects/Keywords: Language Models; Softmax Bottleneck; Recurrent Neural Networks; AWD-LSTM; Generalized SigSoftmax

APA (6th Edition):

Parthiban, D. G. (2020). On the Softmax Bottleneck of Word-Level Recurrent Language Models. (Masters Thesis). University of Ottawa. Retrieved from http://dx.doi.org/10.20381/ruor-25636

Chicago Manual of Style (16th Edition):

Parthiban, Dwarak Govind. “On the Softmax Bottleneck of Word-Level Recurrent Language Models.” 2020. Masters Thesis, University of Ottawa. Accessed January 27, 2021. http://dx.doi.org/10.20381/ruor-25636.

MLA Handbook (7th Edition):

Parthiban, Dwarak Govind. “On the Softmax Bottleneck of Word-Level Recurrent Language Models.” 2020. Web. 27 Jan 2021.

Vancouver:

Parthiban DG. On the Softmax Bottleneck of Word-Level Recurrent Language Models. [Internet] [Masters thesis]. University of Ottawa; 2020. [cited 2021 Jan 27]. Available from: http://dx.doi.org/10.20381/ruor-25636.

Council of Science Editors:

Parthiban DG. On the Softmax Bottleneck of Word-Level Recurrent Language Models. [Masters Thesis]. University of Ottawa; 2020. Available from: http://dx.doi.org/10.20381/ruor-25636


Rice University

20. Nguyen, Minh Tan. On the Momentum-based Methods for Training and Designing Deep Neural Networks.

Degree: PhD, Engineering, 2020, Rice University

 Training and designing deep neural networks (DNNs) is an art that often involves expensive search over candidate architectures and optimization algorithms. In my thesis, we… (more)

Subjects/Keywords: momentum methods; scheduled restart SGD; recurrent neural networks

APA (6th Edition):

Nguyen, M. T. (2020). On the Momentum-based Methods for Training and Designing Deep Neural Networks. (Doctoral Dissertation). Rice University. Retrieved from http://hdl.handle.net/1911/109343

Chicago Manual of Style (16th Edition):

Nguyen, Minh Tan. “On the Momentum-based Methods for Training and Designing Deep Neural Networks.” 2020. Doctoral Dissertation, Rice University. Accessed January 27, 2021. http://hdl.handle.net/1911/109343.

MLA Handbook (7th Edition):

Nguyen, Minh Tan. “On the Momentum-based Methods for Training and Designing Deep Neural Networks.” 2020. Web. 27 Jan 2021.

Vancouver:

Nguyen MT. On the Momentum-based Methods for Training and Designing Deep Neural Networks. [Internet] [Doctoral dissertation]. Rice University; 2020. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1911/109343.

Council of Science Editors:

Nguyen MT. On the Momentum-based Methods for Training and Designing Deep Neural Networks. [Doctoral Dissertation]. Rice University; 2020. Available from: http://hdl.handle.net/1911/109343


Brandeis University

21. Garimella, Manaswini. Detecting article errors in English learner essays with recurrent neural networks.

Degree: 2016, Brandeis University

 Article and determiner errors are common in the writing of English language learners, but automated systems of detecting and correcting them can be challenging to… (more)

Subjects/Keywords: computational linguistics; English language learning; grammatical error correction; recurrent neural networks

APA (6th Edition):

Garimella, M. (2016). Detecting article errors in English learner essays with recurrent neural networks. (Thesis). Brandeis University. Retrieved from http://hdl.handle.net/10192/32890

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Garimella, Manaswini. “Detecting article errors in English learner essays with recurrent neural networks.” 2016. Thesis, Brandeis University. Accessed January 27, 2021. http://hdl.handle.net/10192/32890.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Garimella, Manaswini. “Detecting article errors in English learner essays with recurrent neural networks.” 2016. Web. 27 Jan 2021.

Vancouver:

Garimella M. Detecting article errors in English learner essays with recurrent neural networks. [Internet] [Thesis]. Brandeis University; 2016. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/10192/32890.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Garimella M. Detecting article errors in English learner essays with recurrent neural networks. [Thesis]. Brandeis University; 2016. Available from: http://hdl.handle.net/10192/32890

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

22. Trischler, Adam Peter. A Computational Model for Episodic Memory Inspired by the Brain.

Degree: PhD, 2016, University of Toronto

 Memory is a pillar of intelligence, and to think like us, it may be that artificial systems must remember like us. This dissertation introduces a… (more)

Subjects/Keywords: deep learning; dynamical systems; episodic memory; recurrent neural networks; 0984

APA (6th Edition):

Trischler, A. P. (2016). A Computational Model for Episodic Memory Inspired by the Brain. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/73201

Chicago Manual of Style (16th Edition):

Trischler, Adam Peter. “A Computational Model for Episodic Memory Inspired by the Brain.” 2016. Doctoral Dissertation, University of Toronto. Accessed January 27, 2021. http://hdl.handle.net/1807/73201.

MLA Handbook (7th Edition):

Trischler, Adam Peter. “A Computational Model for Episodic Memory Inspired by the Brain.” 2016. Web. 27 Jan 2021.

Vancouver:

Trischler AP. A Computational Model for Episodic Memory Inspired by the Brain. [Internet] [Doctoral dissertation]. University of Toronto; 2016. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1807/73201.

Council of Science Editors:

Trischler AP. A Computational Model for Episodic Memory Inspired by the Brain. [Doctoral Dissertation]. University of Toronto; 2016. Available from: http://hdl.handle.net/1807/73201


Georgia Tech

23. Kim, Young Jin. A deep learning and parallel simulation methodology for air traffic management.

Degree: PhD, Aerospace Engineering, 2017, Georgia Tech

 Air traffic management is widely studied in several different fields because of its complexity and criticality to a variety of stakeholders including passengers, airlines, regulatory… (more)

Subjects/Keywords: parallel simulation; recurrent neural networks; air traffic management

APA (6th Edition):

Kim, Y. J. (2017). A deep learning and parallel simulation methodology for air traffic management. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/59180

Chicago Manual of Style (16th Edition):

Kim, Young Jin. “A deep learning and parallel simulation methodology for air traffic management.” 2017. Doctoral Dissertation, Georgia Tech. Accessed January 27, 2021. http://hdl.handle.net/1853/59180.

MLA Handbook (7th Edition):

Kim, Young Jin. “A deep learning and parallel simulation methodology for air traffic management.” 2017. Web. 27 Jan 2021.

Vancouver:

Kim YJ. A deep learning and parallel simulation methodology for air traffic management. [Internet] [Doctoral dissertation]. Georgia Tech; 2017. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1853/59180.

Council of Science Editors:

Kim YJ. A deep learning and parallel simulation methodology for air traffic management. [Doctoral Dissertation]. Georgia Tech; 2017. Available from: http://hdl.handle.net/1853/59180


Rice University

24. Nguyen, Minh Tan. On the Momentum-based Methods for Training and Designing Deep Neural Networks.

Degree: PhD, Electrical & Computer Eng., 2020, Rice University

 Training and designing deep neural networks (DNNs) is an art that often involves expensive search over candidate architectures and optimization algorithms. In my thesis, we… (more)

Subjects/Keywords: momentum methods; scheduled restart SGD; recurrent neural networks

APA (6th Edition):

Nguyen, M. T. (2020). On the Momentum-based Methods for Training and Designing Deep Neural Networks. (Doctoral Dissertation). Rice University. Retrieved from http://hdl.handle.net/1911/109344

Chicago Manual of Style (16th Edition):

Nguyen, Minh Tan. “On the Momentum-based Methods for Training and Designing Deep Neural Networks.” 2020. Doctoral Dissertation, Rice University. Accessed January 27, 2021. http://hdl.handle.net/1911/109344.

MLA Handbook (7th Edition):

Nguyen, Minh Tan. “On the Momentum-based Methods for Training and Designing Deep Neural Networks.” 2020. Web. 27 Jan 2021.

Vancouver:

Nguyen MT. On the Momentum-based Methods for Training and Designing Deep Neural Networks. [Internet] [Doctoral dissertation]. Rice University; 2020. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1911/109344.

Council of Science Editors:

Nguyen MT. On the Momentum-based Methods for Training and Designing Deep Neural Networks. [Doctoral Dissertation]. Rice University; 2020. Available from: http://hdl.handle.net/1911/109344


ETH Zürich

25. Neil, Daniel. Deep Neural Networks and Hardware Systems for Event-driven Data.

Degree: 2017, ETH Zürich

 Event-based sensors, built with biological inspiration, differ greatly from traditional sensor types. A standard vision sensor uses a pixel array to produce a frame containing… (more)

Subjects/Keywords: Deep Neural Networks; Event-driven sensors; Deep neural networks (DNNs); Spiking deep neural networks; Recurrent Neural Networks; Convolutional neural networks; info:eu-repo/classification/ddc/4; Data processing, computer science

APA (6th Edition):

Neil, D. (2017). Deep Neural Networks and Hardware Systems for Event-driven Data. (Doctoral Dissertation). ETH Zürich. Retrieved from http://hdl.handle.net/20.500.11850/168865

Chicago Manual of Style (16th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Doctoral Dissertation, ETH Zürich. Accessed January 27, 2021. http://hdl.handle.net/20.500.11850/168865.

MLA Handbook (7th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Web. 27 Jan 2021.

Vancouver:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Internet] [Doctoral dissertation]. ETH Zürich; 2017. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/20.500.11850/168865.

Council of Science Editors:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Doctoral Dissertation]. ETH Zürich; 2017. Available from: http://hdl.handle.net/20.500.11850/168865


Delft University of Technology

26. Voss, Sander. Application of Deep Learning for Spacecraft Fault Detection and Isolation.

Degree: 2019, Delft University of Technology

Spacecraft require high availability, autonomous operation, and a high degree of mission success. Spacecraft use sensors, such as star trackers and GPS, and actuators, such… (more)

Subjects/Keywords: FDI; FDIR; Fault Detection; Fault Isolation; Deep Learning; Recurrent networks; Recurrent Neural Network; long short-term memory networks; LSTM

APA (6th Edition):

Voss, S. (2019). Application of Deep Learning for Spacecraft Fault Detection and Isolation. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853

Chicago Manual of Style (16th Edition):

Voss, Sander. “Application of Deep Learning for Spacecraft Fault Detection and Isolation.” 2019. Masters Thesis, Delft University of Technology. Accessed January 27, 2021. http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853.

MLA Handbook (7th Edition):

Voss, Sander. “Application of Deep Learning for Spacecraft Fault Detection and Isolation.” 2019. Web. 27 Jan 2021.

Vancouver:

Voss S. Application of Deep Learning for Spacecraft Fault Detection and Isolation. [Internet] [Masters thesis]. Delft University of Technology; 2019. [cited 2021 Jan 27]. Available from: http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853.

Council of Science Editors:

Voss S. Application of Deep Learning for Spacecraft Fault Detection and Isolation. [Masters Thesis]. Delft University of Technology; 2019. Available from: http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853


Rice University

27. Cone, Ian. Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure.

Degree: MS, Natural Sciences, 2020, Rice University

 The ability to express and learn temporal sequences is an essential part of neural learning and memory. Learned temporal sequences are expressed in multiple brain… (more)

Subjects/Keywords: sequence learning; sequence recall; neural circuits; non-Markovian sequences; biophysically realistic models; recurrent neural networks

APA (6th Edition):

Cone, I. (2020). Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/108773

Chicago Manual of Style (16th Edition):

Cone, Ian. “Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure.” 2020. Masters Thesis, Rice University. Accessed January 27, 2021. http://hdl.handle.net/1911/108773.

MLA Handbook (7th Edition):

Cone, Ian. “Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure.” 2020. Web. 27 Jan 2021.

Vancouver:

Cone I. Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure. [Internet] [Masters thesis]. Rice University; 2020. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/1911/108773.

Council of Science Editors:

Cone I. Learning precise spatiotemporal sequences via biophysically realistic neural circuits with modular structure. [Masters Thesis]. Rice University; 2020. Available from: http://hdl.handle.net/1911/108773


University of Plymouth

28. Carmantini, Giovanni Sirio. Dynamical systems theory for transparent symbolic computation in neuronal networks.

Degree: PhD, 2017, University of Plymouth

 In this thesis, we explore the interface between symbolic and dynamical system computation, with particular regard to dynamical system models of neuronal networks. In doing… (more)

Subjects/Keywords: 006.3; Automata Theory; Recurrent Neural Networks; Representation Theory; Neural Symbolic Computation; Dynamical Systems; Symbolic Dynamics

APA (6th Edition):

Carmantini, G. S. (2017). Dynamical systems theory for transparent symbolic computation in neuronal networks. (Doctoral Dissertation). University of Plymouth. Retrieved from http://hdl.handle.net/10026.1/8647

Chicago Manual of Style (16th Edition):

Carmantini, Giovanni Sirio. “Dynamical systems theory for transparent symbolic computation in neuronal networks.” 2017. Doctoral Dissertation, University of Plymouth. Accessed January 27, 2021. http://hdl.handle.net/10026.1/8647.

MLA Handbook (7th Edition):

Carmantini, Giovanni Sirio. “Dynamical systems theory for transparent symbolic computation in neuronal networks.” 2017. Web. 27 Jan 2021.

Vancouver:

Carmantini GS. Dynamical systems theory for transparent symbolic computation in neuronal networks. [Internet] [Doctoral dissertation]. University of Plymouth; 2017. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/10026.1/8647.

Council of Science Editors:

Carmantini GS. Dynamical systems theory for transparent symbolic computation in neuronal networks. [Doctoral Dissertation]. University of Plymouth; 2017. Available from: http://hdl.handle.net/10026.1/8647

29. Johansson, Simon. Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules.

Degree: Chalmers tekniska högskola / Institutionen för data och informationsteknik, 2019, Chalmers University of Technology

 In the area of in silico drug discovery, deep learning has grown immensely as a field of research. Recurrent neural networks (RNNs) are one of… (more)

Subjects/Keywords: Computer science; machine learning; recurrent neural networks; GRU; LSTM; differentiable neural computer; engineering; project; thesis

APA (6th Edition):

Johansson, S. (2019). Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules. (Thesis). Chalmers University of Technology. Retrieved from http://hdl.handle.net/20.500.12380/300419

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Johansson, Simon. “Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules.” 2019. Thesis, Chalmers University of Technology. Accessed January 27, 2021. http://hdl.handle.net/20.500.12380/300419.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Johansson, Simon. “Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules.” 2019. Web. 27 Jan 2021.

Vancouver:

Johansson S. Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules. [Internet] [Thesis]. Chalmers University of Technology; 2019. [cited 2021 Jan 27]. Available from: http://hdl.handle.net/20.500.12380/300419.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Johansson S. Differentiable Neural Computers for in silico molecular design: Benchmarks of architectures in generative modeling of molecules. [Thesis]. Chalmers University of Technology; 2019. Available from: http://hdl.handle.net/20.500.12380/300419

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

30. Tirumaladasu, Sai Subhakar. Autonomous Driving: Traffic Sign Classification.

Degree: 2019, Department of Applied Signal Processing

  Autonomous Driving and Advanced Driver Assistance Systems (ADAS) are revolutionizing the way we drive and the future of mobility. Among ADAS, Traffic Sign Classification… (more)

Subjects/Keywords: Autonomous Driving; Deep Learning; Image Processing; Convolutional Neural Networks; Recurrent Neural Networks; Generative Adversarial Networks; Engineering and Technology; Teknik och teknologier

APA (6th Edition):

Tirumaladasu, S. S. (2019). Autonomous Driving: Traffic Sign Classification. (Thesis). Department of Applied Signal Processing. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tirumaladasu, Sai Subhakar. “Autonomous Driving: Traffic Sign Classification.” 2019. Thesis, Department of Applied Signal Processing. Accessed January 27, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tirumaladasu, Sai Subhakar. “Autonomous Driving: Traffic Sign Classification.” 2019. Web. 27 Jan 2021.

Vancouver:

Tirumaladasu SS. Autonomous Driving: Traffic Sign Classification. [Internet] [Thesis]. Department of Applied Signal Processing; 2019. [cited 2021 Jan 27]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tirumaladasu SS. Autonomous Driving: Traffic Sign Classification. [Thesis]. Department of Applied Signal Processing; 2019. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
