You searched for subject:(Recurrent Neural Network). Showing records 1 – 30 of 161 total matches.

Texas A&M University

1. Fan, David Dawei. Backpropagation for Continuous Theta Neuron Networks.

Degree: MS, Electrical Engineering, 2015, Texas A&M University

 The Theta neuron model is a spiking neuron model which, unlike traditional Leaky-Integrate-and-Fire neurons, can model spike latencies, threshold adaptation, bistability of resting and tonic… (more)

Subjects/Keywords: Spiking Neural Network; Backpropagation; Recurrent Neural Network; Neural Network; Theta Neuron

APA (6th Edition):

Fan, D. D. (2015). Backpropagation for Continuous Theta Neuron Networks. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/186998

Chicago Manual of Style (16th Edition):

Fan, David Dawei. “Backpropagation for Continuous Theta Neuron Networks.” 2015. Masters Thesis, Texas A&M University. Accessed March 07, 2021. http://hdl.handle.net/1969.1/186998.

MLA Handbook (7th Edition):

Fan, David Dawei. “Backpropagation for Continuous Theta Neuron Networks.” 2015. Web. 07 Mar 2021.

Vancouver:

Fan DD. Backpropagation for Continuous Theta Neuron Networks. [Internet] [Masters thesis]. Texas A&M University; 2015. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/1969.1/186998.

Council of Science Editors:

Fan DD. Backpropagation for Continuous Theta Neuron Networks. [Masters Thesis]. Texas A&M University; 2015. Available from: http://hdl.handle.net/1969.1/186998


Penn State University

2. Lin, Tao. A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER.

Degree: 2018, Penn State University

 Triage analysis is a fundamental stage in cyber operations in Security Operations Centers (SOCs). The massive data sources generate great demands on cyber security analysts'… (more)

Subjects/Keywords: Recurrent Neural Network; Machine Learning; Retrieval; Security

APA (6th Edition):

Lin, T. (2018). A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER. (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/14787txl78

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lin, Tao. “A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER.” 2018. Thesis, Penn State University. Accessed March 07, 2021. https://submit-etda.libraries.psu.edu/catalog/14787txl78.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lin, Tao. “A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER.” 2018. Web. 07 Mar 2021.

Vancouver:

Lin T. A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER. [Internet] [Thesis]. Penn State University; 2018. [cited 2021 Mar 07]. Available from: https://submit-etda.libraries.psu.edu/catalog/14787txl78.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lin T. A DATA TRIAGE RETRIEVAL SYSTEM FOR CYBER SECURITY OPERATIONS CENTER. [Thesis]. Penn State University; 2018. Available from: https://submit-etda.libraries.psu.edu/catalog/14787txl78

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Mid Sweden University

3. Wang, Xutao. Chinese Text Classification Based On Deep Learning.

Degree: Information Systems and Technology, 2018, Mid Sweden University

  Text classification has always been a concern in area of natural language processing, especially nowadays the data are getting massive due to the development… (more)

Subjects/Keywords: Text classification; Recurrent neural network; Convolutional neural network; Computer Systems; Datorsystem

APA (6th Edition):

Wang, X. (2018). Chinese Text Classification Based On Deep Learning. (Thesis). Mid Sweden University. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-35322

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Xutao. “Chinese Text Classification Based On Deep Learning.” 2018. Thesis, Mid Sweden University. Accessed March 07, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-35322.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Xutao. “Chinese Text Classification Based On Deep Learning.” 2018. Web. 07 Mar 2021.

Vancouver:

Wang X. Chinese Text Classification Based On Deep Learning. [Internet] [Thesis]. Mid Sweden University; 2018. [cited 2021 Mar 07]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-35322.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang X. Chinese Text Classification Based On Deep Learning. [Thesis]. Mid Sweden University; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-35322

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


York University

4. Bidgoli, Rohollah Soltani. Higher Order Recurrent Neural Network for Language Modeling.

Degree: MSc -MS, Computer Science, 2016, York University

 In this thesis, we study novel neural network structures to better model long term dependency in sequential data. We propose to use more memory units… (more)

Subjects/Keywords: Computer science; Machine Learning; Deep Learning; Neural Network; Recurrent Neural Network; Language Modeling; Higher Order Recurrent Neural Network

APA (6th Edition):

Bidgoli, R. S. (2016). Higher Order Recurrent Neural Network for Language Modeling. (Masters Thesis). York University. Retrieved from http://hdl.handle.net/10315/32337

Chicago Manual of Style (16th Edition):

Bidgoli, Rohollah Soltani. “Higher Order Recurrent Neural Network for Language Modeling.” 2016. Masters Thesis, York University. Accessed March 07, 2021. http://hdl.handle.net/10315/32337.

MLA Handbook (7th Edition):

Bidgoli, Rohollah Soltani. “Higher Order Recurrent Neural Network for Language Modeling.” 2016. Web. 07 Mar 2021.

Vancouver:

Bidgoli RS. Higher Order Recurrent Neural Network for Language Modeling. [Internet] [Masters thesis]. York University; 2016. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/10315/32337.

Council of Science Editors:

Bidgoli RS. Higher Order Recurrent Neural Network for Language Modeling. [Masters Thesis]. York University; 2016. Available from: http://hdl.handle.net/10315/32337


Tampere University

5. Zhou, Yi. Sentiment classification with deep neural networks.

Degree: 2019, Tampere University

 Sentiment classification is an important task in Natural Language Processing (NLP) area. Deep neural networks become the mainstream method to perform the text sentiment classification… (more)

Subjects/Keywords: deep neural networks; convolutional neural network; recurrent neural network; sentiment classification; hotel reviews; TripAdvisor

APA (6th Edition):

Zhou, Y. (2019). Sentiment classification with deep neural networks. (Masters Thesis). Tampere University. Retrieved from https://trepo.tuni.fi//handle/10024/116148

Chicago Manual of Style (16th Edition):

Zhou, Yi. “Sentiment classification with deep neural networks.” 2019. Masters Thesis, Tampere University. Accessed March 07, 2021. https://trepo.tuni.fi//handle/10024/116148.

MLA Handbook (7th Edition):

Zhou, Yi. “Sentiment classification with deep neural networks.” 2019. Web. 07 Mar 2021.

Vancouver:

Zhou Y. Sentiment classification with deep neural networks. [Internet] [Masters thesis]. Tampere University; 2019. [cited 2021 Mar 07]. Available from: https://trepo.tuni.fi//handle/10024/116148.

Council of Science Editors:

Zhou Y. Sentiment classification with deep neural networks. [Masters Thesis]. Tampere University; 2019. Available from: https://trepo.tuni.fi//handle/10024/116148


University of Illinois – Urbana-Champaign

6. Yan, Zhicheng. Image recognition, semantic segmentation and photo adjustment using deep neural networks.

Degree: PhD, Computer Science, 2016, University of Illinois – Urbana-Champaign

 Deep Neural Networks (DNNs) have proven to be effective models for solving various problems in computer vision. Multi-Layer Perceptron Networks, Convolutional Neural Networks and Recurrent(more)

Subjects/Keywords: Deep Neural Network; Image Recognition; Semantic Segmentation; Photo Adjustment; Convolutional Neural Network; Recurrent Neural Network; Multi-Layer Perceptron Network

APA (6th Edition):

Yan, Z. (2016). Image recognition, semantic segmentation and photo adjustment using deep neural networks. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/90724

Chicago Manual of Style (16th Edition):

Yan, Zhicheng. “Image recognition, semantic segmentation and photo adjustment using deep neural networks.” 2016. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed March 07, 2021. http://hdl.handle.net/2142/90724.

MLA Handbook (7th Edition):

Yan, Zhicheng. “Image recognition, semantic segmentation and photo adjustment using deep neural networks.” 2016. Web. 07 Mar 2021.

Vancouver:

Yan Z. Image recognition, semantic segmentation and photo adjustment using deep neural networks. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2016. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/2142/90724.

Council of Science Editors:

Yan Z. Image recognition, semantic segmentation and photo adjustment using deep neural networks. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2016. Available from: http://hdl.handle.net/2142/90724


Carnegie Mellon University

7. Le, Ngan Thi Hoang. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.

Degree: 2018, Carnegie Mellon University

 Semantic labeling is becoming more and more popular among researchers in computer vision and machine learning. Many applications, such as autonomous driving, tracking, indoor navigation,… (more)

Subjects/Keywords: Gated Recurrent Unit; Level Set; Recurrent Neural Networks; Residual Network; Scene Labeling; Semantic Instance Segmentation

APA (6th Edition):

Le, N. T. H. (2018). Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Thesis, Carnegie Mellon University. Accessed March 07, 2021. http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Web. 07 Mar 2021.

Vancouver:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Internet] [Thesis]. Carnegie Mellon University; 2018. [cited 2021 Mar 07]. Available from: http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Thesis]. Carnegie Mellon University; 2018. Available from: http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

8. Sarika, Pawan Kumar. Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews.

Degree: 2020, Faculty of Computing

  Today, we are living in a data-driven world. Due to a surge in data generation, there is a need for efficient and accurate techniques… (more)

Subjects/Keywords: Gated recurrent unit; Multiclass classification; Movie reviews; Sentiment Analysis; Recurrent neural network; Computer Systems; Datorsystem

APA (6th Edition):

Sarika, P. K. (2020). Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews. (Thesis). Faculty of Computing. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20213

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sarika, Pawan Kumar. “Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews.” 2020. Thesis, Faculty of Computing. Accessed March 07, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20213.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sarika, Pawan Kumar. “Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews.” 2020. Web. 07 Mar 2021.

Vancouver:

Sarika PK. Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews. [Internet] [Thesis]. Faculty of Computing; 2020. [cited 2021 Mar 07]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20213.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sarika PK. Comparing LSTM and GRU for Multiclass Sentiment Analysis of Movie Reviews. [Thesis]. Faculty of Computing; 2020. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20213

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Addis Ababa University

9. Tewodros, Kibatu. Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction.

Degree: 2019, Addis Ababa University

 Global network infrastructures are increasing with the development of new technologies and growth in Internet traffic. As network infrastructures increases, maintaining and monitoring them will… (more)

Subjects/Keywords: Base Transceiver Station; Gated Recurrent Unit; Long Short Term Memory; Recurrent Neural Network

APA (6th Edition):

Tewodros, K. (2019). Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction. (Thesis). Addis Ababa University. Retrieved from http://etd.aau.edu.et/handle/123456789/21111

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tewodros, Kibatu. “Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction.” 2019. Thesis, Addis Ababa University. Accessed March 07, 2021. http://etd.aau.edu.et/handle/123456789/21111.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tewodros, Kibatu. “Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction.” 2019. Web. 07 Mar 2021.

Vancouver:

Tewodros K. Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction. [Internet] [Thesis]. Addis Ababa University; 2019. [cited 2021 Mar 07]. Available from: http://etd.aau.edu.et/handle/123456789/21111.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tewodros K. Recurrent Neural Network-based Base Transceiver Station Power System Failure Prediction. [Thesis]. Addis Ababa University; 2019. Available from: http://etd.aau.edu.et/handle/123456789/21111

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of California – Berkeley

10. Thanapirom, Chayut. Neural Representation Learning with Denoising Autoencoder Framework.

Degree: Physics, 2016, University of California – Berkeley

 Understanding of how the brain works and how it can solve difficult problems like image recognition is very important, especially for the progress in developing… (more)

Subjects/Keywords: Biophysics; Neurosciences; Attractor Network; Denoising Autoencoder; Grid Cells; Neural Representation; Recurrent Neural Network; Sparse Coding

APA (6th Edition):

Thanapirom, C. (2016). Neural Representation Learning with Denoising Autoencoder Framework. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/0hm6p6s5

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Thanapirom, Chayut. “Neural Representation Learning with Denoising Autoencoder Framework.” 2016. Thesis, University of California – Berkeley. Accessed March 07, 2021. http://www.escholarship.org/uc/item/0hm6p6s5.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Thanapirom, Chayut. “Neural Representation Learning with Denoising Autoencoder Framework.” 2016. Web. 07 Mar 2021.

Vancouver:

Thanapirom C. Neural Representation Learning with Denoising Autoencoder Framework. [Internet] [Thesis]. University of California – Berkeley; 2016. [cited 2021 Mar 07]. Available from: http://www.escholarship.org/uc/item/0hm6p6s5.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Thanapirom C. Neural Representation Learning with Denoising Autoencoder Framework. [Thesis]. University of California – Berkeley; 2016. Available from: http://www.escholarship.org/uc/item/0hm6p6s5

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Bridgeport

11. Hassan, Abdalraouf. Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks.

Degree: 2018, University of Bridgeport

 The evolution of the social media and the e-commerce sites produces a massive amount of unstructured text data on the internet. Thus, there is a… (more)

Subjects/Keywords: Convolutional neural network; Deep learning; Machine learning; Natural language processing; Recurrent neural network; Sentiment analysis

APA (6th Edition):

Hassan, A. (2018). Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks. (Thesis). University of Bridgeport. Retrieved from https://scholarworks.bridgeport.edu/xmlui/handle/123456789/2274

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hassan, Abdalraouf. “Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks.” 2018. Thesis, University of Bridgeport. Accessed March 07, 2021. https://scholarworks.bridgeport.edu/xmlui/handle/123456789/2274.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hassan, Abdalraouf. “Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks.” 2018. Web. 07 Mar 2021.

Vancouver:

Hassan A. Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks. [Internet] [Thesis]. University of Bridgeport; 2018. [cited 2021 Mar 07]. Available from: https://scholarworks.bridgeport.edu/xmlui/handle/123456789/2274.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hassan A. Deep Neural Language Model for Text Classification Based on Convolutional and Recurrent Neural Networks. [Thesis]. University of Bridgeport; 2018. Available from: https://scholarworks.bridgeport.edu/xmlui/handle/123456789/2274

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Victoria University of Wellington

12. Chandra, Rohitash. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.

Degree: 2012, Victoria University of Wellington

 One way to train neural networks is to use evolutionary algorithms such as cooperative coevolution - a method that decomposes the network's learnable parameters into… (more)

Subjects/Keywords: Neural networks; Cooperative coevolution; Recurrent network; Co-operative co-evolution

APA (6th Edition):

Chandra, R. (2012). Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. (Doctoral Dissertation). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/2110

Chicago Manual of Style (16th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Doctoral Dissertation, Victoria University of Wellington. Accessed March 07, 2021. http://hdl.handle.net/10063/2110.

MLA Handbook (7th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Web. 07 Mar 2021.

Vancouver:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Internet] [Doctoral dissertation]. Victoria University of Wellington; 2012. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/10063/2110.

Council of Science Editors:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Doctoral Dissertation]. Victoria University of Wellington; 2012. Available from: http://hdl.handle.net/10063/2110


Georgia Tech

13. Chen, Hua. Single channel speech enhancement with residual learning and recurrent network.

Degree: MS, Electrical and Computer Engineering, 2020, Georgia Tech

 For speech enhancement tasks, non-stationary noise such as babble noise is much harder to suppress than stationary noise. In low SNR environment, it is even… (more)

Subjects/Keywords: Speech enhancement; Machine learning; ResNet; Convolutional recurrent neural network

APA (6th Edition):

Chen, H. (2020). Single channel speech enhancement with residual learning and recurrent network. (Masters Thesis). Georgia Tech. Retrieved from http://hdl.handle.net/1853/62839

Chicago Manual of Style (16th Edition):

Chen, Hua. “Single channel speech enhancement with residual learning and recurrent network.” 2020. Masters Thesis, Georgia Tech. Accessed March 07, 2021. http://hdl.handle.net/1853/62839.

MLA Handbook (7th Edition):

Chen, Hua. “Single channel speech enhancement with residual learning and recurrent network.” 2020. Web. 07 Mar 2021.

Vancouver:

Chen H. Single channel speech enhancement with residual learning and recurrent network. [Internet] [Masters thesis]. Georgia Tech; 2020. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/1853/62839.

Council of Science Editors:

Chen H. Single channel speech enhancement with residual learning and recurrent network. [Masters Thesis]. Georgia Tech; 2020. Available from: http://hdl.handle.net/1853/62839


UCLA

14. Li, Siyuan. Application of Recurrent Neural Networks In Toxic Comment Classification.

Degree: Statistics, 2018, UCLA

 Moderators of online discussion forums often struggle with controlling extremist comments on their platforms. To help provide an efficient and accurate tool to detect online… (more)

Subjects/Keywords: Statistics; classification; natural language processing; recurrent neural network; word2vec

APA (6th Edition):

Li, S. (2018). Application of Recurrent Neural Networks In Toxic Comment Classification. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/5f87h061

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Siyuan. “Application of Recurrent Neural Networks In Toxic Comment Classification.” 2018. Thesis, UCLA. Accessed March 07, 2021. http://www.escholarship.org/uc/item/5f87h061.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Siyuan. “Application of Recurrent Neural Networks In Toxic Comment Classification.” 2018. Web. 07 Mar 2021.

Vancouver:

Li S. Application of Recurrent Neural Networks In Toxic Comment Classification. [Internet] [Thesis]. UCLA; 2018. [cited 2021 Mar 07]. Available from: http://www.escholarship.org/uc/item/5f87h061.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li S. Application of Recurrent Neural Networks In Toxic Comment Classification. [Thesis]. UCLA; 2018. Available from: http://www.escholarship.org/uc/item/5f87h061

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Delft University of Technology

15. Mulder, Boris (author). Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report.

Degree: 2019, Delft University of Technology

Very complex flows can be expensive to compute using current CFD techniques. In this thesis, models based on deep learning were used to replace certain… (more)

Subjects/Keywords: Aerodynamics; CFD; Deep Learning; Latent Space; Autoencoder; Recurrent Neural Network

APA (6th Edition):

Mulder, B. (2019). Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:fbf93d6e-211f-4b92-a057-956d694db315

Chicago Manual of Style (16th Edition):

Mulder, Boris (author). “Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report.” 2019. Masters Thesis, Delft University of Technology. Accessed March 07, 2021. http://resolver.tudelft.nl/uuid:fbf93d6e-211f-4b92-a057-956d694db315.

MLA Handbook (7th Edition):

Mulder, Boris (author). “Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report.” 2019. Web. 07 Mar 2021.

Vancouver:

Mulder B. Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report. [Internet] [Masters thesis]. Delft University of Technology; 2019. [cited 2021 Mar 07]. Available from: http://resolver.tudelft.nl/uuid:fbf93d6e-211f-4b92-a057-956d694db315.

Council of Science Editors:

Mulder B. Latent Space Modelling of Unsteady Flow Subdomains: Thesis Report. [Masters Thesis]. Delft University of Technology; 2019. Available from: http://resolver.tudelft.nl/uuid:fbf93d6e-211f-4b92-a057-956d694db315


Delft University of Technology

16. Samad, Azlaan Mustafa (author). Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.

Degree: 2020, Delft University of Technology

 In today’s scenario due to rapid urbanisation there has been a shift of population from rural to urban areas especially in developing countries in search… (more)

Subjects/Keywords: Deep Reinforcement Learning; Deep Q-Network; Recurrent Neural Networks

APA (6th Edition):

Samad, A. M. (2020). Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe

Chicago Manual of Style (16th Edition):

Samad, Azlaan Mustafa (author). “Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.” 2020. Masters Thesis, Delft University of Technology. Accessed March 07, 2021. http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe.

MLA Handbook (7th Edition):

Samad, Azlaan Mustafa (author). “Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands.” 2020. Web. 07 Mar 2021.

Vancouver:

Samad AM. Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2021 Mar 07]. Available from: http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe.

Council of Science Editors:

Samad AM. Multi Agent Deep Recurrent Q-Learning for Different Traffic Demands. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:84d20f53-3be7-4e85-8588-b92b962b32fe


University of Texas – Austin

17. Zhong, Shijing. A review on constrained recurrent sparse auto-encoder.

Degree: MS in Computational Science, Engineering, and Mathematics, Computational Science, Engineering, and Mathematics, 2020, University of Texas – Austin

 Sparse Dictionary Learning generates a sparse representation for images and signals along with a generalized learned dictionary. We examine closely to the constrained recurrent sparse… (more)

Subjects/Keywords: Convolutional dictionary learning; FISTA; Recurrent neural network; Encoder-decoder

APA (6th Edition):

Zhong, S. (2020). A review on constrained recurrent sparse auto-encoder. (Masters Thesis). University of Texas – Austin. Retrieved from http://dx.doi.org/10.26153/tsw/10925

Chicago Manual of Style (16th Edition):

Zhong, Shijing. “A review on constrained recurrent sparse auto-encoder.” 2020. Masters Thesis, University of Texas – Austin. Accessed March 07, 2021. http://dx.doi.org/10.26153/tsw/10925.

MLA Handbook (7th Edition):

Zhong, Shijing. “A review on constrained recurrent sparse auto-encoder.” 2020. Web. 07 Mar 2021.

Vancouver:

Zhong S. A review on constrained recurrent sparse auto-encoder. [Internet] [Masters thesis]. University of Texas – Austin; 2020. [cited 2021 Mar 07]. Available from: http://dx.doi.org/10.26153/tsw/10925.

Council of Science Editors:

Zhong S. A review on constrained recurrent sparse auto-encoder. [Masters Thesis]. University of Texas – Austin; 2020. Available from: http://dx.doi.org/10.26153/tsw/10925


University of New Mexico

18. Goudarzi, Alireza. Theory and Practice of Computing with Excitable Dynamics.

Degree: Department of Computer Science, 2016, University of New Mexico

  Reservoir computing (RC) is a promising paradigm for time series processing. In this paradigm, the desired output is computed by combining measurements of an… (more)

Subjects/Keywords: reservoir computing; recurrent neural network; excitable dynamics; dynamical systems; Computer Sciences

APA (6th Edition):

Goudarzi, A. (2016). Theory and Practice of Computing with Excitable Dynamics. (Doctoral Dissertation). University of New Mexico. Retrieved from https://digitalrepository.unm.edu/cs_etds/81

Chicago Manual of Style (16th Edition):

Goudarzi, Alireza. “Theory and Practice of Computing with Excitable Dynamics.” 2016. Doctoral Dissertation, University of New Mexico. Accessed March 07, 2021. https://digitalrepository.unm.edu/cs_etds/81.

MLA Handbook (7th Edition):

Goudarzi, Alireza. “Theory and Practice of Computing with Excitable Dynamics.” 2016. Web. 07 Mar 2021.

Vancouver:

Goudarzi A. Theory and Practice of Computing with Excitable Dynamics. [Internet] [Doctoral dissertation]. University of New Mexico; 2016. [cited 2021 Mar 07]. Available from: https://digitalrepository.unm.edu/cs_etds/81.

Council of Science Editors:

Goudarzi A. Theory and Practice of Computing with Excitable Dynamics. [Doctoral Dissertation]. University of New Mexico; 2016. Available from: https://digitalrepository.unm.edu/cs_etds/81


University of Waterloo

19. Ruvinov, Igor. Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys.

Degree: 2018, University of Waterloo

 Shape memory alloys (SMAs) are materials with extraordinary thermomechanical properties which have caused numerous engineering advances. NiTi SMAs in particular have been studied for decades… (more)

Subjects/Keywords: Recurrent neural network; Shape memory alloys; Artificial intelligence

APA (6th Edition):

Ruvinov, I. (2018). Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/13647

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ruvinov, Igor. “Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys.” 2018. Thesis, University of Waterloo. Accessed March 07, 2021. http://hdl.handle.net/10012/13647.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ruvinov, Igor. “Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys.” 2018. Web. 07 Mar 2021.

Vancouver:

Ruvinov I. Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys. [Internet] [Thesis]. University of Waterloo; 2018. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/10012/13647.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ruvinov I. Recurrent Neural Network Dual Resistance Control of Multiple Memory Shape Memory Alloys. [Thesis]. University of Waterloo; 2018. Available from: http://hdl.handle.net/10012/13647

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

20. Herzfeld, David James. Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks.

Degree: 2011, Marquette University

 Recurrently connected neural networks, in which synaptic connections between neurons can form directed cycles, have been used extensively in the literature to describe various neurophysiological… (more)

Subjects/Keywords: hemodynamic; neural network; recurrent; spiking; Biomedical Engineering and Bioengineering

APA (6th Edition):

Herzfeld, D. J. (2011). Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks. (Thesis). Marquette University. Retrieved from https://epublications.marquette.edu/theses_open/102

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Herzfeld, David James. “Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks.” 2011. Thesis, Marquette University. Accessed March 07, 2021. https://epublications.marquette.edu/theses_open/102.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Herzfeld, David James. “Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks.” 2011. Web. 07 Mar 2021.

Vancouver:

Herzfeld DJ. Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks. [Internet] [Thesis]. Marquette University; 2011. [cited 2021 Mar 07]. Available from: https://epublications.marquette.edu/theses_open/102.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Herzfeld DJ. Modeling and Computational Framework for the Specification and Simulation of Large-scale Spiking Neural Networks. [Thesis]. Marquette University; 2011. Available from: https://epublications.marquette.edu/theses_open/102

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Texas A&M University

21. Wang, Han. Dynamic Analysis of Recurrent Neural Networks.

Degree: PhD, Computer Science, 2020, Texas A&M University

 With the advancement in deep learning research, neural networks have become one of the most powerful tools for artificial intelligence tasks. More specifically, recurrent neural(more)

Subjects/Keywords: Recurrent Neural Network; machine learning; deep learning; dynamical system

APA (6th Edition):

Wang, H. (2020). Dynamic Analysis of Recurrent Neural Networks. (Doctoral Dissertation). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/191907

Chicago Manual of Style (16th Edition):

Wang, Han. “Dynamic Analysis of Recurrent Neural Networks.” 2020. Doctoral Dissertation, Texas A&M University. Accessed March 07, 2021. http://hdl.handle.net/1969.1/191907.

MLA Handbook (7th Edition):

Wang, Han. “Dynamic Analysis of Recurrent Neural Networks.” 2020. Web. 07 Mar 2021.

Vancouver:

Wang H. Dynamic Analysis of Recurrent Neural Networks. [Internet] [Doctoral dissertation]. Texas A&M University; 2020. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/1969.1/191907.

Council of Science Editors:

Wang H. Dynamic Analysis of Recurrent Neural Networks. [Doctoral Dissertation]. Texas A&M University; 2020. Available from: http://hdl.handle.net/1969.1/191907

22. Ellis, Robert. Leveraging local and global word context for multi-label document classification.

Degree: 2020, Athabasca University

With the increasing volume of text documents, it is crucial to identify the themes and topics contained within. Labelling documents with the identified topics is… (more)

Subjects/Keywords: Recurrent; Convolutional; Neural network; Classification; Attention; Hierarchy; Ensemble; Siamese

APA (6th Edition):

Ellis, R. (2020). Leveraging local and global word context for multi-label document classification. (Thesis). Athabasca University. Retrieved from http://hdl.handle.net/10791/334

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ellis, Robert. “Leveraging local and global word context for multi-label document classification.” 2020. Thesis, Athabasca University. Accessed March 07, 2021. http://hdl.handle.net/10791/334.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ellis, Robert. “Leveraging local and global word context for multi-label document classification.” 2020. Web. 07 Mar 2021.

Vancouver:

Ellis R. Leveraging local and global word context for multi-label document classification. [Internet] [Thesis]. Athabasca University; 2020. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/10791/334.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ellis R. Leveraging local and global word context for multi-label document classification. [Thesis]. Athabasca University; 2020. Available from: http://hdl.handle.net/10791/334

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

23. Lundström, Oscar. Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation.

Degree: Chalmers tekniska högskola / Institutionen för mekanik och maritima vetenskaper, 2020, Chalmers University of Technology

 In the current paradigm of sensor fusion orientation estimation from inertial measurements unit sensor data is done using techniques derived with Bayesian statistics. These derivations… (more)

Subjects/Keywords: sensor-fusion; state estimation; absolute orientation estimation; recurrent neural filter; recurrent neural network; RNN; LSTM; IMU; MARG

APA (6th Edition):

Lundström, O. (2020). Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation. (Thesis). Chalmers University of Technology. Retrieved from http://hdl.handle.net/20.500.12380/300918

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lundström, Oscar. “Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation.” 2020. Thesis, Chalmers University of Technology. Accessed March 07, 2021. http://hdl.handle.net/20.500.12380/300918.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lundström, Oscar. “Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation.” 2020. Web. 07 Mar 2021.

Vancouver:

Lundström O. Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation. [Internet] [Thesis]. Chalmers University of Technology; 2020. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/20.500.12380/300918.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lundström O. Learning a Better Attitude: A Recurrent Neural Filter for Orientation Estimation. [Thesis]. Chalmers University of Technology; 2020. Available from: http://hdl.handle.net/20.500.12380/300918

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

24. Oskarsson, Gustav. Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk.

Degree: 2019, Department of Industrial Economics

This study is about prediction of the stockmarket through a comparison of neural networks and statistical models. The study aims to improve the accuracy… (more)

Subjects/Keywords: neural network; stock market; recurrent neural network; neuralt nätverk; aktiemarknad; recurrent neural network; Other Engineering and Technologies not elsewhere specified; Övrig annan teknik

APA (6th Edition):

Oskarsson, G. (2019). Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk. (Thesis). Department of Industrial Economics. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18214

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Oskarsson, Gustav. “Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk.” 2019. Thesis, Department of Industrial Economics. Accessed March 07, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18214.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Oskarsson, Gustav. “Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk.” 2019. Web. 07 Mar 2021.

Vancouver:

Oskarsson G. Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk. [Internet] [Thesis]. Department of Industrial Economics; 2019. [cited 2021 Mar 07]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18214.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Oskarsson G. Aktieprediktion med neurala nätverk : En jämförelse av statistiska modeller, neurala nätverk och kombinerade neurala nätverk. [Thesis]. Department of Industrial Economics; 2019. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-18214

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Oklahoma State University

25. Phan, Manh Cong. Recurrent Neural Networks: Error Surface Analysis and Improved Training.

Degree: Electrical Engineering, 2014, Oklahoma State University

Recurrent neural networks (RNNs) have powerful computational abilities and could be used in a variety of applications; however, training these networks is still a difficult… (more)

Subjects/Keywords: error surface; neural control; recurrent neural network; spurious valley; system identification; training

APA (6th Edition):

Phan, M. C. (2014). Recurrent Neural Networks: Error Surface Analysis and Improved Training. (Thesis). Oklahoma State University. Retrieved from http://hdl.handle.net/11244/15063

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Phan, Manh Cong. “Recurrent Neural Networks: Error Surface Analysis and Improved Training.” 2014. Thesis, Oklahoma State University. Accessed March 07, 2021. http://hdl.handle.net/11244/15063.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Phan, Manh Cong. “Recurrent Neural Networks: Error Surface Analysis and Improved Training.” 2014. Web. 07 Mar 2021.

Vancouver:

Phan MC. Recurrent Neural Networks: Error Surface Analysis and Improved Training. [Internet] [Thesis]. Oklahoma State University; 2014. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/11244/15063.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Phan MC. Recurrent Neural Networks: Error Surface Analysis and Improved Training. [Thesis]. Oklahoma State University; 2014. Available from: http://hdl.handle.net/11244/15063

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Louisiana State University

26. Firth, Robert James. A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting.

Degree: PhD, Computer Sciences, 2016, Louisiana State University

 Numerical weather prediction is a computationally expensive task that requires not only the numerical solution to a complex set of non-linear partial differential equations, but… (more)

Subjects/Keywords: time series; spatial; temporal; time step network; convolutional; recurrent; neural network; weather forecasting; ocean forecasting

APA (6th Edition):

Firth, R. J. (2016). A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting. (Doctoral Dissertation). Louisiana State University. Retrieved from etd-04112016-151259 ; https://digitalcommons.lsu.edu/gradschool_dissertations/2099

Chicago Manual of Style (16th Edition):

Firth, Robert James. “A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting.” 2016. Doctoral Dissertation, Louisiana State University. Accessed March 07, 2021. etd-04112016-151259 ; https://digitalcommons.lsu.edu/gradschool_dissertations/2099.

MLA Handbook (7th Edition):

Firth, Robert James. “A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting.” 2016. Web. 07 Mar 2021.

Vancouver:

Firth RJ. A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting. [Internet] [Doctoral dissertation]. Louisiana State University; 2016. [cited 2021 Mar 07]. Available from: etd-04112016-151259 ; https://digitalcommons.lsu.edu/gradschool_dissertations/2099.

Council of Science Editors:

Firth RJ. A Novel Recurrent Convolutional Neural Network for Ocean and Weather Forecasting. [Doctoral Dissertation]. Louisiana State University; 2016. Available from: etd-04112016-151259 ; https://digitalcommons.lsu.edu/gradschool_dissertations/2099


The Ohio State University

27. Zheng, Yilin. Text-Based Speech Video Synthesis from a Single Face Image.

Degree: MS, Electrical and Computer Engineering, 2019, The Ohio State University

 Speech video synthesis is a task to generate talking characters which look realistic to human evaluators. Previously, most of the studies used animation models and… (more)

Subjects/Keywords: Computer Science; Computer Engineering; Face Image Synthesis; Generative Adversarial Network; Recurrent Neural Network

APA (6th Edition):

Zheng, Y. (2019). Text-Based Speech Video Synthesis from a Single Face Image. (Masters Thesis). The Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1572168353691788

Chicago Manual of Style (16th Edition):

Zheng, Yilin. “Text-Based Speech Video Synthesis from a Single Face Image.” 2019. Masters Thesis, The Ohio State University. Accessed March 07, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1572168353691788.

MLA Handbook (7th Edition):

Zheng, Yilin. “Text-Based Speech Video Synthesis from a Single Face Image.” 2019. Web. 07 Mar 2021.

Vancouver:

Zheng Y. Text-Based Speech Video Synthesis from a Single Face Image. [Internet] [Masters thesis]. The Ohio State University; 2019. [cited 2021 Mar 07]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1572168353691788.

Council of Science Editors:

Zheng Y. Text-Based Speech Video Synthesis from a Single Face Image. [Masters Thesis]. The Ohio State University; 2019. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1572168353691788


Delft University of Technology

28. Voss, Sander (author). Application of Deep Learning for Spacecraft Fault Detection and Isolation.

Degree: 2019, Delft University of Technology

Spacecraft require high availability, autonomous operation, and a high degree of mission success. Spacecraft use sensors, such as star trackers and GPS, and actuators, such… (more)

Subjects/Keywords: FDI; FDIR; Fault Detection; Fault Isolation; Deep Learning; Recurrent networks; Recurrent Neural Network; long short-term memory networks; LSTM

APA (6th Edition):

Voss, S. (2019). Application of Deep Learning for Spacecraft Fault Detection and Isolation. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853

Chicago Manual of Style (16th Edition):

Voss, Sander (author). “Application of Deep Learning for Spacecraft Fault Detection and Isolation.” 2019. Masters Thesis, Delft University of Technology. Accessed March 07, 2021. http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853.

MLA Handbook (7th Edition):

Voss, Sander (author). “Application of Deep Learning for Spacecraft Fault Detection and Isolation.” 2019. Web. 07 Mar 2021.

Vancouver:

Voss S. Application of Deep Learning for Spacecraft Fault Detection and Isolation. [Internet] [Masters thesis]. Delft University of Technology; 2019. [cited 2021 Mar 07]. Available from: http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853.

Council of Science Editors:

Voss S. Application of Deep Learning for Spacecraft Fault Detection and Isolation. [Masters Thesis]. Delft University of Technology; 2019. Available from: http://resolver.tudelft.nl/uuid:7c308a4b-f97b-4a83-b739-4019ad306853


Brno University of Technology

29. Huf, Petr. Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading.

Degree: 2019, Brno University of Technology

 Successful stock trading is a dream of many people. Eletronic trading is an interesting branch of this business. The trading strategy runs on the computer… (more)

Subjects/Keywords: neuronová síť; rekurentní neuronová síť; burza; obchodování; trh; profit; model; neural network; recurrent neural network; stock exchange; trading; market; profit; model

APA (6th Edition):

Huf, P. (2019). Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/56492

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Huf, Petr. “Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading.” 2019. Thesis, Brno University of Technology. Accessed March 07, 2021. http://hdl.handle.net/11012/56492.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Huf, Petr. “Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading.” 2019. Web. 07 Mar 2021.

Vancouver:

Huf P. Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading. [Internet] [Thesis]. Brno University of Technology; 2019. [cited 2021 Mar 07]. Available from: http://hdl.handle.net/11012/56492.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Huf P. Machine Learning Strategies in Electronic Trading: Machine Learning Strategies in Electronic Trading. [Thesis]. Brno University of Technology; 2019. Available from: http://hdl.handle.net/11012/56492

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

30. 이, 유림. Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning.

Degree: 2019, Ajou University

According to the National Health Insurance Service(NHIS) in 2013, 3.9% of adults aged 30 years or older have chronic kidney disease, and 16.5% are over… (more)

Subjects/Keywords: Electronic Health Records; Convolutional Neural Network; Recurrent Neural Network; Embedding; Attention; 전자의무기록; 순환 신경망; 합성곱 신경망; 임베딩 기법; 어텐션 메커니즘

APA (6th Edition):

이, . (2019). Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning. (Thesis). Ajou University. Retrieved from http://repository.ajou.ac.kr/handle/201003/17869 ; http://dcoll.ajou.ac.kr:9080/dcollection/jsp/common/DcLoOrgPer.jsp?sItemId=000000028811

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

이, 유림. “Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning.” 2019. Thesis, Ajou University. Accessed March 07, 2021. http://repository.ajou.ac.kr/handle/201003/17869 ; http://dcoll.ajou.ac.kr:9080/dcollection/jsp/common/DcLoOrgPer.jsp?sItemId=000000028811.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

이, 유림. “Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning.” 2019. Web. 07 Mar 2021.

Vancouver:

이 . Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning. [Internet] [Thesis]. Ajou University; 2019. [cited 2021 Mar 07]. Available from: http://repository.ajou.ac.kr/handle/201003/17869 ; http://dcoll.ajou.ac.kr:9080/dcollection/jsp/common/DcLoOrgPer.jsp?sItemId=000000028811.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

이 . Chronic Kidney Disease Risk Prediction using Electronic Health Records Pattern Information based on Deep Learning. [Thesis]. Ajou University; 2019. Available from: http://repository.ajou.ac.kr/handle/201003/17869 ; http://dcoll.ajou.ac.kr:9080/dcollection/jsp/common/DcLoOrgPer.jsp?sItemId=000000028811

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
