You searched for subject:(Recurrent neural networks). Showing records 1 – 30 of 152 total matches.

University of Waterloo

1. Caterini, Anthony. A Novel Mathematical Framework for the Analysis of Neural Networks.

Degree: 2017, University of Waterloo

 Over the past decade, Deep Neural Networks (DNNs) have become very popular models for processing large amounts of data because of their successful application in… (more)

Subjects/Keywords: Neural Networks; Convolutional Neural Networks; Deep Neural Networks; Machine Learning; Recurrent Neural Networks

APA (6th Edition):

Caterini, A. (2017). A Novel Mathematical Framework for the Analysis of Neural Networks. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12173

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Caterini, Anthony. “A Novel Mathematical Framework for the Analysis of Neural Networks.” 2017. Thesis, University of Waterloo. Accessed July 16, 2019. http://hdl.handle.net/10012/12173.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Caterini, Anthony. “A Novel Mathematical Framework for the Analysis of Neural Networks.” 2017. Web. 16 Jul 2019.

Vancouver:

Caterini A. A Novel Mathematical Framework for the Analysis of Neural Networks. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10012/12173.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Caterini A. A Novel Mathematical Framework for the Analysis of Neural Networks. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12173

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


The Ohio State University

2. Shao, Yuanlong. Learning Sparse Recurrent Neural Networks in Language Modeling.

Degree: MS, Computer Science and Engineering, 2014, The Ohio State University

 In the context of statistical language modeling, we explored the task of learning an Elman network with sparse weight matrices, as a pilot study towards… (more)

Subjects/Keywords: Computer Science; Artificial Intelligence; language modeling; recurrent neural networks; sparse recurrent neural networks

APA (6th Edition):

Shao, Y. (2014). Learning Sparse Recurrent Neural Networks in Language Modeling. (Masters Thesis). The Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373

Chicago Manual of Style (16th Edition):

Shao, Yuanlong. “Learning Sparse Recurrent Neural Networks in Language Modeling.” 2014. Masters Thesis, The Ohio State University. Accessed July 16, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373.

MLA Handbook (7th Edition):

Shao, Yuanlong. “Learning Sparse Recurrent Neural Networks in Language Modeling.” 2014. Web. 16 Jul 2019.

Vancouver:

Shao Y. Learning Sparse Recurrent Neural Networks in Language Modeling. [Internet] [Masters thesis]. The Ohio State University; 2014. [cited 2019 Jul 16]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373.

Council of Science Editors:

Shao Y. Learning Sparse Recurrent Neural Networks in Language Modeling. [Masters Thesis]. The Ohio State University; 2014. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1398942373


ETH Zürich

3. Neil, Daniel. Deep Neural Networks and Hardware Systems for Event-driven Data.

Degree: 2017, ETH Zürich

 Event-based sensors, built with biological inspiration, differ greatly from traditional sensor types. A standard vision sensor uses a pixel array to produce a frame containing… (more)

Subjects/Keywords: Deep Neural Networks; Event-driven sensors; Deep neural networks (DNNs); Spiking deep neural networks; Recurrent Neural Networks; Convolutional neural networks

APA (6th Edition):

Neil, D. (2017). Deep Neural Networks and Hardware Systems for Event-driven Data. (Doctoral Dissertation). ETH Zürich. Retrieved from http://hdl.handle.net/20.500.11850/168865

Chicago Manual of Style (16th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Doctoral Dissertation, ETH Zürich. Accessed July 16, 2019. http://hdl.handle.net/20.500.11850/168865.

MLA Handbook (7th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Web. 16 Jul 2019.

Vancouver:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Internet] [Doctoral dissertation]. ETH Zürich; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/20.500.11850/168865.

Council of Science Editors:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Doctoral Dissertation]. ETH Zürich; 2017. Available from: http://hdl.handle.net/20.500.11850/168865


NSYSU

4. Wang, Hao-Yi. The Impacts of Image Contexts on Dialogue Systems.

Degree: Master, Information Management, 2018, NSYSU

 Chatting with machines is not only possible but also more and more common in our lives these days. With the approach, we can execute commands… (more)

Subjects/Keywords: Dialogue; Convolutional neural networks; Recurrent neural networks; Image recognition; Natural language; Neural networks; Machine learning

APA (6th Edition):

Wang, H. (2018). The Impacts of Image Contexts on Dialogue Systems. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Hao-Yi. “The Impacts of Image Contexts on Dialogue Systems.” 2018. Thesis, NSYSU. Accessed July 16, 2019. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Hao-Yi. “The Impacts of Image Contexts on Dialogue Systems.” 2018. Web. 16 Jul 2019.

Vancouver:

Wang H. The Impacts of Image Contexts on Dialogue Systems. [Internet] [Thesis]. NSYSU; 2018. [cited 2019 Jul 16]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang H. The Impacts of Image Contexts on Dialogue Systems. [Thesis]. NSYSU; 2018. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0616118-181354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Ottawa

5. Ayoub, Issa. Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks .

Degree: 2019, University of Ottawa

 Affective computing has gained significant attention from researchers in the last decade due to the wide variety of applications that can benefit from this technology.… (more)

Subjects/Keywords: Temporal Convolutional Neural Networks; Recurrent Neural Networks; Gaussian Processes; Hyperparameter Optimization; Convolutional Neural Networks

APA (6th Edition):

Ayoub, I. (2019). Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks . (Thesis). University of Ottawa. Retrieved from http://hdl.handle.net/10393/39337

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ayoub, Issa. “Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks .” 2019. Thesis, University of Ottawa. Accessed July 16, 2019. http://hdl.handle.net/10393/39337.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ayoub, Issa. “Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks .” 2019. Web. 16 Jul 2019.

Vancouver:

Ayoub I. Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks . [Internet] [Thesis]. University of Ottawa; 2019. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10393/39337.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ayoub I. Multimodal Affective Computing Using Temporal Convolutional Neural Network and Deep Convolutional Neural Networks . [Thesis]. University of Ottawa; 2019. Available from: http://hdl.handle.net/10393/39337

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Carnegie Mellon University

6. Le, Ngan Thi Hoang. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.

Degree: 2018, Carnegie Mellon University

 Semantic labeling is becoming more and more popular among researchers in computer vision and machine learning. Many applications, such as autonomous driving, tracking, indoor navigation,… (more)

Subjects/Keywords: Gated Recurrent Unit; Level Set; Recurrent Neural Networks; Residual Network; Scene Labeling; Semantic Instance Segmentation

APA (6th Edition):

Le, N. T. H. (2018). Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Thesis, Carnegie Mellon University. Accessed July 16, 2019. http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Le, Ngan Thi Hoang. “Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling.” 2018. Web. 16 Jul 2019.

Vancouver:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Internet] [Thesis]. Carnegie Mellon University; 2018. [cited 2019 Jul 16]. Available from: http://repository.cmu.edu/dissertations/1166.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Le NTH. Contextual Recurrent Level Set Networks and Recurrent Residual Networks for Semantic Labeling. [Thesis]. Carnegie Mellon University; 2018. Available from: http://repository.cmu.edu/dissertations/1166

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

7. Lustig, Joakim. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.

Degree: Computer Science and Communication (CSC), 2016, KTH

 Dyslexia affects between 5% and 17% of all school children, making it the most common learning disability. It has been found to severely affect learning ability in… (more)

Subjects/Keywords: dyslexia; machine learning; neural networks; recurrent neural networks; Computer Sciences; Datavetenskap (datalogi)

APA (6th Edition):

Lustig, J. (2016). Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lustig, Joakim. “Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.” 2016. Thesis, KTH. Accessed July 16, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lustig, Joakim. “Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns.” 2016. Web. 16 Jul 2019.

Vancouver:

Lustig J. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. [Internet] [Thesis]. KTH; 2016. [cited 2019 Jul 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lustig J. Identifying dyslectic gaze pattern : Comparison of methods for identifying dyslectic readers based on eye movement patterns. [Thesis]. KTH; 2016. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-191233

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of California – San Diego

8. Tripathi, Subarna. Improving Object Detection and Segmentation by Utilizing Context.

Degree: Electrical Engineering (Signal and Image Proc), 2018, University of California – San Diego

 Object detection and segmentation are important computer vision problems that have applications in several domains such as autonomous driving, virtual and augmented reality systems, human-computer… (more)

Subjects/Keywords: Computer science; Convolutional Neural Networks; Deep Learning; Object Detection; Recurrent Neural Networks; Segmentation; Video Processing

APA (6th Edition):

Tripathi, S. (2018). Improving Object Detection and Segmentation by Utilizing Context. (Thesis). University of California – San Diego. Retrieved from http://www.escholarship.org/uc/item/5955t4nq

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tripathi, Subarna. “Improving Object Detection and Segmentation by Utilizing Context.” 2018. Thesis, University of California – San Diego. Accessed July 16, 2019. http://www.escholarship.org/uc/item/5955t4nq.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tripathi, Subarna. “Improving Object Detection and Segmentation by Utilizing Context.” 2018. Web. 16 Jul 2019.

Vancouver:

Tripathi S. Improving Object Detection and Segmentation by Utilizing Context. [Internet] [Thesis]. University of California – San Diego; 2018. [cited 2019 Jul 16]. Available from: http://www.escholarship.org/uc/item/5955t4nq.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tripathi S. Improving Object Detection and Segmentation by Utilizing Context. [Thesis]. University of California – San Diego; 2018. Available from: http://www.escholarship.org/uc/item/5955t4nq

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


IUPUI

9. Raptis, Konstantinos. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.

Degree: 2016, IUPUI

Indiana University-Purdue University Indianapolis (IUPUI)

Action recognition has been an active research topic for over three decades. There are various applications of action recognition, such… (more)

Subjects/Keywords: Action Recognition; Dense Trajectories; R-CNN; LSTM RNN; Convolution Neural Networks; Recurrent Neural Networks

APA (6th Edition):

Raptis, K. (2016). The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. (Thesis). IUPUI. Retrieved from http://hdl.handle.net/1805/11827

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Raptis, Konstantinos. “The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.” 2016. Thesis, IUPUI. Accessed July 16, 2019. http://hdl.handle.net/1805/11827.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Raptis, Konstantinos. “The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet.” 2016. Web. 16 Jul 2019.

Vancouver:

Raptis K. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. [Internet] [Thesis]. IUPUI; 2016. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1805/11827.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Raptis K. The clash between two worlds in human action recognition: supervised feature training vs Recurrent ConvNet. [Thesis]. IUPUI; 2016. Available from: http://hdl.handle.net/1805/11827

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

10. Li, Yunpeng. Word Representation Using A Deep Neural Network.

Degree: 2016, University of Toronto

Word representation or word embedding is an important step in understanding languages. It maps similar words to vectors that are close in space. In this… (more)

Subjects/Keywords: Morphology; Natural Language Processing; Recurrent Neural Networks; Recursive Neural Networks; Word Embedding; 0800

APA (6th Edition):

Li, Y. (2016). Word Representation Using A Deep Neural Network. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/75155

Chicago Manual of Style (16th Edition):

Li, Yunpeng. “Word Representation Using A Deep Neural Network.” 2016. Masters Thesis, University of Toronto. Accessed July 16, 2019. http://hdl.handle.net/1807/75155.

MLA Handbook (7th Edition):

Li, Yunpeng. “Word Representation Using A Deep Neural Network.” 2016. Web. 16 Jul 2019.

Vancouver:

Li Y. Word Representation Using A Deep Neural Network. [Internet] [Masters thesis]. University of Toronto; 2016. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1807/75155.

Council of Science Editors:

Li Y. Word Representation Using A Deep Neural Network. [Masters Thesis]. University of Toronto; 2016. Available from: http://hdl.handle.net/1807/75155


University of Colorado

11. Maierhofer, Jeffrey Matthew. Lifetime Limited Memory Neural Networks.

Degree: MS, 2019, University of Colorado

  In the modern digital environment, many data sources can be characterized as event sequences. These event sequences describe a series of events and an… (more)

Subjects/Keywords: events; neural networks; prediction; recurrent neural networks; Applied Mathematics; Computer Sciences; Statistics and Probability

APA (6th Edition):

Maierhofer, J. M. (2019). Lifetime Limited Memory Neural Networks. (Masters Thesis). University of Colorado. Retrieved from https://scholar.colorado.edu/appm_gradetds/149

Chicago Manual of Style (16th Edition):

Maierhofer, Jeffrey Matthew. “Lifetime Limited Memory Neural Networks.” 2019. Masters Thesis, University of Colorado. Accessed July 16, 2019. https://scholar.colorado.edu/appm_gradetds/149.

MLA Handbook (7th Edition):

Maierhofer, Jeffrey Matthew. “Lifetime Limited Memory Neural Networks.” 2019. Web. 16 Jul 2019.

Vancouver:

Maierhofer JM. Lifetime Limited Memory Neural Networks. [Internet] [Masters thesis]. University of Colorado; 2019. [cited 2019 Jul 16]. Available from: https://scholar.colorado.edu/appm_gradetds/149.

Council of Science Editors:

Maierhofer JM. Lifetime Limited Memory Neural Networks. [Masters Thesis]. University of Colorado; 2019. Available from: https://scholar.colorado.edu/appm_gradetds/149


Georgia Tech

12. Kim, Young Jin. A deep learning and parallel simulation methodology for air traffic management.

Degree: PhD, Aerospace Engineering, 2017, Georgia Tech

 Air traffic management is widely studied in several different fields because of its complexity and criticality to a variety of stakeholders including passengers, airlines, regulatory… (more)

Subjects/Keywords: parallel simulation; recurrent neural networks; air traffic management

APA (6th Edition):

Kim, Y. J. (2017). A deep learning and parallel simulation methodology for air traffic management. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/59180

Chicago Manual of Style (16th Edition):

Kim, Young Jin. “A deep learning and parallel simulation methodology for air traffic management.” 2017. Doctoral Dissertation, Georgia Tech. Accessed July 16, 2019. http://hdl.handle.net/1853/59180.

MLA Handbook (7th Edition):

Kim, Young Jin. “A deep learning and parallel simulation methodology for air traffic management.” 2017. Web. 16 Jul 2019.

Vancouver:

Kim YJ. A deep learning and parallel simulation methodology for air traffic management. [Internet] [Doctoral dissertation]. Georgia Tech; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1853/59180.

Council of Science Editors:

Kim YJ. A deep learning and parallel simulation methodology for air traffic management. [Doctoral Dissertation]. Georgia Tech; 2017. Available from: http://hdl.handle.net/1853/59180


University of Melbourne

13. Mhammedi, Zakaria. Efficient orthogonal parametrisation of recurrent neural networks using householder reflections.

Degree: 2017, University of Melbourne

 In machine learning, Recurrent Neural Networks (RNNs) have been successfully used in many applications. They are particularly well suited for problems involving time-series. This is… (more)

Subjects/Keywords: machine learning; recurrent neural networks; exploding gradients; orthogonal parametrisation

APA (6th Edition):

Mhammedi, Z. (2017). Efficient orthogonal parametrisation of recurrent neural networks using householder reflections. (Masters Thesis). University of Melbourne. Retrieved from http://hdl.handle.net/11343/192906

Chicago Manual of Style (16th Edition):

Mhammedi, Zakaria. “Efficient orthogonal parametrisation of recurrent neural networks using householder reflections.” 2017. Masters Thesis, University of Melbourne. Accessed July 16, 2019. http://hdl.handle.net/11343/192906.

MLA Handbook (7th Edition):

Mhammedi, Zakaria. “Efficient orthogonal parametrisation of recurrent neural networks using householder reflections.” 2017. Web. 16 Jul 2019.

Vancouver:

Mhammedi Z. Efficient orthogonal parametrisation of recurrent neural networks using householder reflections. [Internet] [Masters thesis]. University of Melbourne; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/11343/192906.

Council of Science Editors:

Mhammedi Z. Efficient orthogonal parametrisation of recurrent neural networks using householder reflections. [Masters Thesis]. University of Melbourne; 2017. Available from: http://hdl.handle.net/11343/192906


UCLA

14. Hardy, Nicholas. Neural network dynamics of temporal processing.

Degree: Neuroscience, 2018, UCLA

 Time is centrally involved in most tasks the brain performs. However, the neurobiological mechanisms of timing remain a mystery. Signatures of temporal processing related to… (more)

Subjects/Keywords: Neurosciences; Calcium imaging; Cortex; Recurrent neural networks; Theory; Timing

APA (6th Edition):

Hardy, N. (2018). Neural network dynamics of temporal processing. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/9kd0p4h8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hardy, Nicholas. “Neural network dynamics of temporal processing.” 2018. Thesis, UCLA. Accessed July 16, 2019. http://www.escholarship.org/uc/item/9kd0p4h8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hardy, Nicholas. “Neural network dynamics of temporal processing.” 2018. Web. 16 Jul 2019.

Vancouver:

Hardy N. Neural network dynamics of temporal processing. [Internet] [Thesis]. UCLA; 2018. [cited 2019 Jul 16]. Available from: http://www.escholarship.org/uc/item/9kd0p4h8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hardy N. Neural network dynamics of temporal processing. [Thesis]. UCLA; 2018. Available from: http://www.escholarship.org/uc/item/9kd0p4h8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Waterloo

15. Mohajerin, Nima. Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks.

Degree: 2017, University of Waterloo

 This thesis investigates the applicability of Recurrent Neural Networks (RNNs) and Deep Learning methods for multi-step prediction of robotic systems. The unmodeled dynamics and simplifying… (more)

Subjects/Keywords: recurrent neural networks; system identification; multi-step prediction; deep learning

APA (6th Edition):

Mohajerin, N. (2017). Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12766

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Mohajerin, Nima. “Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks.” 2017. Thesis, University of Waterloo. Accessed July 16, 2019. http://hdl.handle.net/10012/12766.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Mohajerin, Nima. “Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks.” 2017. Web. 16 Jul 2019.

Vancouver:

Mohajerin N. Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10012/12766.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Mohajerin N. Modeling Dynamic Systems for Multi-Step Prediction with Recurrent Neural Networks. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12766

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rice University

16. Michalenko, Joshua James. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.

Degree: MS, Engineering, 2019, Rice University

 We investigate the internal representations that a recurrent neural network (RNN) uses while learning to recognize a regular formal language. Specifically, we train an RNN… (more)

Subjects/Keywords: Language recognition; Recurrent Neural Networks; Representation Learning; deterministic finite automaton; automaton

APA (6th Edition):

Michalenko, J. J. (2019). Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/105421

Chicago Manual of Style (16th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Masters Thesis, Rice University. Accessed July 16, 2019. http://hdl.handle.net/1911/105421.

MLA Handbook (7th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Web. 16 Jul 2019.

Vancouver:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Internet] [Masters thesis]. Rice University; 2019. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1911/105421.

Council of Science Editors:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Masters Thesis]. Rice University; 2019. Available from: http://hdl.handle.net/1911/105421


Brandeis University

17. Garimella, Manaswini. Detecting article errors in English learner essays with recurrent neural networks.

Degree: 2016, Brandeis University

 Article and determiner errors are common in the writing of English language learners, but automated systems of detecting and correcting them can be challenging to… (more)

Subjects/Keywords: computational linguistics; English language learning; grammatical error correction; recurrent neural networks

APA (6th Edition):

Garimella, M. (2016). Detecting article errors in English learner essays with recurrent neural networks. (Thesis). Brandeis University. Retrieved from http://hdl.handle.net/10192/32890

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Garimella, Manaswini. “Detecting article errors in English learner essays with recurrent neural networks.” 2016. Thesis, Brandeis University. Accessed July 16, 2019. http://hdl.handle.net/10192/32890.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Garimella, Manaswini. “Detecting article errors in English learner essays with recurrent neural networks.” 2016. Web. 16 Jul 2019.

Vancouver:

Garimella M. Detecting article errors in English learner essays with recurrent neural networks. [Internet] [Thesis]. Brandeis University; 2016. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10192/32890.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Garimella M. Detecting article errors in English learner essays with recurrent neural networks. [Thesis]. Brandeis University; 2016. Available from: http://hdl.handle.net/10192/32890

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

18. Youmans, Michael Thomas. Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks.

Degree: PhD, Biomedical Engineering (Joint GT/Emory Department), 2019, Georgia Tech

 There is a growing need to deal with increasing rates of resistance to antibiotics among pathogenic bacteria. The development of resistance in bacteria to current… (more)

Subjects/Keywords: Recurrent Neural Networks; Antibacterial Peptides; Antimicrobial Peptides; Long Short-Term Memory

APA (6th Edition):

Youmans, M. T. (2019). Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/61210

Chicago Manual of Style (16th Edition):

Youmans, Michael Thomas. “Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks.” 2019. Doctoral Dissertation, Georgia Tech. Accessed July 16, 2019. http://hdl.handle.net/1853/61210.

MLA Handbook (7th Edition):

Youmans, Michael Thomas. “Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks.” 2019. Web. 16 Jul 2019.

Vancouver:

Youmans MT. Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks. [Internet] [Doctoral dissertation]. Georgia Tech; 2019. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1853/61210.

Council of Science Editors:

Youmans MT. Identifying and Generating Candidate Antibacterial Peptides with Long Short-Term Memory Recurrent Neural Networks. [Doctoral Dissertation]. Georgia Tech; 2019. Available from: http://hdl.handle.net/1853/61210


Victoria University of Wellington

19. Chandra, Rohitash. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.

Degree: 2012, Victoria University of Wellington

 One way to train neural networks is to use evolutionary algorithms such as cooperative coevolution - a method that decomposes the network's learnable parameters into… (more)

Subjects/Keywords: Neural networks; Cooperative coevolution; Recurrent network; Co-operative co-evolution

APA (6th Edition):

Chandra, R. (2012). Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. (Doctoral Dissertation). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/2110

Chicago Manual of Style (16th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Doctoral Dissertation, Victoria University of Wellington. Accessed July 16, 2019. http://hdl.handle.net/10063/2110.

MLA Handbook (7th Edition):

Chandra, Rohitash. “Problem Decomposition and Adaptation in Cooperative Neuro-Evolution.” 2012. Web. 16 Jul 2019.

Vancouver:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Internet] [Doctoral dissertation]. Victoria University of Wellington; 2012. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10063/2110.

Council of Science Editors:

Chandra R. Problem Decomposition and Adaptation in Cooperative Neuro-Evolution. [Doctoral Dissertation]. Victoria University of Wellington; 2012. Available from: http://hdl.handle.net/10063/2110


Rice University

20. Michalenko, Joshua James. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.

Degree: MS, Electrical & Computer Eng., 2019, Rice University

 We investigate the internal representations that a recurrent neural network (RNN) uses while learning to recognize a regular formal language. Specifically, we train an RNN… (more)

Subjects/Keywords: Language recognition; Recurrent Neural Networks; Representation Learning; deterministic finite automaton; automaton

APA (6th Edition):

Michalenko, J. J. (2019). Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/105422

Chicago Manual of Style (16th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Masters Thesis, Rice University. Accessed July 16, 2019. http://hdl.handle.net/1911/105422.

MLA Handbook (7th Edition):

Michalenko, Joshua James. “Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks.” 2019. Web. 16 Jul 2019.

Vancouver:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Internet] [Masters thesis]. Rice University; 2019. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/1911/105422.

Council of Science Editors:

Michalenko JJ. Representing Formal Languages: A Comparison Between Finite Automata and Recurrent Neural Networks. [Masters Thesis]. Rice University; 2019. Available from: http://hdl.handle.net/1911/105422


University of Plymouth

21. Carmantini, Giovanni Sirio. Dynamical systems theory for transparent symbolic computation in neuronal networks.

Degree: PhD, 2017, University of Plymouth

 In this thesis, we explore the interface between symbolic and dynamical system computation, with particular regard to dynamical system models of neuronal networks. In doing… (more)

Subjects/Keywords: 006.3; Automata Theory; Recurrent Neural Networks; Representation Theory; Neural Symbolic Computation; Dynamical Systems; Symbolic Dynamics

APA (6th Edition):

Carmantini, G. S. (2017). Dynamical systems theory for transparent symbolic computation in neuronal networks. (Doctoral Dissertation). University of Plymouth. Retrieved from http://hdl.handle.net/10026.1/8647

Chicago Manual of Style (16th Edition):

Carmantini, Giovanni Sirio. “Dynamical systems theory for transparent symbolic computation in neuronal networks.” 2017. Doctoral Dissertation, University of Plymouth. Accessed July 16, 2019. http://hdl.handle.net/10026.1/8647.

MLA Handbook (7th Edition):

Carmantini, Giovanni Sirio. “Dynamical systems theory for transparent symbolic computation in neuronal networks.” 2017. Web. 16 Jul 2019.

Vancouver:

Carmantini GS. Dynamical systems theory for transparent symbolic computation in neuronal networks. [Internet] [Doctoral dissertation]. University of Plymouth; 2017. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10026.1/8647.

Council of Science Editors:

Carmantini GS. Dynamical systems theory for transparent symbolic computation in neuronal networks. [Doctoral Dissertation]. University of Plymouth; 2017. Available from: http://hdl.handle.net/10026.1/8647

22. Tirumaladasu, Sai Subhakar. Autonomous Driving: Traffic Sign Classification.

Degree: 2019, Department of Applied Signal Processing

 Autonomous Driving and Advanced Driver Assistance Systems (ADAS) are revolutionizing the way we drive and the future of mobility. Among ADAS, Traffic Sign Classification… (more)

Subjects/Keywords: Autonomous Driving; Deep Learning; Image Processing; Convolutional Neural Networks; Recurrent Neural Networks; Generative Adversarial Networks; Engineering and Technology; Teknik och teknologier

APA (6th Edition):

Tirumaladasu, S. S. (2019). Autonomous Driving: Traffic Sign Classification. (Thesis). Department of Applied Signal Processing. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tirumaladasu, Sai Subhakar. “Autonomous Driving: Traffic Sign Classification.” 2019. Thesis, Department of Applied Signal Processing. Accessed July 16, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tirumaladasu, Sai Subhakar. “Autonomous Driving: Traffic Sign Classification.” 2019. Web. 16 Jul 2019.

Vancouver:

Tirumaladasu SS. Autonomous Driving: Traffic Sign Classification. [Internet] [Thesis]. Department of Applied Signal Processing; 2019. [cited 2019 Jul 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tirumaladasu SS. Autonomous Driving: Traffic Sign Classification. [Thesis]. Department of Applied Signal Processing; 2019. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17783

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

23. Rajesh, M V; Dr.Gopikakumari, R. Development and Evaluation of Blind Identification Techniques for Nonlinear Systems.

Degree: 2010, Cochin University of Science and Technology

Identification and control of non-linear dynamical systems are challenging problems for control engineers. The topic is equally relevant in communication, weather prediction, biomedical systems and… (more)

Subjects/Keywords: Nonlinear system modeling; Blind Identification; Neural networks; Maximum Likelihood Estimation; Particle Filter; State space modeling; Recurrent neural networks; Intelligent Signal Processing

APA (6th Edition):

Rajesh, M V; Dr.Gopikakumari, R. (2010). Development and Evaluation of Blind Identification Techniques for Nonlinear Systems. (Thesis). Cochin University of Science and Technology. Retrieved from http://dyuthi.cusat.ac.in/purl/2943

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Rajesh, M V; Dr.Gopikakumari, R. “Development and Evaluation of Blind Identification Techniques for Nonlinear Systems.” 2010. Thesis, Cochin University of Science and Technology. Accessed July 16, 2019. http://dyuthi.cusat.ac.in/purl/2943.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Rajesh, M V; Dr.Gopikakumari, R. “Development and Evaluation of Blind Identification Techniques for Nonlinear Systems.” 2010. Web. 16 Jul 2019.

Vancouver:

Rajesh, M V; Dr.Gopikakumari R. Development and Evaluation of Blind Identification Techniques for Nonlinear Systems. [Internet] [Thesis]. Cochin University of Science and Technology; 2010. [cited 2019 Jul 16]. Available from: http://dyuthi.cusat.ac.in/purl/2943.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Rajesh, M V; Dr.Gopikakumari R. Development and Evaluation of Blind Identification Techniques for Nonlinear Systems. [Thesis]. Cochin University of Science and Technology; 2010. Available from: http://dyuthi.cusat.ac.in/purl/2943

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Lincoln University

24. Matroushi, Saeed M. M. S. Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price.

Degree: 2011, Lincoln University

 In this research, two hybrid systems are proposed whose components are the Autoregressive Integrated Moving Average (ARIMA) model, and two types of Artificial Neural Networks… (more)

Subjects/Keywords: multilayer perceptron; Genetic Algorithm; time series; Elman Recurrent Neural Networks (ERNN); autoregressive; moving average; neural networks; hybrid systems; integrated

APA (6th Edition):

Matroushi, S. M. M. S. (2011). Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price. (Thesis). Lincoln University. Retrieved from http://hdl.handle.net/10182/3986

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Matroushi, Saeed M M S. “Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price.” 2011. Thesis, Lincoln University. Accessed July 16, 2019. http://hdl.handle.net/10182/3986.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Matroushi, Saeed M M S. “Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price.” 2011. Web. 16 Jul 2019.

Vancouver:

Matroushi SMMS. Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price. [Internet] [Thesis]. Lincoln University; 2011. [cited 2019 Jul 16]. Available from: http://hdl.handle.net/10182/3986.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Matroushi SMMS. Hybrid computational intelligence systems based on statistical and neural networks methods for time series forecasting: the case of gold price. [Thesis]. Lincoln University; 2011. Available from: http://hdl.handle.net/10182/3986

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of California – San Diego

25. Ghewari, Rishikesh Sanjay. Action Recognition from Videos using Deep Neural Networks.

Degree: Computer Science, 2017, University of California – San Diego

 Convolutional neural network (CNN) models have been extensively used in recent years to solve the problem of image understanding, giving state-of-the-art results in tasks like classification,… (more)

Subjects/Keywords: Artificial intelligence; Computer science; Action Recognition; Convolutional neural networks; Deep Learning; LSTM; recurrent neural networks; Video Classification

APA (6th Edition):

Ghewari, R. S. (2017). Action Recognition from Videos using Deep Neural Networks. (Thesis). University of California – San Diego. Retrieved from http://www.escholarship.org/uc/item/2mr798mn

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ghewari, Rishikesh Sanjay. “Action Recognition from Videos using Deep Neural Networks.” 2017. Thesis, University of California – San Diego. Accessed July 16, 2019. http://www.escholarship.org/uc/item/2mr798mn.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ghewari, Rishikesh Sanjay. “Action Recognition from Videos using Deep Neural Networks.” 2017. Web. 16 Jul 2019.

Vancouver:

Ghewari RS. Action Recognition from Videos using Deep Neural Networks. [Internet] [Thesis]. University of California – San Diego; 2017. [cited 2019 Jul 16]. Available from: http://www.escholarship.org/uc/item/2mr798mn.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ghewari RS. Action Recognition from Videos using Deep Neural Networks. [Thesis]. University of California – San Diego; 2017. Available from: http://www.escholarship.org/uc/item/2mr798mn

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

26. Svebrant, Henrik. Latent variable neural click models for web search.

Degree: Electrical Engineering and Computer Science (EECS), 2018, KTH

User click modeling in web search is most commonly done through probabilistic graphical models. Due to the successful use of machine learning techniques in… (more)

Subjects/Keywords: web search; click modeling; machine learning; recurrent neural networks; artificial neural networks; Computer Sciences; Datavetenskap (datalogi)

APA (6th Edition):

Svebrant, H. (2018). Latent variable neural click models for web search. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232311

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Svebrant, Henrik. “Latent variable neural click models for web search.” 2018. Thesis, KTH. Accessed July 16, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232311.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Svebrant, Henrik. “Latent variable neural click models for web search.” 2018. Web. 16 Jul 2019.

Vancouver:

Svebrant H. Latent variable neural click models for web search. [Internet] [Thesis]. KTH; 2018. [cited 2019 Jul 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232311.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Svebrant H. Latent variable neural click models for web search. [Thesis]. KTH; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232311

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

27. Rosell, Felicia. Tracking a ball during bounce and roll using recurrent neural networks.

Degree: Electrical Engineering and Computer Science (EECS), 2018, KTH

 In many types of sports, on-screen graphics such as a reconstructed ball trajectory can be displayed for spectators or players in order to increase… (more)

Subjects/Keywords: machine learning; ML; recurrent neural networks; RNN; deep learning; tracking; golf; bounce; synthetic data; maskininlärning; ML; recurrent neural networks; RNN; djupinlärning; följning; golf; studs; syntetiskt data; Computer Sciences; Datavetenskap (datalogi)

APA (6th Edition):

Rosell, F. (2018). Tracking a ball during bounce and roll using recurrent neural networks. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239733

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Rosell, Felicia. “Tracking a ball during bounce and roll using recurrent neural networks.” 2018. Thesis, KTH. Accessed July 16, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239733.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Rosell, Felicia. “Tracking a ball during bounce and roll using recurrent neural networks.” 2018. Web. 16 Jul 2019.

Vancouver:

Rosell F. Tracking a ball during bounce and roll using recurrent neural networks. [Internet] [Thesis]. KTH; 2018. [cited 2019 Jul 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239733.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Rosell F. Tracking a ball during bounce and roll using recurrent neural networks. [Thesis]. KTH; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-239733

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Univerzitet u Beogradu

28. Jovanović, Marija V., 1980-. Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom.

Degree: Medicinski fakultet, 2017, Univerzitet u Beogradu

Medicinske nauke - Radiologija / Medical science - Radiology

Although the postoperative use of radiotherapy and chemotherapy has prolonged the survival of patients with glioblastoma multiforme (GBM), it… (more)

Subjects/Keywords: glioblastoma; recurrent tumor; treatment effects; magnetic resonance imaging; diffusion; susceptibility; spectroscopy; perfusion; artificial neural networks


APA · Chicago · MLA · Vancouver · CSE

APA (6th Edition):

Jovanović, M. V. (2017). Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom. (Thesis). Univerzitet u Beogradu. Retrieved from https://fedorabg.bg.ac.rs/fedora/get/o:16674/bdef:Content/get

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Jovanović, Marija V., 1980-. “Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom.” 2017. Thesis, Univerzitet u Beogradu. Accessed July 16, 2019. https://fedorabg.bg.ac.rs/fedora/get/o:16674/bdef:Content/get.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Jovanović, Marija V., 1980-. “Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom.” 2017. Web. 16 Jul 2019.

Vancouver:

Jovanović MV. Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom. [Internet] [Thesis]. Univerzitet u Beogradu; 2017. [cited 2019 Jul 16]. Available from: https://fedorabg.bg.ac.rs/fedora/get/o:16674/bdef:Content/get.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Jovanović MV. Primena različitih perfuzionih tehnika magnetne rezonance mozga u cilju diferenciranja postterapijskih sekvela i tumorskih promena kod osoba sa glioblastomom. [Thesis]. Univerzitet u Beogradu; 2017. Available from: https://fedorabg.bg.ac.rs/fedora/get/o:16674/bdef:Content/get

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


KTH

29. Hellstenius, Sasha. Model comparison of patient volume prediction in digital health care.

Degree: Electrical Engineering and Computer Science (EECS), 2018, KTH

Accurate prediction of patient volume is an essential tool for improving resource allocation and doctor utilization in the traditional, as well as the digital… (more)
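
As a hedged illustration of the kind of recurrent forecaster this record's subjects point to, the sketch below fits a single-layer LSTM to sliding windows of a daily count series in PyTorch. The window length, layer sizes, and placeholder series are assumptions and do not come from the thesis; a real comparison would fit baseline models on the same splits and compare held-out error.

# Illustrative sketch only: forecast the next day's patient volume from the
# previous `window` days of counts with an LSTM.
import torch
import torch.nn as nn

class VolumeForecaster(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, counts):                   # counts: (batch, window, 1)
        out, _ = self.lstm(counts)
        return self.head(out[:, -1]).squeeze(-1)

def make_windows(series, window=14):
    """Turn a 1-D daily series into (window, next-day) training pairs."""
    xs, ys = [], []
    for i in range(len(series) - window):
        xs.append(series[i:i + window])
        ys.append(series[i + window])
    x = torch.tensor(xs, dtype=torch.float32).unsqueeze(-1)
    y = torch.tensor(ys, dtype=torch.float32)
    return x, y

# Placeholder data with a rough weekly pattern, standing in for real visit counts.
series = [float(100 + (i % 7) * 5) for i in range(200)]
x, y = make_windows(series)
model = VolumeForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()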

Subjects/Keywords: Recurrent Neural Networks; LSTM; Patient Volume Prediction; Digital Healthcare; Computer Sciences; Datavetenskap (datalogi)


APA · Chicago · MLA · Vancouver · CSE

APA (6th Edition):

Hellstenius, S. (2018). Model comparison of patient volume prediction in digital health care. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229908

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hellstenius, Sasha. “Model comparison of patient volume prediction in digital health care.” 2018. Thesis, KTH. Accessed July 16, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229908.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hellstenius, Sasha. “Model comparison of patient volume prediction in digital health care.” 2018. Web. 16 Jul 2019.

Vancouver:

Hellstenius S. Model comparison of patient volume prediction in digital health care. [Internet] [Thesis]. KTH; 2018. [cited 2019 Jul 16]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229908.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hellstenius S. Model comparison of patient volume prediction in digital health care. [Thesis]. KTH; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229908

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Washington University in St. Louis

30. Jolley, Jennifer Marie. Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors.

Degree: PhD, Social Work, 2012, Washington University in St. Louis

  Risk assessment in child welfare has a long tradition of being based on models that assume the likelihood of recurrent maltreatment is a linear… (more)
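
As a purely hypothetical sketch of a model that combines static and dynamic risk factors for binary recurrence prediction, the PyTorch snippet below trains a small feed-forward network on concatenated features. The feature counts, architecture, and placeholder data are invented for illustration and are not taken from the dissertation, which does not necessarily use this framework or design.

# Hypothetical sketch only: binary risk prediction from concatenated
# static (e.g. case-history) and dynamic (e.g. current-assessment) features.
import torch
import torch.nn as nn

n_static, n_dynamic = 8, 12            # assumed feature counts, not from the dissertation

model = nn.Sequential(
    nn.Linear(n_static + n_dynamic, 32),
    nn.ReLU(),
    nn.Linear(32, 1),                   # logit for the probability of recurrence
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder data standing in for case records.
x = torch.randn(256, n_static + n_dynamic)
y = torch.randint(0, 2, (256, 1)).float()

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()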

Subjects/Keywords: differential response; neural networks; nonlinear modeling; recurrent child maltreatment; risk assessment; risk-need-responsivity


APA · Chicago · MLA · Vancouver · CSE

APA (6th Edition):

Jolley, J. M. (2012). Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors. (Doctoral Dissertation). Washington University in St. Louis. Retrieved from https://openscholarship.wustl.edu/etd/1009

Chicago Manual of Style (16th Edition):

Jolley, Jennifer Marie. “Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors.” 2012. Doctoral Dissertation, Washington University in St. Louis. Accessed July 16, 2019. https://openscholarship.wustl.edu/etd/1009.

MLA Handbook (7th Edition):

Jolley, Jennifer Marie. “Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors.” 2012. Web. 16 Jul 2019.

Vancouver:

Jolley JM. Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors. [Internet] [Doctoral dissertation]. Washington University in St. Louis; 2012. [cited 2019 Jul 16]. Available from: https://openscholarship.wustl.edu/etd/1009.

Council of Science Editors:

Jolley JM. Applying Neural Network Models to Predict Recurrent Maltreatment in Child Welfare Cases with Static and Dynamic Risk Factors. [Doctoral Dissertation]. Washington University in St. Louis; 2012. Available from: https://openscholarship.wustl.edu/etd/1009

[1] [2] [3] [4] [5] [6]
