
You searched for subject:(Spiking Neural Networks). Showing records 1 – 30 of 94 total matches.



Texas A&M University

1. Mahadevuni, Amarnath. Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks.

Degree: MS, Computer Engineering, 2018, Texas A&M University

 The autonomous navigation of mobile robots is of great interest in mobile robotics. Algorithms such as simultaneous localization and mapping (SLAM) and artificial potential field… (more)

Subjects/Keywords: Autonomous navigation; Spiking Neural Networks


APA (6th Edition):

Mahadevuni, A. (2018). Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/173294

Chicago Manual of Style (16th Edition):

Mahadevuni, Amarnath. “Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks.” 2018. Masters Thesis, Texas A&M University. Accessed October 19, 2020. http://hdl.handle.net/1969.1/173294.

MLA Handbook (7th Edition):

Mahadevuni, Amarnath. “Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks.” 2018. Web. 19 Oct 2020.

Vancouver:

Mahadevuni A. Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks. [Internet] [Masters thesis]. Texas A&M University; 2018. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/1969.1/173294.

Council of Science Editors:

Mahadevuni A. Autonomous Navigation Using Reinforcement Learning with Spiking Neural Networks. [Masters Thesis]. Texas A&M University; 2018. Available from: http://hdl.handle.net/1969.1/173294


University of Windsor

2. Talaei, Amir Javid. Pattern Recognition Using Spiking Neural Networks.

Degree: MA, Electrical and Computer Engineering, 2020, University of Windsor

 Deep learning is believed to be a promising approach for solving specific problems in the field of artificial intelligence whenever a large amount of data and… (more)

Subjects/Keywords: Pattern Recognition; Spiking Neural Networks


APA (6th Edition):

Talaei, A. J. (2020). Pattern Recognition Using Spiking Neural Networks. (Masters Thesis). University of Windsor. Retrieved from https://scholar.uwindsor.ca/etd/8402

Chicago Manual of Style (16th Edition):

Talaei, Amir Javid. “Pattern Recognition Using Spiking Neural Networks.” 2020. Masters Thesis, University of Windsor. Accessed October 19, 2020. https://scholar.uwindsor.ca/etd/8402.

MLA Handbook (7th Edition):

Talaei, Amir Javid. “Pattern Recognition Using Spiking Neural Networks.” 2020. Web. 19 Oct 2020.

Vancouver:

Talaei AJ. Pattern Recognition Using Spiking Neural Networks. [Internet] [Masters thesis]. University of Windsor; 2020. [cited 2020 Oct 19]. Available from: https://scholar.uwindsor.ca/etd/8402.

Council of Science Editors:

Talaei AJ. Pattern Recognition Using Spiking Neural Networks. [Masters Thesis]. University of Windsor; 2020. Available from: https://scholar.uwindsor.ca/etd/8402


University of Waterloo

3. Bekolay, Trevor. Learning in large-scale spiking neural networks.

Degree: 2011, University of Waterloo

 Learning is central to the exploration of intelligence. Psychology and machine learning provide high-level explanations of how rational agents learn. Neuroscience provides low-level descriptions of… (more)

Subjects/Keywords: neuroplasticity; learning; neural networks; spiking neural networks


APA (6th Edition):

Bekolay, T. (2011). Learning in large-scale spiking neural networks. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6195

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bekolay, Trevor. “Learning in large-scale spiking neural networks.” 2011. Thesis, University of Waterloo. Accessed October 19, 2020. http://hdl.handle.net/10012/6195.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bekolay, Trevor. “Learning in large-scale spiking neural networks.” 2011. Web. 19 Oct 2020.

Vancouver:

Bekolay T. Learning in large-scale spiking neural networks. [Internet] [Thesis]. University of Waterloo; 2011. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10012/6195.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bekolay T. Learning in large-scale spiking neural networks. [Thesis]. University of Waterloo; 2011. Available from: http://hdl.handle.net/10012/6195

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

4. Bekolay, Trevor. Biologically inspired methods in speech recognition and synthesis: closing the loop.

Degree: 2016, University of Waterloo

 Current state-of-the-art approaches to computational speech recognition and synthesis are based on statistical analyses of extremely large data sets. It is currently unknown how these… (more)

Subjects/Keywords: speech; spiking neural networks; computational neuroscience


APA (6th Edition):

Bekolay, T. (2016). Biologically inspired methods in speech recognition and synthesis: closing the loop. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/10269

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bekolay, Trevor. “Biologically inspired methods in speech recognition and synthesis: closing the loop.” 2016. Thesis, University of Waterloo. Accessed October 19, 2020. http://hdl.handle.net/10012/10269.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bekolay, Trevor. “Biologically inspired methods in speech recognition and synthesis: closing the loop.” 2016. Web. 19 Oct 2020.

Vancouver:

Bekolay T. Biologically inspired methods in speech recognition and synthesis: closing the loop. [Internet] [Thesis]. University of Waterloo; 2016. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10012/10269.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bekolay T. Biologically inspired methods in speech recognition and synthesis: closing the loop. [Thesis]. University of Waterloo; 2016. Available from: http://hdl.handle.net/10012/10269

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of St Andrews

5. Mansouri Benssassi, Esma. Bio-inspired multisensory integration of social signals.

Degree: PhD, 2020, University of St Andrews

 Understanding emotions represents a core aspect of human communication. Our social behaviours are closely linked to expressing our emotions and understanding others' emotional and mental… (more)

Subjects/Keywords: Multisensory integration; Spiking neural networks; Emotions recognition


APA (6th Edition):

Mansouri Benssassi, E. (2020). Bio-inspired multisensory integration of social signals. (Doctoral Dissertation). University of St Andrews. Retrieved from http://hdl.handle.net/10023/20182

Chicago Manual of Style (16th Edition):

Mansouri Benssassi, Esma. “Bio-inspired multisensory integration of social signals.” 2020. Doctoral Dissertation, University of St Andrews. Accessed October 19, 2020. http://hdl.handle.net/10023/20182.

MLA Handbook (7th Edition):

Mansouri Benssassi, Esma. “Bio-inspired multisensory integration of social signals.” 2020. Web. 19 Oct 2020.

Vancouver:

Mansouri Benssassi E. Bio-inspired multisensory integration of social signals. [Internet] [Doctoral dissertation]. University of St Andrews; 2020. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10023/20182.

Council of Science Editors:

Mansouri Benssassi E. Bio-inspired multisensory integration of social signals. [Doctoral Dissertation]. University of St Andrews; 2020. Available from: http://hdl.handle.net/10023/20182


University of Waterloo

6. Hunsberger, Eric. Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition.

Degree: 2018, University of Waterloo

 Modern machine learning models are beginning to rival human performance on some realistic object recognition tasks, but we still lack a full understanding of how… (more)

Subjects/Keywords: learning; spiking neural networks; deep neural networks; object recognition


APA (6th Edition):

Hunsberger, E. (2018). Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12819

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hunsberger, Eric. “Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition.” 2018. Thesis, University of Waterloo. Accessed October 19, 2020. http://hdl.handle.net/10012/12819.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hunsberger, Eric. “Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition.” 2018. Web. 19 Oct 2020.

Vancouver:

Hunsberger E. Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition. [Internet] [Thesis]. University of Waterloo; 2018. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10012/12819.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hunsberger E. Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition. [Thesis]. University of Waterloo; 2018. Available from: http://hdl.handle.net/10012/12819

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Universitat de Valencia

7. Iakymchuk, Taras. Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware.

Degree: 2017, Universitat de Valencia

 This thesis focuses on the so-called third generation of artificial neural networks, Spiking Neural Networks (SNNs), also called 'spike' networks or… (more)

Subjects/Keywords: fpga; snn; neural networks; spiking neural networks; machine learning


APA (6th Edition):

Iakymchuk, T. (2017). Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware. (Doctoral Dissertation). Universitat de Valencia. Retrieved from http://hdl.handle.net/10550/60934

Chicago Manual of Style (16th Edition):

Iakymchuk, Taras. “Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware.” 2017. Doctoral Dissertation, Universitat de Valencia. Accessed October 19, 2020. http://hdl.handle.net/10550/60934.

MLA Handbook (7th Edition):

Iakymchuk, Taras. “Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware.” 2017. Web. 19 Oct 2020.

Vancouver:

Iakymchuk T. Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware. [Internet] [Doctoral dissertation]. Universitat de Valencia; 2017. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10550/60934.

Council of Science Editors:

Iakymchuk T. Spiking Neural Networks models targeted for implementation on Reconfigurable Hardware. [Doctoral Dissertation]. Universitat de Valencia; 2017. Available from: http://hdl.handle.net/10550/60934


Texas A&M University

8. Singh, Nityendra. Training Algorithms for Networks of Spiking Neurons.

Degree: MS, Electrical Engineering, 2014, Texas A&M University

Neural networks represent a type of computing that is based on the way that the brain performs computations. Neural networks are good at fitting non-linear… (more)

Subjects/Keywords: Spiking Neural Network; SpikeProp; Neural Networks; Discrete Weights


APA (6th Edition):

Singh, N. (2014). Training Algorithms for Networks of Spiking Neurons. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/154131

Chicago Manual of Style (16th Edition):

Singh, Nityendra. “Training Algorithms for Networks of Spiking Neurons.” 2014. Masters Thesis, Texas A&M University. Accessed October 19, 2020. http://hdl.handle.net/1969.1/154131.

MLA Handbook (7th Edition):

Singh, Nityendra. “Training Algorithms for Networks of Spiking Neurons.” 2014. Web. 19 Oct 2020.

Vancouver:

Singh N. Training Algorithms for Networks of Spiking Neurons. [Internet] [Masters thesis]. Texas A&M University; 2014. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/1969.1/154131.

Council of Science Editors:

Singh N. Training Algorithms for Networks of Spiking Neurons. [Masters Thesis]. Texas A&M University; 2014. Available from: http://hdl.handle.net/1969.1/154131


University of Manchester

9. Mundy, Andrew. Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture.

Degree: PhD, 2017, University of Manchester

 Model building is a fundamental scientific tool. Increasingly there is interest in building neurally-implemented models of cognitive processes with the intention of modelling brains. However,… (more)

Subjects/Keywords: 006.3; SpiNNaker; Neural Engineering Framework; Spiking neural networks; Logic minimization


APA (6th Edition):

Mundy, A. (2017). Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture. (Doctoral Dissertation). University of Manchester. Retrieved from https://www.research.manchester.ac.uk/portal/en/theses/real-time-spaun-on-spinnaker – functional-brain-simulation-on-a-massivelyparallel-computer-architecture(fcf5388c-4893-4b10-a6b4-577ffee2d562).html ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.728151

Chicago Manual of Style (16th Edition):

Mundy, Andrew. “Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture.” 2017. Doctoral Dissertation, University of Manchester. Accessed October 19, 2020. https://www.research.manchester.ac.uk/portal/en/theses/real-time-spaun-on-spinnaker – functional-brain-simulation-on-a-massivelyparallel-computer-architecture(fcf5388c-4893-4b10-a6b4-577ffee2d562).html ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.728151.

MLA Handbook (7th Edition):

Mundy, Andrew. “Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture.” 2017. Web. 19 Oct 2020.

Vancouver:

Mundy A. Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture. [Internet] [Doctoral dissertation]. University of Manchester; 2017. [cited 2020 Oct 19]. Available from: https://www.research.manchester.ac.uk/portal/en/theses/real-time-spaun-on-spinnaker – functional-brain-simulation-on-a-massivelyparallel-computer-architecture(fcf5388c-4893-4b10-a6b4-577ffee2d562).html ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.728151.

Council of Science Editors:

Mundy A. Real time Spaun on SpiNNaker : functional brain simulation on a massively-parallel computer architecture. [Doctoral Dissertation]. University of Manchester; 2017. Available from: https://www.research.manchester.ac.uk/portal/en/theses/real-time-spaun-on-spinnaker – functional-brain-simulation-on-a-massivelyparallel-computer-architecture(fcf5388c-4893-4b10-a6b4-577ffee2d562).html ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.728151

10. Stromatias, Evangelos. SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS.

Degree: 2016, University of Manchester

Artificial Neural Networks (ANNs) are increasingly and routinely gaining popularity today, as they are being used in several diverse research fields and many different… (more)

Subjects/Keywords: SpiNNaker; Neuromorphic; Spiking; low-power; low-latency; scalable; robustness; limited weight precision; spiking neural networks


APA (6th Edition):

Stromatias, E. (2016). SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS. (Doctoral Dissertation). University of Manchester. Retrieved from http://www.manchester.ac.uk/escholar/uk-ac-man-scw:300115

Chicago Manual of Style (16th Edition):

Stromatias, Evangelos. “SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS.” 2016. Doctoral Dissertation, University of Manchester. Accessed October 19, 2020. http://www.manchester.ac.uk/escholar/uk-ac-man-scw:300115.

MLA Handbook (7th Edition):

Stromatias, Evangelos. “SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS.” 2016. Web. 19 Oct 2020.

Vancouver:

Stromatias E. SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS. [Internet] [Doctoral dissertation]. University of Manchester; 2016. [cited 2020 Oct 19]. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:300115.

Council of Science Editors:

Stromatias E. SCALABILITY AND ROBUSTNESS OF ARTIFICIAL NEURAL NETWORKS. [Doctoral Dissertation]. University of Manchester; 2016. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:300115


University of Plymouth

11. de Azambuja, Ricardo. Action learning experiments using spiking neural networks and humanoid robots.

Degree: PhD, 2018, University of Plymouth

 The way our brain works is still an open question, but one thing seems to be clear: biological neural systems are computationally powerful, robust and… (more)

Subjects/Keywords: 006.3; Neural Networks; Liquid State Machines; Robotics; Action Learning; Spiking Neural Networks; LSM; Humanoid Robots


APA (6th Edition):

de Azambuja, R. (2018). Action learning experiments using spiking neural networks and humanoid robots. (Doctoral Dissertation). University of Plymouth. Retrieved from http://hdl.handle.net/10026.1/10767

Chicago Manual of Style (16th Edition):

de Azambuja, Ricardo. “Action learning experiments using spiking neural networks and humanoid robots.” 2018. Doctoral Dissertation, University of Plymouth. Accessed October 19, 2020. http://hdl.handle.net/10026.1/10767.

MLA Handbook (7th Edition):

de Azambuja, Ricardo. “Action learning experiments using spiking neural networks and humanoid robots.” 2018. Web. 19 Oct 2020.

Vancouver:

de Azambuja R. Action learning experiments using spiking neural networks and humanoid robots. [Internet] [Doctoral dissertation]. University of Plymouth; 2018. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/10026.1/10767.

Council of Science Editors:

de Azambuja R. Action learning experiments using spiking neural networks and humanoid robots. [Doctoral Dissertation]. University of Plymouth; 2018. Available from: http://hdl.handle.net/10026.1/10767


Arizona State University

12. Kolala Venkataramanaiah, Shreyas. Energy Efficient Hardware Design of Neural Networks.

Degree: Electrical Engineering, 2018, Arizona State University

 Hardware implementation of deep neural networks is earning significant importance nowadays. Deep neural networks are mathematical models that use learning algorithms inspired by the brain.… (more)

Subjects/Keywords: Engineering; Computer engineering; Accelerators; ASIC; Energy efficient; Hardware design; Neural Networks; Spiking neural networks


APA (6th Edition):

Kolala Venkataramanaiah, S. (2018). Energy Efficient Hardware Design of Neural Networks. (Masters Thesis). Arizona State University. Retrieved from http://repository.asu.edu/items/51597

Chicago Manual of Style (16th Edition):

Kolala Venkataramanaiah, Shreyas. “Energy Efficient Hardware Design of Neural Networks.” 2018. Masters Thesis, Arizona State University. Accessed October 19, 2020. http://repository.asu.edu/items/51597.

MLA Handbook (7th Edition):

Kolala Venkataramanaiah, Shreyas. “Energy Efficient Hardware Design of Neural Networks.” 2018. Web. 19 Oct 2020.

Vancouver:

Kolala Venkataramanaiah S. Energy Efficient Hardware Design of Neural Networks. [Internet] [Masters thesis]. Arizona State University; 2018. [cited 2020 Oct 19]. Available from: http://repository.asu.edu/items/51597.

Council of Science Editors:

Kolala Venkataramanaiah S. Energy Efficient Hardware Design of Neural Networks. [Masters Thesis]. Arizona State University; 2018. Available from: http://repository.asu.edu/items/51597


Texas A&M University

13. Li, Youjie. Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition.

Degree: MS, Computer Engineering, 2016, Texas A&M University

 There is a growing concern over reliability, power consumption, and performance of traditional Von Neumann machines, especially when dealing with complex tasks like pattern recognition.… (more)

Subjects/Keywords: Neuromorphic VLSI; spiking neural networks; approximate computing; pattern recognition


APA (6th Edition):

Li, Y. (2016). Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/156946

Chicago Manual of Style (16th Edition):

Li, Youjie. “Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition.” 2016. Masters Thesis, Texas A&M University. Accessed October 19, 2020. http://hdl.handle.net/1969.1/156946.

MLA Handbook (7th Edition):

Li, Youjie. “Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition.” 2016. Web. 19 Oct 2020.

Vancouver:

Li Y. Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition. [Internet] [Masters thesis]. Texas A&M University; 2016. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/1969.1/156946.

Council of Science Editors:

Li Y. Energy Efficient Spiking Neuromorphic Architectures for Pattern Recognition. [Masters Thesis]. Texas A&M University; 2016. Available from: http://hdl.handle.net/1969.1/156946


Texas A&M University

14. Thulasiraman, Kumaran. Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards.

Degree: MS, Computer Engineering, 2015, Texas A&M University

 This thesis presents a new reinforcement learning mechanism suitable to be employed in artificial spiking neural networks of leaky integrate-and-fire (LIF) or Izhikevich neurons. The… (more)

Subjects/Keywords: reinforcement learning; spiking neural networks; dopamine-modulated; STDP


APA (6th Edition):

Thulasiraman, K. (2015). Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/155519

Chicago Manual of Style (16th Edition):

Thulasiraman, Kumaran. “Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards.” 2015. Masters Thesis, Texas A&M University. Accessed October 19, 2020. http://hdl.handle.net/1969.1/155519.

MLA Handbook (7th Edition):

Thulasiraman, Kumaran. “Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards.” 2015. Web. 19 Oct 2020.

Vancouver:

Thulasiraman K. Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards. [Internet] [Masters thesis]. Texas A&M University; 2015. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/1969.1/155519.

Council of Science Editors:

Thulasiraman K. Enhanced Reinforcement Learning with Attentional Feedback and Temporally Attenuated Distal Rewards. [Masters Thesis]. Texas A&M University; 2015. Available from: http://hdl.handle.net/1969.1/155519


University of Newcastle

15. Wiklendt, Lukasz. Spiking neural networks for robot locomotion control.

Degree: PhD, 2014, University of Newcastle

Research Doctorate - Doctor of Philosophy (PhD)

Spiking neural networks (SNNs) are computational models of biological neurons and the synapses that connect them. They are… (more)

Subjects/Keywords: spiking neural networks; robot locomotion; evolutionary computation; acrobot; biped


APA (6th Edition):

Wiklendt, L. (2014). Spiking neural networks for robot locomotion control. (Doctoral Dissertation). University of Newcastle. Retrieved from http://hdl.handle.net/1959.13/1049196

Chicago Manual of Style (16th Edition):

Wiklendt, Lukasz. “Spiking neural networks for robot locomotion control.” 2014. Doctoral Dissertation, University of Newcastle. Accessed October 19, 2020. http://hdl.handle.net/1959.13/1049196.

MLA Handbook (7th Edition):

Wiklendt, Lukasz. “Spiking neural networks for robot locomotion control.” 2014. Web. 19 Oct 2020.

Vancouver:

Wiklendt L. Spiking neural networks for robot locomotion control. [Internet] [Doctoral dissertation]. University of Newcastle; 2014. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/1959.13/1049196.

Council of Science Editors:

Wiklendt L. Spiking neural networks for robot locomotion control. [Doctoral Dissertation]. University of Newcastle; 2014. Available from: http://hdl.handle.net/1959.13/1049196


Virginia Commonwealth University

16. Donachy, Shaun. Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications.

Degree: MS, Computer Science, 2015, Virginia Commonwealth University

Networks of spiking neurons can be used not only for brain modeling but also to solve graph problems. With the use of a computationally… (more)

Subjects/Keywords: Spiking Neural Networks; Plasticity; Shortest Path; Graph Clustering; Theory and Algorithms


APA (6th Edition):

Donachy, S. (2015). Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications. (Thesis). Virginia Commonwealth University. Retrieved from https://doi.org/10.25772/D6AZ-JB33 ; https://scholarscompass.vcu.edu/etd/3984

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Donachy, Shaun. “Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications.” 2015. Thesis, Virginia Commonwealth University. Accessed October 19, 2020. https://doi.org/10.25772/D6AZ-JB33 ; https://scholarscompass.vcu.edu/etd/3984.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Donachy, Shaun. “Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications.” 2015. Web. 19 Oct 2020.

Vancouver:

Donachy S. Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications. [Internet] [Thesis]. Virginia Commonwealth University; 2015. [cited 2020 Oct 19]. Available from: https://doi.org/10.25772/D6AZ-JB33 ; https://scholarscompass.vcu.edu/etd/3984.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Donachy S. Spiking Neural Networks: Neuron Models, Plasticity, and Graph Applications. [Thesis]. Virginia Commonwealth University; 2015. Available from: https://doi.org/10.25772/D6AZ-JB33 ; https://scholarscompass.vcu.edu/etd/3984

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Delft University of Technology

17. Büller, Bas. Supervised Learning in Spiking Neural Networks.

Degree: 2020, Delft University of Technology

Spiking neural networks are notoriously hard to train because of their complex dynamics and sparse spiking signals. However, in part due to these properties, spiking… (more)

Subjects/Keywords: Spiking Neural Networks (SNNs); Supervised Learning; Online learning; Neuromorphic computing


APA (6th Edition):

Büller, B. (2020). Supervised Learning in Spiking Neural Networks. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:256f7044-862d-4b53-b395-973dadbb7a00

Chicago Manual of Style (16th Edition):

Büller, Bas. “Supervised Learning in Spiking Neural Networks.” 2020. Masters Thesis, Delft University of Technology. Accessed October 19, 2020. http://resolver.tudelft.nl/uuid:256f7044-862d-4b53-b395-973dadbb7a00.

MLA Handbook (7th Edition):

Büller, Bas. “Supervised Learning in Spiking Neural Networks.” 2020. Web. 19 Oct 2020.

Vancouver:

Büller B. Supervised Learning in Spiking Neural Networks. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2020 Oct 19]. Available from: http://resolver.tudelft.nl/uuid:256f7044-862d-4b53-b395-973dadbb7a00.

Council of Science Editors:

Büller B. Supervised Learning in Spiking Neural Networks. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:256f7044-862d-4b53-b395-973dadbb7a00


Delft University of Technology

18. Hagenaars, Jesse (author). Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs.

Degree: 2020, Delft University of Technology

Flying insects are capable of autonomous vision-based navigation in cluttered environments, reliably avoiding objects through fast and agile manoeuvres. Meanwhile, insect-scale micro air vehicles still… (more)

Subjects/Keywords: spiking neural networks; optical flow; micro air vehicles; neuroevolution

APA (6th Edition):

Hagenaars, J. (2020). Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:48040e88-f507-4676-a5da-2b701a07f387

Chicago Manual of Style (16th Edition):

Hagenaars, Jesse. “Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs.” 2020. Masters Thesis, Delft University of Technology. Accessed October 19, 2020. http://resolver.tudelft.nl/uuid:48040e88-f507-4676-a5da-2b701a07f387.

MLA Handbook (7th Edition):

Hagenaars, Jesse. “Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs.” 2020. Web. 19 Oct 2020.

Vancouver:

Hagenaars J. Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2020 Oct 19]. Available from: http://resolver.tudelft.nl/uuid:48040e88-f507-4676-a5da-2b701a07f387.

Council of Science Editors:

Hagenaars J. Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:48040e88-f507-4676-a5da-2b701a07f387


University of Tennessee – Knoxville

19. Mitchell, John Parker. DANNA2: Dynamic Adaptive Neural Network Arrays.

Degree: MS, Computer Engineering, 2018, University of Tennessee – Knoxville

 Traditional Von Neumann architectures have been at the center of computing for decades thanks in part to Moore's Law and Dennard Scaling. However, MOSFET scaling… (more)

Subjects/Keywords: neuromorphic; spiking neural networks; computer architecture; machine learning

APA (6th Edition):

Mitchell, J. P. (2018). DANNA2: Dynamic Adaptive Neural Network Arrays. (Thesis). University of Tennessee – Knoxville. Retrieved from https://trace.tennessee.edu/utk_gradthes/5167

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Mitchell, John Parker. “DANNA2: Dynamic Adaptive Neural Network Arrays.” 2018. Thesis, University of Tennessee – Knoxville. Accessed October 19, 2020. https://trace.tennessee.edu/utk_gradthes/5167.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Mitchell, John Parker. “DANNA2: Dynamic Adaptive Neural Network Arrays.” 2018. Web. 19 Oct 2020.

Vancouver:

Mitchell JP. DANNA2: Dynamic Adaptive Neural Network Arrays. [Internet] [Thesis]. University of Tennessee – Knoxville; 2018. [cited 2020 Oct 19]. Available from: https://trace.tennessee.edu/utk_gradthes/5167.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Mitchell JP. DANNA2: Dynamic Adaptive Neural Network Arrays. [Thesis]. University of Tennessee – Knoxville; 2018. Available from: https://trace.tennessee.edu/utk_gradthes/5167

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rochester Institute of Technology

20. Hays, Lydia M. Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks.

Degree: MS, Computer Engineering, 2018, Rochester Institute of Technology

 CMOS/Memristor integrated architectures have been shown to be powerful for realizing energy-efficient learning machines. These architectures have recently been demonstrated in reservoir computing networks, which have… (more)

Subjects/Keywords: Memristor; Memristor crossbar; Mixed signal design; Neuromorphic computing; Spiking neural networks

APA (6th Edition):

Hays, L. M. (2018). Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks. (Masters Thesis). Rochester Institute of Technology. Retrieved from https://scholarworks.rit.edu/theses/9732

Chicago Manual of Style (16th Edition):

Hays, Lydia M. “Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks.” 2018. Masters Thesis, Rochester Institute of Technology. Accessed October 19, 2020. https://scholarworks.rit.edu/theses/9732.

MLA Handbook (7th Edition):

Hays, Lydia M. “Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks.” 2018. Web. 19 Oct 2020.

Vancouver:

Hays LM. Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks. [Internet] [Masters thesis]. Rochester Institute of Technology; 2018. [cited 2020 Oct 19]. Available from: https://scholarworks.rit.edu/theses/9732.

Council of Science Editors:

Hays LM. Design Considerations for Training Memristor Crossbars Used in Spiking Neural Networks. [Masters Thesis]. Rochester Institute of Technology; 2018. Available from: https://scholarworks.rit.edu/theses/9732


University of Cambridge

21. Fox, Paul James. Massively parallel neural computation.

Degree: PhD, 2013, University of Cambridge

 Reverse-engineering the brain is one of the US National Academy of Engineering’s “Grand Challenges.” The structure of the brain can be examined at many different… (more)

Subjects/Keywords: 006.32; FPGA; Neural network; Scientific computing; Computer architecture; Spiking neural networks; Real-time systems

APA (6th Edition):

Fox, P. J. (2013). Massively parallel neural computation. (Doctoral Dissertation). University of Cambridge. Retrieved from https://doi.org/10.17863/CAM.16380 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590234

Chicago Manual of Style (16th Edition):

Fox, Paul James. “Massively parallel neural computation.” 2013. Doctoral Dissertation, University of Cambridge. Accessed October 19, 2020. https://doi.org/10.17863/CAM.16380 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590234.

MLA Handbook (7th Edition):

Fox, Paul James. “Massively parallel neural computation.” 2013. Web. 19 Oct 2020.

Vancouver:

Fox PJ. Massively parallel neural computation. [Internet] [Doctoral dissertation]. University of Cambridge; 2013. [cited 2020 Oct 19]. Available from: https://doi.org/10.17863/CAM.16380 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590234.

Council of Science Editors:

Fox PJ. Massively parallel neural computation. [Doctoral Dissertation]. University of Cambridge; 2013. Available from: https://doi.org/10.17863/CAM.16380 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590234

22. Fox, Paul James. Massively parallel neural computation.

Degree: PhD, 2013, University of Cambridge

 Reverse-engineering the brain is one of the US National Academy of Engineering’s “Grand Challenges.” The structure of the brain can be examined at many different… (more)

Subjects/Keywords: FPGA; Neural network; Scientific computing; Computer architecture; Spiking neural networks; Real-time systems

APA (6th Edition):

Fox, P. J. (2013). Massively parallel neural computation. (Doctoral Dissertation). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/245013

Chicago Manual of Style (16th Edition):

Fox, Paul James. “Massively parallel neural computation.” 2013. Doctoral Dissertation, University of Cambridge. Accessed October 19, 2020. https://www.repository.cam.ac.uk/handle/1810/245013.

MLA Handbook (7th Edition):

Fox, Paul James. “Massively parallel neural computation.” 2013. Web. 19 Oct 2020.

Vancouver:

Fox PJ. Massively parallel neural computation. [Internet] [Doctoral dissertation]. University of Cambridge; 2013. [cited 2020 Oct 19]. Available from: https://www.repository.cam.ac.uk/handle/1810/245013.

Council of Science Editors:

Fox PJ. Massively parallel neural computation. [Doctoral Dissertation]. University of Cambridge; 2013. Available from: https://www.repository.cam.ac.uk/handle/1810/245013

23. Payvand, Melika. Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches.

Degree: 2016, University of California – eScholarship, University of California

 In the field of neuromorphic VLSI, connectivity is a huge bottleneck in implementing brain-inspired circuits, due to the large number of synapses needed for performing… (more)

Subjects/Keywords: Electrical engineering; Neurosciences; Memristive Synapses; Neural Coding; Neuromorphic VLSI; Spiking Neural Networks

APA (6th Edition):

Payvand, M. (2016). Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches. (Thesis). University of California – eScholarship, University of California. Retrieved from http://www.escholarship.org/uc/item/61x78103

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Payvand, Melika. “Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches.” 2016. Thesis, University of California – eScholarship, University of California. Accessed October 19, 2020. http://www.escholarship.org/uc/item/61x78103.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Payvand, Melika. “Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches.” 2016. Web. 19 Oct 2020.

Vancouver:

Payvand M. Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches. [Internet] [Thesis]. University of California – eScholarship, University of California; 2016. [cited 2020 Oct 19]. Available from: http://www.escholarship.org/uc/item/61x78103.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Payvand M. Area-efficient Neuromorphic Silicon Circuits and Architectures using Spatial and Spatio-Temporal Approaches. [Thesis]. University of California – eScholarship, University of California; 2016. Available from: http://www.escholarship.org/uc/item/61x78103

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


ETH Zürich

24. Neil, Daniel. Deep Neural Networks and Hardware Systems for Event-driven Data.

Degree: 2017, ETH Zürich

 Event-based sensors, built with biological inspiration, differ greatly from traditional sensor types. A standard vision sensor uses a pixel array to produce a frame containing… (more)

Subjects/Keywords: Deep Neural Networks; Event-driven sensors; Deep neural networks (DNNs); Spiking deep neural networks; Recurrent Neural Networks; Convolutional neural networks; info:eu-repo/classification/ddc/4; Data processing, computer science

APA (6th Edition):

Neil, D. (2017). Deep Neural Networks and Hardware Systems for Event-driven Data. (Doctoral Dissertation). ETH Zürich. Retrieved from http://hdl.handle.net/20.500.11850/168865

Chicago Manual of Style (16th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Doctoral Dissertation, ETH Zürich. Accessed October 19, 2020. http://hdl.handle.net/20.500.11850/168865.

MLA Handbook (7th Edition):

Neil, Daniel. “Deep Neural Networks and Hardware Systems for Event-driven Data.” 2017. Web. 19 Oct 2020.

Vancouver:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Internet] [Doctoral dissertation]. ETH Zürich; 2017. [cited 2020 Oct 19]. Available from: http://hdl.handle.net/20.500.11850/168865.

Council of Science Editors:

Neil D. Deep Neural Networks and Hardware Systems for Event-driven Data. [Doctoral Dissertation]. ETH Zürich; 2017. Available from: http://hdl.handle.net/20.500.11850/168865


University of Manchester

25. Stromatias, Evangelos. Scalability and robustness of artificial neural networks.

Degree: PhD, 2016, University of Manchester

 Artificial Neural Networks (ANNs) continue to gain popularity today, as they are being used in several diverse research fields and many different… (more)

Subjects/Keywords: 006.3; SpiNNaker; Neuromorphic; Spiking; low-power; low-latency; scalable; robustness; limited weight precision; spiking neural networks

APA (6th Edition):

Stromatias, E. (2016). Scalability and robustness of artificial neural networks. (Doctoral Dissertation). University of Manchester. Retrieved from https://www.research.manchester.ac.uk/portal/en/theses/scalability-and-robustness-of-artificial-neural-networks(b73b3f77-2bc3-4197-bd0f-dc7501b872cb).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.686773

Chicago Manual of Style (16th Edition):

Stromatias, Evangelos. “Scalability and robustness of artificial neural networks.” 2016. Doctoral Dissertation, University of Manchester. Accessed October 19, 2020. https://www.research.manchester.ac.uk/portal/en/theses/scalability-and-robustness-of-artificial-neural-networks(b73b3f77-2bc3-4197-bd0f-dc7501b872cb).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.686773.

MLA Handbook (7th Edition):

Stromatias, Evangelos. “Scalability and robustness of artificial neural networks.” 2016. Web. 19 Oct 2020.

Vancouver:

Stromatias E. Scalability and robustness of artificial neural networks. [Internet] [Doctoral dissertation]. University of Manchester; 2016. [cited 2020 Oct 19]. Available from: https://www.research.manchester.ac.uk/portal/en/theses/scalability-and-robustness-of-artificial-neural-networks(b73b3f77-2bc3-4197-bd0f-dc7501b872cb).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.686773.

Council of Science Editors:

Stromatias E. Scalability and robustness of artificial neural networks. [Doctoral Dissertation]. University of Manchester; 2016. Available from: https://www.research.manchester.ac.uk/portal/en/theses/scalability-and-robustness-of-artificial-neural-networks(b73b3f77-2bc3-4197-bd0f-dc7501b872cb).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.686773


University of California – Merced

26. Shea, Timothy Michael. Dynamic Decisions.

Degree: Cognitive and Information Sciences, 2019, University of California – Merced

 The science of decision-making is dominated by tasks which constrain decisions to occur in fixed intervals of time with minimal movement of the body or environment… (more)

Subjects/Keywords: Neurosciences; Cognitive psychology; Basal Ganglia; Critical Branching; Deep Neural Networks; Neuronal Dynamics; Representational Transience; Spiking Neural Networks

APA (6th Edition):

Shea, T. M. (2019). Dynamic Decisions. (Thesis). University of California – Merced. Retrieved from http://www.escholarship.org/uc/item/3nv3f9dz

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Shea, Timothy Michael. “Dynamic Decisions.” 2019. Thesis, University of California – Merced. Accessed October 19, 2020. http://www.escholarship.org/uc/item/3nv3f9dz.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Shea, Timothy Michael. “Dynamic Decisions.” 2019. Web. 19 Oct 2020.

Vancouver:

Shea TM. Dynamic Decisions. [Internet] [Thesis]. University of California – Merced; 2019. [cited 2020 Oct 19]. Available from: http://www.escholarship.org/uc/item/3nv3f9dz.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Shea TM. Dynamic Decisions. [Thesis]. University of California – Merced; 2019. Available from: http://www.escholarship.org/uc/item/3nv3f9dz

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Manchester

27. James, Robert. Spikes from sound: A model of the human auditory periphery on SpiNNaker.

Degree: 2020, University of Manchester

 From a computational perspective, much can be learned from studying the brain. For auditory processing, three biological attributes are presented as being responsible for good… (more)

Subjects/Keywords: auditory pathway; cochlear modelling; SpiNNaker; neuromorphic hardware; spiking neural networks; large scale; parallel computing

APA (6th Edition):

James, R. (2020). Spikes from sound: A model of the human auditory periphery on SpiNNaker. (Doctoral Dissertation). University of Manchester. Retrieved from http://www.manchester.ac.uk/escholar/uk-ac-man-scw:323501

Chicago Manual of Style (16th Edition):

James, Robert. “Spikes from sound: A model of the human auditory periphery on SpiNNaker.” 2020. Doctoral Dissertation, University of Manchester. Accessed October 19, 2020. http://www.manchester.ac.uk/escholar/uk-ac-man-scw:323501.

MLA Handbook (7th Edition):

James, Robert. “Spikes from sound: A model of the human auditory periphery on SpiNNaker.” 2020. Web. 19 Oct 2020.

Vancouver:

James R. Spikes from sound: A model of the human auditory periphery on SpiNNaker. [Internet] [Doctoral dissertation]. University of Manchester; 2020. [cited 2020 Oct 19]. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:323501.

Council of Science Editors:

James R. Spikes from sound: A model of the human auditory periphery on SpiNNaker. [Doctoral Dissertation]. University of Manchester; 2020. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:323501


Syracuse University

28. Ahmed, Khadeer. Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks.

Degree: PhD, Electrical Engineering and Computer Science, 2017, Syracuse University

  Neuromorphic computing refers to brain inspired algorithms and architectures. This paradigm of computing can solve complex problems which were not possible with traditional computing… (more)

Subjects/Keywords: Bayesian inference; Digital neuron; High performance computing; Neuromorphic computing; Simulation; Spiking neural networks; Engineering

APA (6th Edition):

Ahmed, K. (2017). Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks. (Doctoral Dissertation). Syracuse University. Retrieved from https://surface.syr.edu/etd/788

Chicago Manual of Style (16th Edition):

Ahmed, Khadeer. “Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks.” 2017. Doctoral Dissertation, Syracuse University. Accessed October 19, 2020. https://surface.syr.edu/etd/788.

MLA Handbook (7th Edition):

Ahmed, Khadeer. “Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks.” 2017. Web. 19 Oct 2020.

Vancouver:

Ahmed K. Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks. [Internet] [Doctoral dissertation]. Syracuse University; 2017. [cited 2020 Oct 19]. Available from: https://surface.syr.edu/etd/788.

Council of Science Editors:

Ahmed K. Efficient Implementation of Stochastic Inference on Heterogeneous Clusters and Spiking Neural Networks. [Doctoral Dissertation]. Syracuse University; 2017. Available from: https://surface.syr.edu/etd/788


Delft University of Technology

29. Paredes Valles, Federico (author). Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation.

Degree: 2018, Delft University of Technology

 The combination of Spiking Neural Networks and event-based vision sensors holds the potential of highly efficient and high-bandwidth optical flow estimation. This thesis presents, to… (more)

Subjects/Keywords: Event-based vision; Neuromorphic; Optical flow; Spiking Neural Networks; Spike-Timing-Dependent Plasticity

APA (6th Edition):

Paredes Valles, F. (2018). Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:aa13959b-79b9-4dfc-b5e0-7c501d9d3e2f

Chicago Manual of Style (16th Edition):

Paredes Valles, Federico. “Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation.” 2018. Masters Thesis, Delft University of Technology. Accessed October 19, 2020. http://resolver.tudelft.nl/uuid:aa13959b-79b9-4dfc-b5e0-7c501d9d3e2f.

MLA Handbook (7th Edition):

Paredes Valles, Federico. “Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation.” 2018. Web. 19 Oct 2020.

Vancouver:

Paredes Valles F. Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation. [Internet] [Masters thesis]. Delft University of Technology; 2018. [cited 2020 Oct 19]. Available from: http://resolver.tudelft.nl/uuid:aa13959b-79b9-4dfc-b5e0-7c501d9d3e2f.

Council of Science Editors:

Paredes Valles F. Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation. [Masters Thesis]. Delft University of Technology; 2018. Available from: http://resolver.tudelft.nl/uuid:aa13959b-79b9-4dfc-b5e0-7c501d9d3e2f


Delft University of Technology

30. Kolağasioğlu, Eralp (author). Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks.

Degree: 2018, Delft University of Technology

Cardiovascular diseases are the leading cause of death in the developed world. Preventing these deaths requires long-term monitoring and manual inspection of ECG… (more)

Subjects/Keywords: neuromorphic machines; spiking neural networks; ecg beat classification; feature extraction; low power; global classification

APA (6th Edition):

Kolağasioğlu, E. (2018). Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:3b7de584-23c6-4d46-b32a-0fe57c4f8dbd

Chicago Manual of Style (16th Edition):

Kolağasioğlu, Eralp. “Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks.” 2018. Masters Thesis, Delft University of Technology. Accessed October 19, 2020. http://resolver.tudelft.nl/uuid:3b7de584-23c6-4d46-b32a-0fe57c4f8dbd.

MLA Handbook (7th Edition):

Kolağasioğlu, Eralp. “Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks.” 2018. Web. 19 Oct 2020.

Vancouver:

Kolağasioğlu E. Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks. [Internet] [Masters thesis]. Delft University of Technology; 2018. [cited 2020 Oct 19]. Available from: http://resolver.tudelft.nl/uuid:3b7de584-23c6-4d46-b32a-0fe57c4f8dbd.

Council of Science Editors:

Kolağasioğlu E. Energy Efficient Feature Extraction for Single-Lead ECG Classification Based On Spiking Neural Networks. [Masters Thesis]. Delft University of Technology; 2018. Available from: http://resolver.tudelft.nl/uuid:3b7de584-23c6-4d46-b32a-0fe57c4f8dbd
