You searched for subject:(Hebbian learning). Showing records 1 – 28 of 28 total matches.

1. Prabakaran, N. A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -.

Degree: Science and Humanities, 2014, Anna University

A common secret key is generated using neural networks and cryptography. This can be achieved by two Tree Parity Machines (TPMs) which are trained on… (more)
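
A minimal sketch of the mechanism the abstract names: two Tree Parity Machines receive the same public inputs, exchange only their output bits, and apply a Hebbian-style weight update whenever those outputs agree, which gradually synchronizes their weight matrices into a shared key. The parameters (K, N, L) and the plain Hebbian rule below are illustrative assumptions; the thesis's multiple transfer functions and feedback/multi-feedback variants are not reproduced here.

    # Minimal Tree Parity Machine (TPM) synchronization sketch (assumed parameters).
    import numpy as np

    K, N, L = 3, 4, 3                  # hidden units, inputs per unit, weight bound
    rng = np.random.default_rng(0)

    def init_weights():
        return rng.integers(-L, L + 1, size=(K, N))

    def output(w, x):
        sigma = np.sign(np.sum(w * x, axis=1))
        sigma[sigma == 0] = -1         # break ties deterministically
        return sigma, int(np.prod(sigma))

    def hebbian_update(w, x, sigma, tau):
        # Only hidden units that agreed with the machine's own output are updated.
        for k in range(K):
            if sigma[k] == tau:
                w[k] = np.clip(w[k] + tau * x[k], -L, L)

    wA, wB = init_weights(), init_weights()
    while not np.array_equal(wA, wB):
        x = rng.choice([-1, 1], size=(K, N))   # common public random input
        sA, tA = output(wA, x)
        sB, tB = output(wB, x)
        if tA == tB:                           # learn only when outputs agree
            hebbian_update(wA, x, sA, tA)
            hebbian_update(wB, x, sB, tB)

    print("synchronized key material:", wA.flatten())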

Subjects/Keywords: Hebbian learning; Pseudo Random Number Generators

APA (6th Edition):

Prabakaran, N. (2014). A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -. (Thesis). Anna University. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/26640

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Prabakaran, N. “A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -.” 2014. Thesis, Anna University. Accessed October 21, 2020. http://shodhganga.inflibnet.ac.in/handle/10603/26640.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Prabakaran, N. “A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -.” 2014. Web. 21 Oct 2020.

Vancouver:

Prabakaran N. A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -. [Internet] [Thesis]. Anna University; 2014. [cited 2020 Oct 21]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/26640.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Prabakaran N. A security on neural cryptography with Multiple transfer functions using feedback And multi feedback tree parity machine; -. [Thesis]. Anna University; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/26640

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Delft University of Technology

2. Husić, Ajdin (author). Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach.

Degree: 2018, Delft University of Technology

The novelty-raahn algorithm has been shown to effectively learn a desired behavior from raw inputs by connecting an autoencoder with a Hebbian network. Hebbian learning… (more)
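
The abstract's core idea, an autoencoder feeding a Hebbian layer whose updates are gated by a single global reward, can be sketched as follows. The encoder stub, dimensions, and learning rate are placeholder assumptions; this is not the novelty-raahn implementation itself.

    # Schematic reward-modulated Hebbian layer on top of (assumed) encoder features.
    import numpy as np

    rng = np.random.default_rng(1)
    n_features, n_actions, eta = 16, 2, 0.01
    W = rng.normal(scale=0.1, size=(n_actions, n_features))

    def encode(observation):
        # Placeholder for the output of an already trained autoencoder.
        return np.tanh(observation)

    def step(observation, reward):
        global W
        h = encode(observation)               # pre-synaptic activity
        a = np.tanh(W @ h)                    # post-synaptic activity (action outputs)
        # Reward-modulated Hebbian update: correlate pre and post, gate by the global reward.
        W += eta * reward * np.outer(a, h)
        return a

    obs = rng.normal(size=n_features)
    action = step(obs, reward=+1.0)           # positive reward reinforces co-active weights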

Subjects/Keywords: Hebbian learning; Machine Learning; robot control; Autonomous cars

APA (6th Edition):

Husić, A. (2018). Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:9e2b4a67-041c-4ee7-be9a-e03891d3d17d

Chicago Manual of Style (16th Edition):

Husić, Ajdin (author). “Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach.” 2018. Masters Thesis, Delft University of Technology. Accessed October 21, 2020. http://resolver.tudelft.nl/uuid:9e2b4a67-041c-4ee7-be9a-e03891d3d17d.

MLA Handbook (7th Edition):

Husić, Ajdin (author). “Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach.” 2018. Web. 21 Oct 2020.

Vancouver:

Husić A. Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach. [Internet] [Masters thesis]. Delft University of Technology; 2018. [cited 2020 Oct 21]. Available from: http://resolver.tudelft.nl/uuid:9e2b4a67-041c-4ee7-be9a-e03891d3d17d.

Council of Science Editors:

Husić A. Learning to Control Multi- Dimensional Autonomous Agents using Hebbian Learning: A Global Reward Approach. [Masters Thesis]. Delft University of Technology; 2018. Available from: http://resolver.tudelft.nl/uuid:9e2b4a67-041c-4ee7-be9a-e03891d3d17d


Loughborough University

3. Bahroun, Yanis. Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning.

Degree: PhD, 2020, Loughborough University

 Similarity matching (SM) is a framework introduced recently for deriving biologically plausible neural networks from objective functions. Three key biological properties associated with these networks… (more)

Subjects/Keywords: Feature learning; Neural networks; Unsupervised learning; Sparse coding; Dimensionality reduction; Hebbian learning; Online learning

APA (6th Edition):

Bahroun, Y. (2020). Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning. (Doctoral Dissertation). Loughborough University. Retrieved from https://doi.org/10.26174/thesis.lboro.12683804.v1 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.812972

Chicago Manual of Style (16th Edition):

Bahroun, Yanis. “Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning.” 2020. Doctoral Dissertation, Loughborough University. Accessed October 21, 2020. https://doi.org/10.26174/thesis.lboro.12683804.v1 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.812972.

MLA Handbook (7th Edition):

Bahroun, Yanis. “Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning.” 2020. Web. 21 Oct 2020.

Vancouver:

Bahroun Y. Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning. [Internet] [Doctoral dissertation]. Loughborough University; 2020. [cited 2020 Oct 21]. Available from: https://doi.org/10.26174/thesis.lboro.12683804.v1 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.812972.

Council of Science Editors:

Bahroun Y. Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning. [Doctoral Dissertation]. Loughborough University; 2020. Available from: https://doi.org/10.26174/thesis.lboro.12683804.v1 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.812972


University of Southern California

4. Liu, Jiajuan. Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond.

Degree: PhD, Neuroscience, 2011, University of Southern California

 This thesis concerns the augmented Hebbian reweighting model (AHRM) in perceptual learning. The development of AHRM was inspired by two sets of research endeavors: first,… (more)
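
A schematic of the reweighting idea behind the model the abstract discusses: connection weights from sensory channels to a decision unit are updated with a Hebbian rule, and an external feedback signal, when available, stands in for the observer's own response as the post-synaptic term. The channel count, learning rate, decay, and the exact substitution are simplifying assumptions, not the full AHRM.

    # Hedged sketch of Hebbian reweighting with optional external feedback (assumed form).
    import numpy as np

    rng = np.random.default_rng(2)
    n_channels, eta, decay = 8, 0.05, 0.01
    w = rng.normal(scale=0.1, size=n_channels)

    def trial(activations, feedback=None):
        """activations: channel responses for one stimulus; feedback: +1/-1 or None."""
        global w
        response = np.tanh(w @ activations)          # internal decision variable
        post = feedback if feedback is not None else response
        w += eta * (post * activations - decay * w)  # Hebbian term plus mild decay
        return np.sign(response)

    activations = np.abs(rng.normal(size=n_channels))
    choice = trial(activations, feedback=+1.0)       # feedback drives the update
    choice = trial(activations)                      # self-generated response drives the update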

Subjects/Keywords: perceptual learning; augmented Hebbian re-weighting; feedback; psychophysics

APA (6th Edition):

Liu, J. (2011). Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond. (Doctoral Dissertation). University of Southern California. Retrieved from http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/630039/rec/4111

Chicago Manual of Style (16th Edition):

Liu, Jiajuan. “Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond.” 2011. Doctoral Dissertation, University of Southern California. Accessed October 21, 2020. http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/630039/rec/4111.

MLA Handbook (7th Edition):

Liu, Jiajuan. “Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond.” 2011. Web. 21 Oct 2020.

Vancouver:

Liu J. Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond. [Internet] [Doctoral dissertation]. University of Southern California; 2011. [cited 2020 Oct 21]. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/630039/rec/4111.

Council of Science Editors:

Liu J. Mode of visual perceptual learning: augmented Hebbian learning explains the function of feedback and beyond. [Doctoral Dissertation]. University of Southern California; 2011. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/630039/rec/4111

5. Fyfe, Colin. Negative feedback as an organising principle for artificial neural networks.

Degree: PhD, 1995, University of Strathclyde

 We investigate the properties of an unsupervised neural network which uses simple Hebbian learning and negative feedback of activation in order to self-organise. The negative… (more)
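
The abstract describes a network in which the outputs are fed back and subtracted from the inputs, and simple Hebbian learning on that residual drives self-organisation. Below is a minimal rate-based sketch of that negative-feedback rule, with toy dimensions and data chosen for illustration; it moves the weight rows toward the principal subspace of the inputs.

    # Negative-feedback Hebbian network: learn on the residual after output feedback.
    import numpy as np

    rng = np.random.default_rng(3)
    n_in, n_out, eta = 5, 2, 0.005
    W = rng.normal(scale=0.1, size=(n_out, n_in))

    # Toy data whose variance is concentrated along the first two axes.
    X = rng.normal(size=(5000, n_in)) * np.array([3.0, 2.0, 0.5, 0.5, 0.5])

    for x in X:
        y = W @ x                  # feedforward activation
        e = x - W.T @ y            # negative feedback: residual after subtracting the reconstruction
        W += eta * np.outer(y, e)  # simple Hebbian update between output and residual

    print("W @ W.T (approaches the identity):", W @ W.T)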

Subjects/Keywords: 003.5; Cybernetics; Hebbian learning

APA (6th Edition):

Fyfe, C. (1995). Negative feedback as an organising principle for artificial neural networks. (Doctoral Dissertation). University of Strathclyde. Retrieved from http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21390 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.363303

Chicago Manual of Style (16th Edition):

Fyfe, Colin. “Negative feedback as an organising principle for artificial neural networks.” 1995. Doctoral Dissertation, University of Strathclyde. Accessed October 21, 2020. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21390 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.363303.

MLA Handbook (7th Edition):

Fyfe, Colin. “Negative feedback as an organising principle for artificial neural networks.” 1995. Web. 21 Oct 2020.

Vancouver:

Fyfe C. Negative feedback as an organising principle for artificial neural networks. [Internet] [Doctoral dissertation]. University of Strathclyde; 1995. [cited 2020 Oct 21]. Available from: http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21390 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.363303.

Council of Science Editors:

Fyfe C. Negative feedback as an organising principle for artificial neural networks. [Doctoral Dissertation]. University of Strathclyde; 1995. Available from: http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=21390 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.363303


Universiteit Utrecht

6. Kuipers, T. Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks.

Degree: 2014, Universiteit Utrecht

 Unsupervised learning provides a way to extract features from data which can be used to pre-train Artificial Neural Networks (ANNs) improving on the performance of… (more)
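
For background only: one of the classical unsupervised Hebbian feature extractors named in this record's keywords, Sanger's Generalized Hebbian Algorithm, is sketched below on toy data. It is included to make the Hebbian/PCA vocabulary concrete and is not the Spread Maximization method the thesis proposes.

    # Generalized Hebbian Algorithm (Sanger's rule) on toy data.
    import numpy as np

    rng = np.random.default_rng(4)
    n_in, n_out, eta = 6, 3, 0.005
    W = rng.normal(scale=0.1, size=(n_out, n_in))

    X = rng.normal(size=(8000, n_in)) * np.array([3.0, 2.0, 1.5, 0.3, 0.3, 0.3])

    for x in X:
        y = W @ x
        # Gram-Schmidt-like term: each output unit removes what earlier units already explain.
        W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

    print(np.round(W, 2))   # rows approximate the leading principal directions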

Subjects/Keywords: Convolutional Neural Networks; Neural Networks; Unsupervised Learning; Auto-Encoders; Hebbian Learning; Principal Component Analysis; Pooled Convolutional Component Analysis; Uniformization; Dichotomization; Spread Maximization; Eigenvolume Expansion; Soft Pooling; Generalized Hebbian Algorithm

APA (6th Edition):

Kuipers, T. (2014). Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks. (Masters Thesis). Universiteit Utrecht. Retrieved from http://dspace.library.uu.nl:8080/handle/1874/298569

Chicago Manual of Style (16th Edition):

Kuipers, T. “Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks.” 2014. Masters Thesis, Universiteit Utrecht. Accessed October 21, 2020. http://dspace.library.uu.nl:8080/handle/1874/298569.

MLA Handbook (7th Edition):

Kuipers, T. “Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks.” 2014. Web. 21 Oct 2020.

Vancouver:

Kuipers T. Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks. [Internet] [Masters thesis]. Universiteit Utrecht; 2014. [cited 2020 Oct 21]. Available from: http://dspace.library.uu.nl:8080/handle/1874/298569.

Council of Science Editors:

Kuipers T. Spread Maximization - A Novel Unsupervised Learning Paradigm Applied to Convolutional Neural Networks. [Masters Thesis]. Universiteit Utrecht; 2014. Available from: http://dspace.library.uu.nl:8080/handle/1874/298569


Universidade do Rio Grande do Sul

7. Volpe, Isabel Cristina. Cell assemblies para expansão de consultas.

Degree: 2011, Universidade do Rio Grande do Sul

One of the main tasks of Information Retrieval is to find documents that are relevant to a query. This task is difficult because, in many cases… (more)

Subjects/Keywords: Recuperacao : Informacao; Query expansion; Redes neurais; Information retrieval; Neural networks; Hebbian learning

APA (6th Edition):

Volpe, I. C. (2011). Cell assemblies para expansão de consultas. (Thesis). Universidade do Rio Grande do Sul. Retrieved from http://hdl.handle.net/10183/32858

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Volpe, Isabel Cristina. “Cell assemblies para expansão de consultas.” 2011. Thesis, Universidade do Rio Grande do Sul. Accessed October 21, 2020. http://hdl.handle.net/10183/32858.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Volpe, Isabel Cristina. “Cell assemblies para expansão de consultas.” 2011. Web. 21 Oct 2020.

Vancouver:

Volpe IC. Cell assemblies para expansão de consultas. [Internet] [Thesis]. Universidade do Rio Grande do Sul; 2011. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/10183/32858.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Volpe IC. Cell assemblies para expansão de consultas. [Thesis]. Universidade do Rio Grande do Sul; 2011. Available from: http://hdl.handle.net/10183/32858

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Oregon

8. Holt, Caleb. The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex.

Degree: PhD, Department of Physics, 2020, University of Oregon

 Here, we theoretically study how cortical networks’ synaptic connectivity shapes their spiking activity dynamics, and in turn, how dynamics shape the structure of synaptic connectivity.… (more)

Subjects/Keywords: dynamics; gamma oscillations; hebbian learning; receptive fields; stabilized supralinear network; visual cortex

APA (6th Edition):

Holt, C. (2020). The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex. (Doctoral Dissertation). University of Oregon. Retrieved from https://scholarsbank.uoregon.edu/xmlui/handle/1794/25600

Chicago Manual of Style (16th Edition):

Holt, Caleb. “The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex.” 2020. Doctoral Dissertation, University of Oregon. Accessed October 21, 2020. https://scholarsbank.uoregon.edu/xmlui/handle/1794/25600.

MLA Handbook (7th Edition):

Holt, Caleb. “The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex.” 2020. Web. 21 Oct 2020.

Vancouver:

Holt C. The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex. [Internet] [Doctoral dissertation]. University of Oregon; 2020. [cited 2020 Oct 21]. Available from: https://scholarsbank.uoregon.edu/xmlui/handle/1794/25600.

Council of Science Editors:

Holt C. The Interplay of Neural Dynamics and Connectivity Structures in the Visual Cortex. [Doctoral Dissertation]. University of Oregon; 2020. Available from: https://scholarsbank.uoregon.edu/xmlui/handle/1794/25600


University of Cincinnati

9. SHAH, PAYAL D. DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS.

Degree: MS, Engineering : Electrical Engineering, 2007, University of Cincinnati

 Ad hoc wireless sensor networks are emerging as an important technology for applications such as environmental monitoring, battlefield surveillance and infrastructure security. Centralized processing in… (more)

Subjects/Keywords: Self-Organization; Distributed Sensor Network; Hebbian Learning; Topology Inference

APA (6th Edition):

SHAH, P. D. (2007). DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS. (Masters Thesis). University of Cincinnati. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=ucin1177083367

Chicago Manual of Style (16th Edition):

SHAH, PAYAL D. “DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS.” 2007. Masters Thesis, University of Cincinnati. Accessed October 21, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1177083367.

MLA Handbook (7th Edition):

SHAH, PAYAL D. “DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS.” 2007. Web. 21 Oct 2020.

Vancouver:

SHAH PD. DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS. [Internet] [Masters thesis]. University of Cincinnati; 2007. [cited 2020 Oct 21]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1177083367.

Council of Science Editors:

SHAH PD. DISTRIBUTED HEBBIAN INFERENCE OF ENVIRONMENT STRUCTURE IN SELF-ORGANIZED SENSOR NETWORKS. [Masters Thesis]. University of Cincinnati; 2007. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1177083367

10. Topalidou, Meropi. Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes.

Degree: Docteur es, Informatique, 2016, Bordeaux

Goal-directed (“action-consequence”) and stimulus-response processes are two important components of behaviour. The former evaluates the benefit of an action for… (more)

Subjects/Keywords: Habitude; Ganglion de la base; Cortex; Neuroscience informatique; Renforcement apprentissage; Hebbian apprentissage; Prise de décision; Action orientée; Habit; Basal ganglia; Cortex; Computational neuroscience; Reinforcement learning; Hebbian learning; Decision-making; Goal-directed action

APA (6th Edition):

Topalidou, M. (2016). Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes. (Doctoral Dissertation). Bordeaux. Retrieved from http://www.theses.fr/2016BORD0174

Chicago Manual of Style (16th Edition):

Topalidou, Meropi. “Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes.” 2016. Doctoral Dissertation, Bordeaux. Accessed October 21, 2020. http://www.theses.fr/2016BORD0174.

MLA Handbook (7th Edition):

Topalidou, Meropi. “Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes.” 2016. Web. 21 Oct 2020.

Vancouver:

Topalidou M. Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes. [Internet] [Doctoral dissertation]. Bordeaux; 2016. [cited 2020 Oct 21]. Available from: http://www.theses.fr/2016BORD0174.

Council of Science Editors:

Topalidou M. Neuroscience of decision making : from goal-directed actions to habits : Neuroscience de la prise de décision : des actions dirigées vers un but aux habitudes. [Doctoral Dissertation]. Bordeaux; 2016. Available from: http://www.theses.fr/2016BORD0174


Univerzitet u Beogradu

11. Janković, Marko V., 1968-. Samoorganizujuće neuralne mreže za analizu glavnih komponenata.

Degree: Elektrotehnički fakultet, 2014, Univerzitet u Beogradu

Date of defence: 23.03.2006.

This thesis is devoted to simple, biologically plausible algorithms for extracting the principal/minor components and/or their subspaces from the covariance matrix of the input signal, as well as… (more)

Subjects/Keywords: unsupervised neural networks; principal component analysis; principal subspace analysis; time oriented heirarchical method; minor component analysis; Hebbian learning rule; biologically inspired learninig algorithms

APA (6th Edition):

Janković, M. V. (2014). Samoorganizujuće neuralne mreže za analizu glavnih komponenata. (Thesis). Univerzitet u Beogradu. Retrieved from https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Janković, Marko V., 1968-. “Samoorganizujuće neuralne mreže za analizu glavnih komponenata.” 2014. Thesis, Univerzitet u Beogradu. Accessed October 21, 2020. https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Janković, Marko V., 1968-. “Samoorganizujuće neuralne mreže za analizu glavnih komponenata.” 2014. Web. 21 Oct 2020.

Vancouver:

Janković MV. Samoorganizujuće neuralne mreže za analizu glavnih komponenata. [Internet] [Thesis]. Univerzitet u Beogradu; 2014. [cited 2020 Oct 21]. Available from: https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Janković MV. Samoorganizujuće neuralne mreže za analizu glavnih komponenata. [Thesis]. Univerzitet u Beogradu; 2014. Available from: https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

12. Ingemanson, Morgan Leigh. Proprioception and motor learning after stroke – insights from neuroimaging studies.

Degree: Biomedical Sciences, 2017, University of California – Irvine

 Stroke is a leading cause of adult disability and patient response to treatment is highly variable. To understand this heterogeneity, the anatomical integrity and functional… (more)

Subjects/Keywords: Neurosciences; Hebbian plasticity; motor learning; neuroimaging; proprioception; rehabilitation; stroke

APA (6th Edition):

Ingemanson, M. L. (2017). Proprioception and motor learning after stroke – insights from neuroimaging studies. (Thesis). University of California – Irvine. Retrieved from http://www.escholarship.org/uc/item/9c21k94w

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ingemanson, Morgan Leigh. “Proprioception and motor learning after stroke – insights from neuroimaging studies.” 2017. Thesis, University of California – Irvine. Accessed October 21, 2020. http://www.escholarship.org/uc/item/9c21k94w.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ingemanson, Morgan Leigh. “Proprioception and motor learning after stroke – insights from neuroimaging studies.” 2017. Web. 21 Oct 2020.

Vancouver:

Ingemanson ML. Proprioception and motor learning after stroke – insights from neuroimaging studies. [Internet] [Thesis]. University of California – Irvine; 2017. [cited 2020 Oct 21]. Available from: http://www.escholarship.org/uc/item/9c21k94w.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ingemanson ML. Proprioception and motor learning after stroke – insights from neuroimaging studies. [Thesis]. University of California – Irvine; 2017. Available from: http://www.escholarship.org/uc/item/9c21k94w

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Southern California

13. Lu, Bing. Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning.

Degree: PhD, Biomedical Engineering, 2009, University of Southern California

 How to recognize the acoustic signal of interest in open environments where many other acoustic noises exist? The efficient auditory signal processing and intelligent neural… (more)

Subjects/Keywords: acoustic signal recognition; biological inspiration; noise robustness; nonlinear Hebbian learning

APA (6th Edition):

Lu, B. (2009). Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning. (Doctoral Dissertation). University of Southern California. Retrieved from http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/239039/rec/4434

Chicago Manual of Style (16th Edition):

Lu, Bing. “Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning.” 2009. Doctoral Dissertation, University of Southern California. Accessed October 21, 2020. http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/239039/rec/4434.

MLA Handbook (7th Edition):

Lu, Bing. “Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning.” 2009. Web. 21 Oct 2020.

Vancouver:

Lu B. Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning. [Internet] [Doctoral dissertation]. University of Southern California; 2009. [cited 2020 Oct 21]. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/239039/rec/4434.

Council of Science Editors:

Lu B. Noise-robust spectro-temporal acoustic signature recognition using nonlinear Hebbian learning. [Doctoral Dissertation]. University of Southern California; 2009. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/239039/rec/4434

14. Yogeswaran, Arjun. Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data .

Degree: 2018, University of Ottawa

 Advances in unsupervised learning and deep neural networks have led to increased performance in a number of domains, and to the ability to draw strong… (more)

Subjects/Keywords: restricted Boltzmann machine; self-organization; deep learning; deep belief network; unsupervised learning; smooth pursuit; saccade; motion tracking; feature learning; invariance; Gaussian filter; neural network; Hebbian learning; real-world data; biologically-inspired; image classification; feature extraction; visual attention; retinal slip; saliency map

APA (6th Edition):

Yogeswaran, A. (2018). Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data . (Thesis). University of Ottawa. Retrieved from http://hdl.handle.net/10393/37096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Yogeswaran, Arjun. “Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data .” 2018. Thesis, University of Ottawa. Accessed October 21, 2020. http://hdl.handle.net/10393/37096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Yogeswaran, Arjun. “Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data .” 2018. Web. 21 Oct 2020.

Vancouver:

Yogeswaran A. Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data . [Internet] [Thesis]. University of Ottawa; 2018. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/10393/37096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Yogeswaran A. Self-Organizing Neural Visual Models to Learn Feature Detectors and Motion Tracking Behaviour by Exposure to Real-World Data . [Thesis]. University of Ottawa; 2018. Available from: http://hdl.handle.net/10393/37096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Brigham Young University

15. Norton, R David. Improving Liquid State Machines Through Iterative Refinement of the Reservoir.

Degree: MS, 2008, Brigham Young University

  Liquid State Machines (LSMs) exploit the power of recurrent spiking neural networks (SNNs) without training the SNN. Instead, a reservoir, or liquid, is randomly… (more)
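
The reservoir idea the abstract describes, a random recurrent network that is left untrained while only a readout is fit, can be sketched with a rate-based echo-state network instead of a spiking liquid (a deliberate simplification). The sizes, the toy prediction task, and the ridge-regression readout below are assumptions for illustration.

    # Untrained random reservoir with a trained linear readout (rate-based stand-in for an LSM).
    import numpy as np

    rng = np.random.default_rng(5)
    n_res, T = 100, 500
    W_res = rng.normal(scale=0.9 / np.sqrt(n_res), size=(n_res, n_res))
    W_in = rng.normal(scale=0.5, size=(n_res, 1))

    u = np.sin(np.linspace(0, 20, T))[:, None]      # input signal
    target = np.roll(u[:, 0], -5)                   # toy task: predict the input 5 steps ahead

    states = np.zeros((T, n_res))
    x = np.zeros(n_res)
    for t in range(T):
        x = np.tanh(W_res @ x + W_in @ u[t])        # reservoir stays random and untrained
        states[t] = x

    # Only the readout is trained, by ridge regression on the collected reservoir states.
    ridge = 1e-3
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                            states.T @ target)
    print("train MSE:", np.mean((states @ W_out - target) ** 2))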

Subjects/Keywords: computer; liquid state machine; Hebbian learning; reinforcement learning; neural network; spiking neural network; machine learning; Computer Sciences

APA (6th Edition):

Norton, R. D. (2008). Improving Liquid State Machines Through Iterative Refinement of the Reservoir. (Masters Thesis). Brigham Young University. Retrieved from https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=2353&context=etd

Chicago Manual of Style (16th Edition):

Norton, R David. “Improving Liquid State Machines Through Iterative Refinement of the Reservoir.” 2008. Masters Thesis, Brigham Young University. Accessed October 21, 2020. https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=2353&context=etd.

MLA Handbook (7th Edition):

Norton, R David. “Improving Liquid State Machines Through Iterative Refinement of the Reservoir.” 2008. Web. 21 Oct 2020.

Vancouver:

Norton RD. Improving Liquid State Machines Through Iterative Refinement of the Reservoir. [Internet] [Masters thesis]. Brigham Young University; 2008. [cited 2020 Oct 21]. Available from: https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=2353&context=etd.

Council of Science Editors:

Norton RD. Improving Liquid State Machines Through Iterative Refinement of the Reservoir. [Masters Thesis]. Brigham Young University; 2008. Available from: https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=2353&context=etd

16. Meschede-Krasa, Benyamin. Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs.

Degree: 2017, Brandeis University

 The development of direction selectivity (DS) in the carnivore primary visual cortex (V1) requires visual experience, although the mechanisms by which visually-driven activity sculpts the… (more)

Subjects/Keywords: feed-forward; Hebbian learning; unsupervised learning; Reichardt detector; logical AND-gate; direction selectivity, Visual processing

APA (6th Edition):

Meschede-Krasa, B. (2017). Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs. (Thesis). Brandeis University. Retrieved from http://hdl.handle.net/10192/33875

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Meschede-Krasa, Benyamin. “Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs.” 2017. Thesis, Brandeis University. Accessed October 21, 2020. http://hdl.handle.net/10192/33875.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Meschede-Krasa, Benyamin. “Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs.” 2017. Web. 21 Oct 2020.

Vancouver:

Meschede-Krasa B. Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs. [Internet] [Thesis]. Brandeis University; 2017. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/10192/33875.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Meschede-Krasa B. Connectional Constraints and Feed-Forward Inhibition Allow the Development of Robust Cortical Direction Selectivity from Sparse Initial Inputs. [Thesis]. Brandeis University; 2017. Available from: http://hdl.handle.net/10192/33875

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Otago

17. Blanchette, Glenn Clifford. The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic .

Degree: University of Otago

 This thesis moves towards reconciliation of two of the major paradigms of artificial intelligence: by exploring the representation of symbolic logic in an artificial neural… (more)

Subjects/Keywords: Boltzmann machine; supra-classical non-monotonic logic; knowledge representation; typicality; belief revision; cognition; predictive inference; neural networks; Hebbian learning; simulated annealing

APA (6th Edition):

Blanchette, G. C. (n.d.). The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic . (Doctoral Dissertation). University of Otago. Retrieved from http://hdl.handle.net/10523/8312

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Chicago Manual of Style (16th Edition):

Blanchette, Glenn Clifford. “The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic .” Doctoral Dissertation, University of Otago. Accessed October 21, 2020. http://hdl.handle.net/10523/8312.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

MLA Handbook (7th Edition):

Blanchette, Glenn Clifford. “The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic .” Web. 21 Oct 2020.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Vancouver:

Blanchette GC. The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic . [Internet] [Doctoral dissertation]. University of Otago; [cited 2020 Oct 21]. Available from: http://hdl.handle.net/10523/8312.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Council of Science Editors:

Blanchette GC. The Boltzmann Machine: a Connectionist Model for Supra-Classical Logic . [Doctoral Dissertation]. University of Otago; Available from: http://hdl.handle.net/10523/8312

Note: this citation may be lacking information needed for this citation format:
No year of publication.

18. Parianen Lesemann, F.H. Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples.

Degree: 2014, Universiteit Utrecht

 The pure exposure to extensive tactile stimulation has been revealed to enhance sensorimotor functioning, without the requirement of attention or training. The induced effects, including… (more)

Subjects/Keywords: plasticity; repetitive Tactile Stimulation; hebbian learning; stimulation parameters; sensory perception; motor performance

APA (6th Edition):

Parianen Lesemann, F. H. (2014). Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples. (Masters Thesis). Universiteit Utrecht. Retrieved from http://dspace.library.uu.nl:8080/handle/1874/296053

Chicago Manual of Style (16th Edition):

Parianen Lesemann, F H. “Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples.” 2014. Masters Thesis, Universiteit Utrecht. Accessed October 21, 2020. http://dspace.library.uu.nl:8080/handle/1874/296053.

MLA Handbook (7th Edition):

Parianen Lesemann, F H. “Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples.” 2014. Web. 21 Oct 2020.

Vancouver:

Parianen Lesemann FH. Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples. [Internet] [Masters thesis]. Universiteit Utrecht; 2014. [cited 2020 Oct 21]. Available from: http://dspace.library.uu.nl:8080/handle/1874/296053.

Council of Science Editors:

Parianen Lesemann FH. Tactile Stimulation Interventions: Influence of stimulation parameters on sensorimotor behavior and neurophysiological correlates in healthy and clinical samples. [Masters Thesis]. Universiteit Utrecht; 2014. Available from: http://dspace.library.uu.nl:8080/handle/1874/296053


University of Cambridge

19. Garagnani, Max. Understanding language and attention : brain-based model and neurophysiological experiments.

Degree: PhD, 2009, University of Cambridge

 This work concerns the investigation of the neuronal mechanisms at the basis of language acquisition and processing, and the complex interactions of language and attention… (more)

Subjects/Keywords: 612.8; Neural network; Language; Neurophysiology; Hebbian learning; Cell assembly; Simulation; Connectivity; Mismatch negativity; Attention; N400

APA (6th Edition):

Garagnani, M. (2009). Understanding language and attention : brain-based model and neurophysiological experiments. (Doctoral Dissertation). University of Cambridge. Retrieved from https://doi.org/10.17863/CAM.16151 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557878

Chicago Manual of Style (16th Edition):

Garagnani, Max. “Understanding language and attention : brain-based model and neurophysiological experiments.” 2009. Doctoral Dissertation, University of Cambridge. Accessed October 21, 2020. https://doi.org/10.17863/CAM.16151 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557878.

MLA Handbook (7th Edition):

Garagnani, Max. “Understanding language and attention : brain-based model and neurophysiological experiments.” 2009. Web. 21 Oct 2020.

Vancouver:

Garagnani M. Understanding language and attention : brain-based model and neurophysiological experiments. [Internet] [Doctoral dissertation]. University of Cambridge; 2009. [cited 2020 Oct 21]. Available from: https://doi.org/10.17863/CAM.16151 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557878.

Council of Science Editors:

Garagnani M. Understanding language and attention : brain-based model and neurophysiological experiments. [Doctoral Dissertation]. University of Cambridge; 2009. Available from: https://doi.org/10.17863/CAM.16151 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557878


University of Cambridge

20. Garagnani, Max. Understanding language and attention: brain-based model and neurophysiological experiments.

Degree: PhD, 2009, University of Cambridge

 This work concerns the investigation of the neuronal mechanisms at the basis of language acquisition and processing, and the complex interactions of language and attention… (more)

Subjects/Keywords: Neural network; Language; Neurophysiology; Hebbian learning; Cell assembly; Simulation; Connectivity; Mismatch negativity; Attention; N400

APA (6th Edition):

Garagnani, M. (2009). Understanding language and attention: brain-based model and neurophysiological experiments. (Doctoral Dissertation). University of Cambridge. Retrieved from http://www.dspace.cam.ac.uk/handle/1810/243852https://www.repository.cam.ac.uk/bitstream/1810/243852/3/license_url ; https://www.repository.cam.ac.uk/bitstream/1810/243852/4/license_text ; https://www.repository.cam.ac.uk/bitstream/1810/243852/5/license_rdf ; https://www.repository.cam.ac.uk/bitstream/1810/243852/8/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/6/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/9/RevisedThesis.pdf.jpg

Chicago Manual of Style (16th Edition):

Garagnani, Max. “Understanding language and attention: brain-based model and neurophysiological experiments.” 2009. Doctoral Dissertation, University of Cambridge. Accessed October 21, 2020. http://www.dspace.cam.ac.uk/handle/1810/243852https://www.repository.cam.ac.uk/bitstream/1810/243852/3/license_url ; https://www.repository.cam.ac.uk/bitstream/1810/243852/4/license_text ; https://www.repository.cam.ac.uk/bitstream/1810/243852/5/license_rdf ; https://www.repository.cam.ac.uk/bitstream/1810/243852/8/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/6/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/9/RevisedThesis.pdf.jpg.

MLA Handbook (7th Edition):

Garagnani, Max. “Understanding language and attention: brain-based model and neurophysiological experiments.” 2009. Web. 21 Oct 2020.

Vancouver:

Garagnani M. Understanding language and attention: brain-based model and neurophysiological experiments. [Internet] [Doctoral dissertation]. University of Cambridge; 2009. [cited 2020 Oct 21]. Available from: http://www.dspace.cam.ac.uk/handle/1810/243852https://www.repository.cam.ac.uk/bitstream/1810/243852/3/license_url ; https://www.repository.cam.ac.uk/bitstream/1810/243852/4/license_text ; https://www.repository.cam.ac.uk/bitstream/1810/243852/5/license_rdf ; https://www.repository.cam.ac.uk/bitstream/1810/243852/8/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/6/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/9/RevisedThesis.pdf.jpg.

Council of Science Editors:

Garagnani M. Understanding language and attention: brain-based model and neurophysiological experiments. [Doctoral Dissertation]. University of Cambridge; 2009. Available from: http://www.dspace.cam.ac.uk/handle/1810/243852https://www.repository.cam.ac.uk/bitstream/1810/243852/3/license_url ; https://www.repository.cam.ac.uk/bitstream/1810/243852/4/license_text ; https://www.repository.cam.ac.uk/bitstream/1810/243852/5/license_rdf ; https://www.repository.cam.ac.uk/bitstream/1810/243852/8/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/6/RevisedThesis.pdf.txt ; https://www.repository.cam.ac.uk/bitstream/1810/243852/9/RevisedThesis.pdf.jpg


University of Maryland

21. Haas, Alfred M. Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics.

Degree: Electrical Engineering, 2009, University of Maryland

 Stroke, spinal cord injury and neurodegenerative diseases such as ALS and Parkinson's debilitate their victims by suffocating, cleaving communication between, and/or poisoning entire populations of… (more)

Subjects/Keywords: Engineering, Electronics and Electrical; Engineering, Biomedical; analog VLSI; biosensing; Hebbian learning; neural prosthetics; neural recording; spike sorting

APA (6th Edition):

Haas, A. M. (2009). Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics. (Thesis). University of Maryland. Retrieved from http://hdl.handle.net/1903/9175

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Haas, Alfred M. “Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics.” 2009. Thesis, University of Maryland. Accessed October 21, 2020. http://hdl.handle.net/1903/9175.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Haas, Alfred M. “Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics.” 2009. Web. 21 Oct 2020.

Vancouver:

Haas AM. Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics. [Internet] [Thesis]. University of Maryland; 2009. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/1903/9175.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Haas AM. Analog VLSI Circuits for Biosensors, Neural Signal Processing and Prosthetics. [Thesis]. University of Maryland; 2009. Available from: http://hdl.handle.net/1903/9175

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rutgers University

22. Wayment, Adam. Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology.

Degree: PhD, Phonetics, 2009, Rutgers University

 This dissertation explores similarity effects in assimilation, proposing an Attraction Framework to analyze cases of parasitic harmony where a trigger-target pair only results in harmony… (more)

Subjects/Keywords: Phonology; Computation; Formal analysis; Learnability; Phonology; Attraction; Assimilation; parasitic harmony; distance; similarity; locality; Representational Entailment; Optimality Theory; Harmonic Grammar; Hebbian learning; Harmony maximization; Artificial Neural Networks; Tensor product representations; Cognitive Science

APA (6th Edition):

Wayment, A. (2009). Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology. (Doctoral Dissertation). Rutgers University. Retrieved from http://hdl.rutgers.edu/1782.1/rucore00000002165.ETD.000064825

Chicago Manual of Style (16th Edition):

Wayment, Adam. “Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology.” 2009. Doctoral Dissertation, Rutgers University. Accessed October 21, 2020. http://hdl.rutgers.edu/1782.1/rucore00000002165.ETD.000064825.

MLA Handbook (7th Edition):

Wayment, Adam. “Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology.” 2009. Web. 21 Oct 2020.

Vancouver:

Wayment A. Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology. [Internet] [Doctoral dissertation]. Rutgers University; 2009. [cited 2020 Oct 21]. Available from: http://hdl.rutgers.edu/1782.1/rucore00000002165.ETD.000064825.

Council of Science Editors:

Wayment A. Assimilation as Attraction: Computing Distance, Similarity, and Locality in Phonology. [Doctoral Dissertation]. Rutgers University; 2009. Available from: http://hdl.rutgers.edu/1782.1/rucore00000002165.ETD.000064825

23. Barreto, Guilherme de Alencar. Redes neurais não-supervisionadas para processamento de sequências temporais.

Degree: Mestrado, Engenharia Elétrica, 1998, University of São Paulo

In many application domains, the time variable is an essential dimension. This is the case in robotics, in which robot trajectories can be… (more)

Subjects/Keywords: Aprendizagem competitiva; Aprendizagem hebbiana temporal; Competitive learning; Context; Contexto; Exclusion mechanism; Fault tolerance; Mecanismo de exclusão; Redes não-supervisionadas; Redundância; Redundancy; Reprodução de trajetórias; Seqüências temporais; Temporal Hebbian learning; Temporal sequences; Tolerância a falhas; Trajectories reproduction; Unsupervised neural networks

APA (6th Edition):

Barreto, G. d. A. (1998). Redes neurais não-supervisionadas para processamento de sequências temporais. (Masters Thesis). University of São Paulo. Retrieved from http://www.teses.usp.br/teses/disponiveis/18/18133/tde-25112015-111953/ ;

Chicago Manual of Style (16th Edition):

Barreto, Guilherme de Alencar. “Redes neurais não-supervisionadas para processamento de sequências temporais.” 1998. Masters Thesis, University of São Paulo. Accessed October 21, 2020. http://www.teses.usp.br/teses/disponiveis/18/18133/tde-25112015-111953/ ;.

MLA Handbook (7th Edition):

Barreto, Guilherme de Alencar. “Redes neurais não-supervisionadas para processamento de sequências temporais.” 1998. Web. 21 Oct 2020.

Vancouver:

Barreto GdA. Redes neurais não-supervisionadas para processamento de sequências temporais. [Internet] [Masters thesis]. University of São Paulo; 1998. [cited 2020 Oct 21]. Available from: http://www.teses.usp.br/teses/disponiveis/18/18133/tde-25112015-111953/ ;.

Council of Science Editors:

Barreto GdA. Redes neurais não-supervisionadas para processamento de sequências temporais. [Masters Thesis]. University of São Paulo; 1998. Available from: http://www.teses.usp.br/teses/disponiveis/18/18133/tde-25112015-111953/ ;

24. Janković Marko. Unsupervised neural networks for principal component analysis.

Degree: PhD, Electrical Engineering, 2006, University of Belgrade

This thesis is devoted to simple, biologically plausible algorithms for the extraction of the principal/minor components/subspaces from the input signal covariance matrix, as well as discovery… (more)
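
A minimal example of the kind of biologically plausible Hebbian principal-component extraction the abstract refers to is Oja's rule, sketched below on toy data; the thesis's own time-oriented hierarchical method and minor-component variants are not reproduced here.

    # Oja's rule: Hebbian extraction of the leading principal component.
    import numpy as np

    rng = np.random.default_rng(6)
    n_in, eta = 4, 0.01
    w = rng.normal(scale=0.1, size=n_in)

    X = rng.normal(size=(5000, n_in)) * np.array([3.0, 1.0, 0.5, 0.5])

    for x in X:
        y = w @ x
        w += eta * y * (x - y * w)     # Hebbian term with a self-normalizing decay

    print(np.round(w, 2))              # approaches the leading principal direction (up to sign)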

Subjects/Keywords: unsupervised neural networks; principal component analysis; principal subspace analysis; time oriented heirarchical method; minor component analysis; Hebbian learning rule; biologically inspired learning algorithms; sаmооrgаnizuјućе nеurаlnе mrеžе; аnаlizа glаvnih kоmpоnеnаtа; аnаlizа glаvnоg pоtprоstоrа; vrеmеnski оriјеntisаn hiјеrаrhiјski mеtоd; аnаlizа spоrеdnih kоmpоnеnаtа; Hеbоv zаkоn učеnjа; biоlоški inspirisаni аlgоritmi zа učеnjе

APA (6th Edition):

Marko, J. (2006). Unsupervised neural networks for principal component analysis. (Doctoral Dissertation). University of Belgrade. Retrieved from http://dx.doi.org/10.2298/BG20060323JANKOVIC ; http://eteze.bg.ac.rs/application/showtheses?thesesId=868 ; https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get ; http://vbs.rs/scripts/cobiss?command=SEARCH&base=99999&select=ID=512065440

Chicago Manual of Style (16th Edition):

Marko, Janković. “Unsupervised neural networks for principal component analysis.” 2006. Doctoral Dissertation, University of Belgrade. Accessed October 21, 2020. http://dx.doi.org/10.2298/BG20060323JANKOVIC ; http://eteze.bg.ac.rs/application/showtheses?thesesId=868 ; https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get ; http://vbs.rs/scripts/cobiss?command=SEARCH&base=99999&select=ID=512065440.

MLA Handbook (7th Edition):

Marko, Janković. “Unsupervised neural networks for principal component analysis.” 2006. Web. 21 Oct 2020.

Vancouver:

Marko J. Unsupervised neural networks for principal component analysis. [Internet] [Doctoral dissertation]. University of Belgrade; 2006. [cited 2020 Oct 21]. Available from: http://dx.doi.org/10.2298/BG20060323JANKOVIC ; http://eteze.bg.ac.rs/application/showtheses?thesesId=868 ; https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get ; http://vbs.rs/scripts/cobiss?command=SEARCH&base=99999&select=ID=512065440.

Council of Science Editors:

Janković M. Unsupervised neural networks for principal component analysis. [Doctoral Dissertation]. University of Belgrade; 2006. Available from: http://dx.doi.org/10.2298/BG20060323JANKOVIC ; http://eteze.bg.ac.rs/application/showtheses?thesesId=868 ; https://fedorabg.bg.ac.rs/fedora/get/o:7259/bdef:Content/get ; http://vbs.rs/scripts/cobiss?command=SEARCH&base=99999&select=ID=512065440

25. Papageorgiou, Elpiniki. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία [New learning methods for fuzzy cognitive maps and applications in medicine and industry].

Degree: 2004, University of Patras; Πανεπιστήμιο Πατρών

 The main contribution of this dissertation is the development of new learning and convergence methodologies for Fuzzy Cognitive Maps that are proposed for the improvement… (more)
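
As background for the unsupervised Hebbian-type learning this record refers to, here is a minimal sketch of a fuzzy cognitive map whose edge weights adapt with a generic nonlinear-Hebbian-style update. The update rule, the learning rate eta, and the tiny three-concept map are illustrative assumptions, not the specific methodologies proposed in the dissertation.

import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(A, W, lam=1.0):
    # One inference step of a fuzzy cognitive map: every concept aggregates the
    # weighted influence of the others (W[j, i] = influence of concept j on i)
    # and squashes the result back into [0, 1].
    return sigmoid(A + W.T @ A, lam)

def hebbian_weight_update(A, W, eta=0.05):
    # Generic nonlinear-Hebbian-style adaptation of the nonzero weights:
    # co-active source/target concepts strengthen their link, while the
    # pre^2 * w decay term keeps the weights bounded. Only existing edges adapt.
    pre, post = A[:, None], A[None, :]
    dW = eta * pre * (post - pre * W)
    return np.where(W != 0.0, np.clip(W + dW, -1.0, 1.0), 0.0)

# Tiny illustrative map: concept 0 drives concept 1, concept 1 inhibits concept 2.
W = np.array([[0.0,  0.4,  0.0],
              [0.0,  0.0, -0.3],
              [0.0,  0.0,  0.0]])
A = np.array([0.6, 0.3, 0.5])
for _ in range(20):
    A = fcm_step(A, W)
    W = hebbian_weight_update(A, W)
print(np.round(A, 3), np.round(W, 3))

In practice, learning for fuzzy cognitive maps also involves expert-defined initial weights and explicit convergence criteria, which this sketch omits.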

Subjects/Keywords: Εύκαμπτες υπολογιστικές μέθοδοι; Ασαφή γνωστικά δίκτυα; Αλγόριθμοι μη επιβλεπόμενης εκπαίδευσης τύπου-Hebb; Αλγόριθμοι βελτιστοποίησης; Νευρωνικά δίκτυα; Βελτιστοποίηση με σμήνη σωματιδίων; Διαφοροεξελικτικός αλγόριθμος; Ασαφής λογική; Αλγόριθμοι εκμάθησης; Soft computing methods; Fuzzy cognitive maps; Learning algorithms; Fuzzy logic; Unsupervised Hebbian learning algorithms; Optimization algorithms; Neural networks; Particle swarm optimization; Differential evolution algorithm; Tumor grading; Decision support

APA (6th Edition):

Papageorgiou, E. (2004). Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. (Thesis). University of Patras; Πανεπιστήμιο Πατρών. Retrieved from http://hdl.handle.net/10442/hedi/31208

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Papageorgiou, Elpiniki. “Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία.” 2004. Thesis, University of Patras; Πανεπιστήμιο Πατρών. Accessed October 21, 2020. http://hdl.handle.net/10442/hedi/31208.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Papageorgiou, Elpiniki. “Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία.” 2004. Web. 21 Oct 2020.

Vancouver:

Papageorgiou E. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. [Internet] [Thesis]. University of Patras; Πανεπιστήμιο Πατρών; 2004. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/10442/hedi/31208.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Papageorgiou E. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. [Thesis]. University of Patras; Πανεπιστήμιο Πατρών; 2004. Available from: http://hdl.handle.net/10442/hedi/31208

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

26. Rosen, Gail L. Signal processing for biologically-inspired gradient source localization and DNA sequence analysis.

Degree: PhD, Electrical and Computer Engineering, 2006, Georgia Tech

 Biological signal processing can help us gain knowledge about biological complexity, as well as use this knowledge to engineer better systems. Three areas are identified… (more)
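
As a rough illustration of the biased-random-walk chemotaxis idea listed among this record's keywords, the sketch below implements a simple run-and-tumble walker that climbs a concentration field. The tumble probabilities, step length, and Gaussian plume are assumptions chosen for illustration, not the models analyzed in the dissertation.

import numpy as np

def biased_random_walk(concentration, start, steps=2000, step_len=0.05, seed=0):
    # Run-and-tumble chemotaxis: keep the current heading while the sampled
    # concentration is rising, tumble to a random heading more often when it falls.
    rng = np.random.default_rng(seed)
    pos = np.array(start, dtype=float)
    heading = rng.uniform(0.0, 2.0 * np.pi)
    prev = concentration(pos)
    for _ in range(steps):
        pos += step_len * np.array([np.cos(heading), np.sin(heading)])
        c = concentration(pos)
        p_tumble = 0.1 if c > prev else 0.6   # the bias: tumble less while climbing
        if rng.random() < p_tumble:
            heading = rng.uniform(0.0, 2.0 * np.pi)
        prev = c
    return pos

# Gaussian plume centred at the origin; the walker should drift towards it.
plume = lambda p: np.exp(-np.dot(p, p))
final = biased_random_walk(plume, start=[2.0, 2.0])
print(final, np.linalg.norm(final))   # typically far closer to (0, 0) than the start

E. coli chemotaxis follows essentially this strategy: runs are extended when the attractant concentration increases along the path, producing a net drift up the gradient.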

Subjects/Keywords: DNA analysis; Fick's second law; Hebbian learning; Biased random walk; Sensor cross-correlation; Delay-and-Sum beamforming; Turbulent plumes; Electronic nose; Tandem repeats; Gradient sensing; Bacterial chemotaxis navigation; Chemotaxis; Sensor networks; Signal processing; Biologically-inspired computing; Chemotaxis; Nervous system degeneration; Nucleotide sequence

APA (6th Edition):

Rosen, G. L. (2006). Signal processing for biologically-inspired gradient source localization and DNA sequence analysis. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/11628

Chicago Manual of Style (16th Edition):

Rosen, Gail L. “Signal processing for biologically-inspired gradient source localization and DNA sequence analysis.” 2006. Doctoral Dissertation, Georgia Tech. Accessed October 21, 2020. http://hdl.handle.net/1853/11628.

MLA Handbook (7th Edition):

Rosen, Gail L. “Signal processing for biologically-inspired gradient source localization and DNA sequence analysis.” 2006. Web. 21 Oct 2020.

Vancouver:

Rosen GL. Signal processing for biologically-inspired gradient source localization and DNA sequence analysis. [Internet] [Doctoral dissertation]. Georgia Tech; 2006. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/1853/11628.

Council of Science Editors:

Rosen GL. Signal processing for biologically-inspired gradient source localization and DNA sequence analysis. [Doctoral Dissertation]. Georgia Tech; 2006. Available from: http://hdl.handle.net/1853/11628

27. Παπαγεωργίου, Ελπινίκη. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία [New learning methods for fuzzy cognitive maps and applications in medicine and industry].

Degree: 2004, University of Patras

 The subject of this dissertation is the development of new learning and convergence methodologies for Fuzzy Cognitive Maps, which are proposed for improving and adapting their behaviour,… (more)

Subjects/Keywords: Εύκαμπτες υπολογιστικές μέθοδοι; Ασαφή γνωστικά δίκτυα; Αλγόριθμοι μη επιβλεπόμενης εκπαίδευσης τύπου-Hebb; Αλγόριθμοι βελτιστοποίησης; Νευρωνικά δίκτυα; Βελτιστοποίηση με σμήνη σωματιδίων; Διαφοροεξελικτικός αλγόριθμος; Ασαφής λογική; Αλγόριθμοι εκμάθησης; 006.3; Soft computing methods; Fuzzy cognitive maps; Learning algorithms; Fuzzy logic; Unsupervised Hebbian learning algorithms; Optimization algorithms; Neural networks; Particle swarm optimization; Differential evolution algorithm; Tumor grading; Decision Support

APA (6th Edition):

Παπαγεωργίου, Ε. (2004). Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. (Doctoral Dissertation). University of Patras. Retrieved from http://nemertes.lis.upatras.gr/jspui/handle/10889/322

Chicago Manual of Style (16th Edition):

Παπαγεωργίου, Ελπινίκη. “Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία.” 2004. Doctoral Dissertation, University of Patras. Accessed October 21, 2020. http://nemertes.lis.upatras.gr/jspui/handle/10889/322.

MLA Handbook (7th Edition):

Παπαγεωργίου, Ελπινίκη. “Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία.” 2004. Web. 21 Oct 2020.

Vancouver:

Παπαγεωργίου Ε. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. [Internet] [Doctoral dissertation]. University of Patras; 2004. [cited 2020 Oct 21]. Available from: http://nemertes.lis.upatras.gr/jspui/handle/10889/322.

Council of Science Editors:

Παπαγεωργίου Ε. Νέες μέθοδοι εκμάθησης για ασαφή γνωστικά δίκτυα και εφαρμογές στην ιατρική και βιομηχανία. [Doctoral Dissertation]. University of Patras; 2004. Available from: http://nemertes.lis.upatras.gr/jspui/handle/10889/322

28. Siddiqui, Sana. Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats.

Degree: Electrical and Computer Engineering, 2016, University of Manitoba

 The application of machine intelligence is severely challenged in the domain of cyber security due to the surreptitious nature of advanced cyber threats, which are persistent… (more)

Subjects/Keywords: Artificial Neural Network; Classification; Multiscale; Cognitive Intelligence; Dimensionality; Wavelets; Machine Intelligence; Fractals; Multifractals; Hebbian Learning; Instance Based Learners; Complexity Analysis; Packet Captures; Network Threats; Malware Detection; Machine Learning; Computational Intelligence; Cognitive Computing; Cognitive Informatics; Cyber Kill Chain; Cyber Threat; Cyber Security; Obfuscated Cyber Threats; Advanced Indistinguishable Threats

…i.e. k-NN, gradient descent based neural network and Hebbian learning algorithm in the… …highlighted particularly the modification of classical single scale Hebbian learning rule to… …internet packet captured through Libpcap library. A distributed version of the Hebbian learning… …methodologies. In particular, it describes the strength and challenges of existing machine learning… …49 1. Instance based learning algorithm – traditional and proposed fractal based cognitive… 

APA (6th Edition):

Siddiqui, S. (2016). Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats. (Masters Thesis). University of Manitoba. Retrieved from http://hdl.handle.net/1993/32282

Chicago Manual of Style (16th Edition):

Siddiqui, Sana. “Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats.” 2016. Masters Thesis, University of Manitoba. Accessed October 21, 2020. http://hdl.handle.net/1993/32282.

MLA Handbook (7th Edition):

Siddiqui, Sana. “Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats.” 2016. Web. 21 Oct 2020.

Vancouver:

Siddiqui S. Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats. [Internet] [Masters thesis]. University of Manitoba; 2016. [cited 2020 Oct 21]. Available from: http://hdl.handle.net/1993/32282.

Council of Science Editors:

Siddiqui S. Cognitive artificial intelligence – a complexity based machine learning approach for advanced cyber threats. [Masters Thesis]. University of Manitoba; 2016. Available from: http://hdl.handle.net/1993/32282
