You searched for subject:(Negative correlation learning). Showing records 1 – 2 of 2 total matches.


No search limiters apply to these results.



Kaunas University of Technology

1. Cibulskis, Vladas. Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas [Development of algorithms for forming artificial neural network committees].

Degree: Master, Informatics, 2005, Kaunas University of Technology

Previous work on classification committees has shown that an efficient committee should consist of networks that are not only very accurate but also diverse. In this work, aiming to explore the trade-off between the diversity and accuracy of committee networks, the steps of neural network training, aggregation of the networks into a committee, and elimination of irrelevant input variables are integrated. To accomplish the elimination, an additional term is added to the negative correlation learning error function, forcing the input weights connected to irrelevant input variables to decay. Advisors/Committee Members: Maciulevičius, Stasys (Master’s degree session secretary), Barauskas, Rimantas (Master’s degree committee member), Lipnickas, Arūnas (Master’s thesis reviewer), Telksnys, Laimutis (Master’s degree committee chair), Plėštys, Rimantas (Master’s degree committee member), Gelžinis, Adas (Master’s thesis supervisor), Pranevičius, Henrikas (Master’s degree committee member), Mockus, Jonas (Master’s degree committee member), Verikas, Antanas (Master’s thesis supervisor), Jasinevičius, Raimundas (Master’s degree committee member).
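The abstract describes augmenting the standard negative correlation learning (NCL) error with a decay term on input weights so that irrelevant input variables can be pruned. The snippet below is only a minimal Python sketch of that idea under stated assumptions: the NCL penalty follows the usual formulation p_i = (F_i - F̄) Σ_{j≠i}(F_j - F̄), and the extra decay term is illustrated as plain weight decay on the input layer; the function names, the weighting factors lam and gamma, and the exact form of the decay term in the thesis are assumptions, not the author's implementation.

import numpy as np

def ncl_penalty(f_i, f_all):
    # Standard negative-correlation penalty for ensemble member i:
    # p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar), f_bar = ensemble mean.
    f_bar = f_all.mean(axis=0)
    sum_others = f_all.sum(axis=0) - f_i - (f_all.shape[0] - 1) * f_bar
    return (f_i - f_bar) * sum_others

def member_loss(f_i, f_all, target, w_input, lam=0.5, gamma=1e-3):
    # Squared error of the individual network on the common target.
    mse = 0.5 * np.mean((f_i - target) ** 2)
    # Correlation term encourages diversity among committee members.
    corr = lam * np.mean(ncl_penalty(f_i, f_all))
    # Assumed stand-in for the thesis's extra term: ordinary weight decay on the
    # input-layer weights, so weights on inputs that do not help reduce the error
    # shrink toward zero and the corresponding variables can be pruned.
    decay = gamma * np.sum(w_input ** 2)
    return mse + corr + decay

# Illustrative use: f_all stacks the committee members' outputs on a batch,
# f_i is one member's output, w_input that member's input-layer weight matrix.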

Subjects/Keywords: Požymių atrinkimas; Neural network committees; Neigiamos koreliacijos mokymas; Negative correlation learning; Neuroninių tinklų kolektyvai; Neuroniniai tinklai; Feature selection; Neural network



APA (6th Edition):

Cibulskis, Vladas. (2005). Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas. (Masters Thesis). Kaunas University of Technology. Retrieved from http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2005~D_20050526_062729-44266

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Chicago Manual of Style (16th Edition):

Cibulskis, Vladas. “Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas.” 2005. Masters Thesis, Kaunas University of Technology. Accessed October 17, 2019. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2005~D_20050526_062729-44266.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

MLA Handbook (7th Edition):

Cibulskis, Vladas. “Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas.” 2005. Web. 17 Oct 2019.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Vancouver:

Cibulskis, Vladas. Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas. [Internet] [Masters thesis]. Kaunas University of Technology; 2005. [cited 2019 Oct 17]. Available from: http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2005~D_20050526_062729-44266.

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete

Council of Science Editors:

Cibulskis, Vladas. Dirbtinių neuroninių tinklų kolektyvų formavimo algoritmų kūrimas. [Masters Thesis]. Kaunas University of Technology; 2005. Available from: http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2005~D_20050526_062729-44266

Note: this citation may be lacking information needed for this citation format:
Author name may be incomplete


University of New South Wales

2. Dam, Hai Huong. A scalable evolutionary learning classifier system for knowledge discovery in stream data mining.

Degree: Information Technology & Electrical Engineering, 2008, University of New South Wales

Data mining (DM) is the process of finding patterns and relationships in databases. The breakthrough in computer technologies triggered a massive growth in data collected and maintained by organisations. In many applications, these data arrive continuously in large volumes as a sequence of instances known as a data stream. Mining these data is known as stream data mining. Due to the large amount of data arriving in a data stream, each record is normally expected to be processed only once. Moreover, this process can be carried out on different sites in the organisation simultaneously, making the problem distributed in nature. Distributed stream data mining poses many challenges to the data mining community, including scalability and coping with changes in the underlying concept over time. In this thesis, the author hypothesizes that learning classifier systems (LCSs) - a class of classification algorithms - have the potential to work efficiently in distributed stream data mining. LCSs are incremental learners and, being evolutionary based, they are inherently adaptive. However, they suffer from two main drawbacks that hinder their use as fast data mining algorithms. First, they require a large population size, which slows down the processing of arriving instances. Second, they require a large number of parameter settings, some of which are very sensitive to the nature of the learning problem. As a result, it becomes difficult to choose the right setup for totally unknown problems. The aim of this thesis is to attack these two problems in LCS, with a specific focus on UCS - a supervised evolutionary learning classifier system. UCS is chosen as it has been tested extensively on classification tasks and it is the supervised version of XCS, a state-of-the-art LCS. In this thesis, the architectural design for a distributed stream data mining system will first be introduced. The problems that UCS should face in a distributed data stream task are confirmed through a large number of experiments with UCS and the proposed architectural design. To overcome the problem of large population sizes, the idea of using a neural network to represent the action in UCS is proposed. This new system - called NLCS - was validated experimentally using a small fixed population size and has shown a large reduction in the population size needed to learn the underlying concept in the data. An adaptive version of NLCS called ANCS is then introduced. The adaptive version dynamically controls the population size of NLCS. A comprehensive analysis of the behaviour of ANCS revealed interesting patterns in the behaviour of the parameters, which motivated an ensemble version of the algorithm with 9 nodes, each using a different parameter setting. In total they cover all patterns of behaviour noticed in the system. A voting gate is used for the ensemble. The resultant ensemble does not require any parameter setting, and showed better performance on all datasets tested. The thesis concludes with testing the ANCS system in the architectural design for distributed environments proposed earlier. The… Advisors/Committee Members: Abbass, Hussein, Information Technology & Electrical Engineering, Australian Defence Force Academy, UNSW, Lokan, Chris, Information Technology & Electrical Engineering, Australian Defence Force Academy, UNSW.
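The summary mentions that the nine ANCS nodes, each run with a different parameter setting, are combined through a voting gate. As a rough illustration only, the Python sketch below implements a simple majority-vote gate under that assumption; the node objects, their predict method, and the name voting_gate are hypothetical stand-ins, not the thesis's actual interface.

import numpy as np
from collections import Counter

def voting_gate(member_predictions):
    # member_predictions: list of equal-length label arrays, one per ensemble node.
    stacked = np.stack(member_predictions)   # shape: (n_members, n_instances)
    # For each instance, pick the label predicted by the most members.
    voted = [Counter(column).most_common(1)[0][0] for column in stacked.T]
    return np.array(voted)

# Hypothetical usage with nine differently parameterised nodes:
# predictions = [node.predict(batch) for node in ancs_nodes]   # len(predictions) == 9
# labels = voting_gate(predictions)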

Subjects/Keywords: Data mining; Action map; Classification; Data stream; Neural network; Noisy data; Non-stationary environment; Reinforcement learning; Rule-based system; Static environment; Stream data mining; Supervised learning; Distributed data mining; Dynamic environment; Ensemble learning; Evolutionary computation; Genetic algorithm; Knowledge discovery; Learning classifier system; Negative correlation learning



APA (6th Edition):

Dam, H. H. (2008). A scalable evolutionary learning classifier system for knowledge discovery in stream data mining. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/38865 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:3421/SOURCE1?view=true

Chicago Manual of Style (16th Edition):

Dam, Hai Huong. “A scalable evolutionary learning classifier system for knowledge discovery in stream data mining.” 2008. Doctoral Dissertation, University of New South Wales. Accessed October 17, 2019. http://handle.unsw.edu.au/1959.4/38865 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:3421/SOURCE1?view=true.

MLA Handbook (7th Edition):

Dam, Hai Huong. “A scalable evolutionary learning classifier system for knowledge discovery in stream data mining.” 2008. Web. 17 Oct 2019.

Vancouver:

Dam HH. A scalable evolutionary learning classifier system for knowledge discovery in stream data mining. [Internet] [Doctoral dissertation]. University of New South Wales; 2008. [cited 2019 Oct 17]. Available from: http://handle.unsw.edu.au/1959.4/38865 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:3421/SOURCE1?view=true.

Council of Science Editors:

Dam HH. A scalable evolutionary learning classifier system for knowledge discovery in stream data mining. [Doctoral Dissertation]. University of New South Wales; 2008. Available from: http://handle.unsw.edu.au/1959.4/38865 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:3421/SOURCE1?view=true
