
You searched for subject:(Ensemble Selection). Showing records 1 – 30 of 40 total matches.

UCLA

1. Chang, Kung-Hua. Complementarity In Data Mining.

Degree: Computer Science, 2015, UCLA

 A learning problem involving classifiers and features usually has three components: representation, evaluation, and optimization. Contemporary research represents classifiers and features as initially given, and… (more)

Subjects/Keywords: Computer science; Ensemble Selection; Feature Selection

APA (6th Edition):

Chang, K. (2015). Complementarity In Data Mining. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/8zn4s7mj

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chang, Kung-Hua. “Complementarity In Data Mining.” 2015. Thesis, UCLA. Accessed June 25, 2019. http://www.escholarship.org/uc/item/8zn4s7mj.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chang, Kung-Hua. “Complementarity In Data Mining.” 2015. Web. 25 Jun 2019.

Vancouver:

Chang K. Complementarity In Data Mining. [Internet] [Thesis]. UCLA; 2015. [cited 2019 Jun 25]. Available from: http://www.escholarship.org/uc/item/8zn4s7mj.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chang K. Complementarity In Data Mining. [Thesis]. UCLA; 2015. Available from: http://www.escholarship.org/uc/item/8zn4s7mj

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Connecticut

2. Yankee, Tara N. Rank Aggregation of Feature Scoring Methods for Unsupervised Learning.

Degree: M. Eng., Biomedical Engineering, 2017, University of Connecticut

  The ability to collect and store large amounts of data is transforming data-driven discovery; recent technological advances in biology allow systematic data production and… (more)

Subjects/Keywords: clustering; ensemble learning; feature selection; unsupervised learning

APA (6th Edition):

Yankee, T. N. (2017). Rank Aggregation of Feature Scoring Methods for Unsupervised Learning. (Masters Thesis). University of Connecticut. Retrieved from https://opencommons.uconn.edu/gs_theses/1123

Chicago Manual of Style (16th Edition):

Yankee, Tara N. “Rank Aggregation of Feature Scoring Methods for Unsupervised Learning.” 2017. Masters Thesis, University of Connecticut. Accessed June 25, 2019. https://opencommons.uconn.edu/gs_theses/1123.

MLA Handbook (7th Edition):

Yankee, Tara N. “Rank Aggregation of Feature Scoring Methods for Unsupervised Learning.” 2017. Web. 25 Jun 2019.

Vancouver:

Yankee TN. Rank Aggregation of Feature Scoring Methods for Unsupervised Learning. [Internet] [Masters thesis]. University of Connecticut; 2017. [cited 2019 Jun 25]. Available from: https://opencommons.uconn.edu/gs_theses/1123.

Council of Science Editors:

Yankee TN. Rank Aggregation of Feature Scoring Methods for Unsupervised Learning. [Masters Thesis]. University of Connecticut; 2017. Available from: https://opencommons.uconn.edu/gs_theses/1123

3. Oliveira e Cruz, Rafael Menelau. Methods for dynamic selection and fusion of ensemble of classifiers .

Degree: 2011, Universidade Federal de Pernambuco

Ensemble of Classifiers (EoC) is a new alternative for achieving high recognition rates in pattern recognition systems. The use of ensembles is… (more)

Subjects/Keywords: Handwritten Recognition; Feature Extraction; Ensemble of Classifier; Dynamic Ensemble Selection; Regions of Competence; Neural Networks

APA (6th Edition):

Oliveira e Cruz, R. M. (2011). Methods for dynamic selection and fusion of ensemble of classifiers . (Thesis). Universidade Federal de Pernambuco. Retrieved from http://repositorio.ufpe.br/handle/123456789/2436

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Oliveira e Cruz, Rafael Menelau. “Methods for dynamic selection and fusion of ensemble of classifiers .” 2011. Thesis, Universidade Federal de Pernambuco. Accessed June 25, 2019. http://repositorio.ufpe.br/handle/123456789/2436.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Oliveira e Cruz, Rafael Menelau. “Methods for dynamic selection and fusion of ensemble of classifiers .” 2011. Web. 25 Jun 2019.

Vancouver:

Oliveira e Cruz RM. Methods for dynamic selection and fusion of ensemble of classifiers . [Internet] [Thesis]. Universidade Federal de Pernambuco; 2011. [cited 2019 Jun 25]. Available from: http://repositorio.ufpe.br/handle/123456789/2436.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Oliveira e Cruz RM. Methods for dynamic selection and fusion of ensemble of classifiers . [Thesis]. Universidade Federal de Pernambuco; 2011. Available from: http://repositorio.ufpe.br/handle/123456789/2436

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

4. Narassiguin, Anil. Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection.

Degree: Docteur es, Informatique, 2018, Lyon

Ensemble methods have been a very popular research topic over the past decade. Their success stems largely from their solutions… (more)

Subjects/Keywords: Apprentissage ensembliste; Sélection dynamique; Multi-label; Ensemble learning; Dynamic ensemble selection; Multi-label; 004

APA (6th Edition):

Narassiguin, A. (2018). Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection. (Doctoral Dissertation). Lyon. Retrieved from http://www.theses.fr/2018LYSE1075

Chicago Manual of Style (16th Edition):

Narassiguin, Anil. “Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection.” 2018. Doctoral Dissertation, Lyon. Accessed June 25, 2019. http://www.theses.fr/2018LYSE1075.

MLA Handbook (7th Edition):

Narassiguin, Anil. “Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection.” 2018. Web. 25 Jun 2019.

Vancouver:

Narassiguin A. Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection. [Internet] [Doctoral dissertation]. Lyon; 2018. [cited 2019 Jun 25]. Available from: http://www.theses.fr/2018LYSE1075.

Council of Science Editors:

Narassiguin A. Apprentissage Ensembliste, Étude comparative et Améliorations via Sélection Dynamique : Ensemble Learning, Comparative Analysis and Further Improvements with Dynamic Ensemble Selection. [Doctoral Dissertation]. Lyon; 2018. Available from: http://www.theses.fr/2018LYSE1075


Brunel University

5. Al-Enezi, Jamal. Artificial immune systems based committee machine for classification application.

Degree: 2012, Brunel University

 A new adaptive learning Artificial Immune System (AIS) based committee machine is developed in this thesis. The proposed approach efficiently tackles the general problem… (more)

Subjects/Keywords: 006.3; Ensemble model; Clonal selection; Negative selection; Artificial immune networks; Neuro-fuzzy detector

APA (6th Edition):

Al-Enezi, J. (2012). Artificial immune systems based committee machine for classification application. (Doctoral Dissertation). Brunel University. Retrieved from http://bura.brunel.ac.uk/handle/2438/6826 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557868

Chicago Manual of Style (16th Edition):

Al-Enezi, Jamal. “Artificial immune systems based committee machine for classification application.” 2012. Doctoral Dissertation, Brunel University. Accessed June 25, 2019. http://bura.brunel.ac.uk/handle/2438/6826 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557868.

MLA Handbook (7th Edition):

Al-Enezi, Jamal. “Artificial immune systems based committee machine for classification application.” 2012. Web. 25 Jun 2019.

Vancouver:

Al-Enezi J. Artificial immune systems based committee machine for classification application. [Internet] [Doctoral dissertation]. Brunel University; 2012. [cited 2019 Jun 25]. Available from: http://bura.brunel.ac.uk/handle/2438/6826 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557868.

Council of Science Editors:

Al-Enezi J. Artificial immune systems based committee machine for classification application. [Doctoral Dissertation]. Brunel University; 2012. Available from: http://bura.brunel.ac.uk/handle/2438/6826 ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.557868


Duke University

6. Hughes, Roy Gene. Using Helix-coil Models to Study Protein Unfolded States .

Degree: 2016, Duke University

 An abstract of a thesis devoted to using helix-coil models to study unfolded states. Research on polypeptide unfolded states has received much more attention… (more)

Subjects/Keywords: Biophysics; Statistics; Biochemistry; amide; coil; ensemble; exchange; helix; model selection

APA (6th Edition):

Hughes, R. G. (2016). Using Helix-coil Models to Study Protein Unfolded States . (Thesis). Duke University. Retrieved from http://hdl.handle.net/10161/12279

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hughes, Roy Gene. “Using Helix-coil Models to Study Protein Unfolded States .” 2016. Thesis, Duke University. Accessed June 25, 2019. http://hdl.handle.net/10161/12279.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hughes, Roy Gene. “Using Helix-coil Models to Study Protein Unfolded States .” 2016. Web. 25 Jun 2019.

Vancouver:

Hughes RG. Using Helix-coil Models to Study Protein Unfolded States . [Internet] [Thesis]. Duke University; 2016. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10161/12279.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hughes RG. Using Helix-coil Models to Study Protein Unfolded States . [Thesis]. Duke University; 2016. Available from: http://hdl.handle.net/10161/12279

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Cambridge

7. Wang, Fan. Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning.

Degree: PhD, 2019, University of Cambridge

 In a wide range of applications, datasets are generated for which the number of variables p exceeds the sample size n. Penalised likelihood methods are… (more)

Subjects/Keywords: Penalised regression; Lasso; Ensemble learning; Variable selection; High-dimensional data

APA (6th Edition):

Wang, F. (2019). Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning. (Doctoral Dissertation). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/289419 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.767927

Chicago Manual of Style (16th Edition):

Wang, Fan. “Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning.” 2019. Doctoral Dissertation, University of Cambridge. Accessed June 25, 2019. https://www.repository.cam.ac.uk/handle/1810/289419 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.767927.

MLA Handbook (7th Edition):

Wang, Fan. “Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning.” 2019. Web. 25 Jun 2019.

Vancouver:

Wang F. Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning. [Internet] [Doctoral dissertation]. University of Cambridge; 2019. [cited 2019 Jun 25]. Available from: https://www.repository.cam.ac.uk/handle/1810/289419 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.767927.

Council of Science Editors:

Wang F. Penalised regression for high-dimensional data : an empirical investigation and improvements via ensemble learning. [Doctoral Dissertation]. University of Cambridge; 2019. Available from: https://www.repository.cam.ac.uk/handle/1810/289419 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.767927


Virginia Tech

8. Ngo, Khai Thoi. Stacking Ensemble for auto_ml.

Degree: MS, Electrical and Computer Engineering, 2018, Virginia Tech

 Machine learning has been a subject undergoing intense study across many different industries and academic research areas. Companies and researchers have taken full advantage of… (more)

Subjects/Keywords: Machine Learning; Stacking Ensemble; Model Selection; Hyper-parameter optimization; auto_ml

APA (6th Edition):

Ngo, K. T. (2018). Stacking Ensemble for auto_ml. (Masters Thesis). Virginia Tech. Retrieved from http://hdl.handle.net/10919/83547

Chicago Manual of Style (16th Edition):

Ngo, Khai Thoi. “Stacking Ensemble for auto_ml.” 2018. Masters Thesis, Virginia Tech. Accessed June 25, 2019. http://hdl.handle.net/10919/83547.

MLA Handbook (7th Edition):

Ngo, Khai Thoi. “Stacking Ensemble for auto_ml.” 2018. Web. 25 Jun 2019.

Vancouver:

Ngo KT. Stacking Ensemble for auto_ml. [Internet] [Masters thesis]. Virginia Tech; 2018. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10919/83547.

Council of Science Editors:

Ngo KT. Stacking Ensemble for auto_ml. [Masters Thesis]. Virginia Tech; 2018. Available from: http://hdl.handle.net/10919/83547


University of Waterloo

9. Xin, Lu. Stochastic Stepwise Ensembles for Variable Selection.

Degree: 2009, University of Waterloo

 Ensemble methods such as AdaBoost, Bagging and Random Forest have attracted much attention in the statistical learning community in the last 15 years. Zhu and… (more)

Subjects/Keywords: Stochastic Stepwise; Ensemble; Parallel Genetic Algorithm; Variable Selection; statistical learning

APA (6th Edition):

Xin, L. (2009). Stochastic Stepwise Ensembles for Variable Selection. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/4369

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Xin, Lu. “Stochastic Stepwise Ensembles for Variable Selection.” 2009. Thesis, University of Waterloo. Accessed June 25, 2019. http://hdl.handle.net/10012/4369.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Xin, Lu. “Stochastic Stepwise Ensembles for Variable Selection.” 2009. Web. 25 Jun 2019.

Vancouver:

Xin L. Stochastic Stepwise Ensembles for Variable Selection. [Internet] [Thesis]. University of Waterloo; 2009. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10012/4369.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Xin L. Stochastic Stepwise Ensembles for Variable Selection. [Thesis]. University of Waterloo; 2009. Available from: http://hdl.handle.net/10012/4369

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


NSYSU

10. Lin, Chia-Hung. Study on MIMO Antenna Selection Based on Deep Learning.

Degree: Master, Communications Engineering, 2018, NSYSU

 MIMO technology can noticeably improve the spectral efficiency of a communication system, and antenna selection is usually employed when implementing MIMO on the… (more)

Subjects/Keywords: deep learning; antenna selection; MIMO; overfitting; ensemble learning

APA (6th Edition):

Lin, C. (2018). Study on MIMO Antenna Selection Based on Deep Learning. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0624118-111521

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Lin, Chia-Hung. “Study on MIMO Antenna Selection Based on Deep Learning.” 2018. Thesis, NSYSU. Accessed June 25, 2019. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0624118-111521.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Lin, Chia-Hung. “Study on MIMO Antenna Selection Based on Deep Learning.” 2018. Web. 25 Jun 2019.

Vancouver:

Lin C. Study on MIMO Antenna Selection Based on Deep Learning. [Internet] [Thesis]. NSYSU; 2018. [cited 2019 Jun 25]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0624118-111521.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Lin C. Study on MIMO Antenna Selection Based on Deep Learning. [Thesis]. NSYSU; 2018. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0624118-111521

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

11. Aldave, Roberto. Systematic ensemble learning and extensions for regression .

Degree: 2015, Université de Sherbrooke

 Abstract: The objective is to provide methods that improve the performance, or prediction accuracy, of the standard stacking approach, which is an ensemble method composed… (more)

Subjects/Keywords: Ensemble learning; Stacked generalization; Systematic cross-validation; Ensemble selection; Regression; Pareto non-dominated alternatives; Multi-criteria optimization

APA (6th Edition):

Aldave, R. (2015). Systematic ensemble learning and extensions for regression . (Doctoral Dissertation). Université de Sherbrooke. Retrieved from http://hdl.handle.net/11143/6958

Chicago Manual of Style (16th Edition):

Aldave, Roberto. “Systematic ensemble learning and extensions for regression .” 2015. Doctoral Dissertation, Université de Sherbrooke. Accessed June 25, 2019. http://hdl.handle.net/11143/6958.

MLA Handbook (7th Edition):

Aldave, Roberto. “Systematic ensemble learning and extensions for regression .” 2015. Web. 25 Jun 2019.

Vancouver:

Aldave R. Systematic ensemble learning and extensions for regression . [Internet] [Doctoral dissertation]. Université de Sherbrooke; 2015. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/11143/6958.

Council of Science Editors:

Aldave R. Systematic ensemble learning and extensions for regression . [Doctoral Dissertation]. Université de Sherbrooke; 2015. Available from: http://hdl.handle.net/11143/6958


Texas A&M University

12. De, Debkumar. Essays on Bayesian Time Series and Variable Selection.

Degree: 2014, Texas A&M University

 Estimating model parameters in dynamic models continues to be a challenge. In my dissertation, we have introduced a Stochastic Approximation-based parameter estimation approach under Ensemble… (more)

Subjects/Keywords: Ensemble Kalman Filter; Stochastic Approximation; Non-parametric Regression; Matrix variate regression; Variable selection

APA (6th Edition):

De, D. (2014). Essays on Bayesian Time Series and Variable Selection. (Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/152793

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

De, Debkumar. “Essays on Bayesian Time Series and Variable Selection.” 2014. Thesis, Texas A&M University. Accessed June 25, 2019. http://hdl.handle.net/1969.1/152793.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

De, Debkumar. “Essays on Bayesian Time Series and Variable Selection.” 2014. Web. 25 Jun 2019.

Vancouver:

De D. Essays on Bayesian Time Series and Variable Selection. [Internet] [Thesis]. Texas A&M University; 2014. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/1969.1/152793.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

De D. Essays on Bayesian Time Series and Variable Selection. [Thesis]. Texas A&M University; 2014. Available from: http://hdl.handle.net/1969.1/152793

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Waikato

13. Pradhananga, Nripendra. Effective Linear-Time Feature Selection .

Degree: 2007, University of Waikato

 The classification learning task requires selection of a subset of features to represent patterns to be classified. This is because the performance of the classifier… (more)

Subjects/Keywords: filter; wrapper; feature selection; attribute selection; ensemble learning; machine learning; Linear Feature Selection

APA (6th Edition):

Pradhananga, N. (2007). Effective Linear-Time Feature Selection . (Masters Thesis). University of Waikato. Retrieved from http://hdl.handle.net/10289/2315

Chicago Manual of Style (16th Edition):

Pradhananga, Nripendra. “Effective Linear-Time Feature Selection .” 2007. Masters Thesis, University of Waikato. Accessed June 25, 2019. http://hdl.handle.net/10289/2315.

MLA Handbook (7th Edition):

Pradhananga, Nripendra. “Effective Linear-Time Feature Selection .” 2007. Web. 25 Jun 2019.

Vancouver:

Pradhananga N. Effective Linear-Time Feature Selection . [Internet] [Masters thesis]. University of Waikato; 2007. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10289/2315.

Council of Science Editors:

Pradhananga N. Effective Linear-Time Feature Selection . [Masters Thesis]. University of Waikato; 2007. Available from: http://hdl.handle.net/10289/2315

14. Pacheco Do Espirito Silva, Caroline. Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos.

Degree: Docteur es, Mathématiques, image et applications, 2017, La Rochelle

In this thesis manuscript, we present a robust descriptor for background subtraction that is capable of describing texture from a sequence… (more)

Subjects/Keywords: Détection d’objets mobiles; Soustraction de l’arrière-plan; Apprentissage par ensemble; Sélection de caractéristique; Extraction de caractéristique; Moving object detection; Background/foreground separation; Ensemble learning; Feature selection; Feature extraction

APA (6th Edition):

Pacheco Do Espirito Silva, C. (2017). Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos. (Doctoral Dissertation). La Rochelle. Retrieved from http://www.theses.fr/2017LAROS005

Chicago Manual of Style (16th Edition):

Pacheco Do Espirito Silva, Caroline. “Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos.” 2017. Doctoral Dissertation, La Rochelle. Accessed June 25, 2019. http://www.theses.fr/2017LAROS005.

MLA Handbook (7th Edition):

Pacheco Do Espirito Silva, Caroline. “Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos.” 2017. Web. 25 Jun 2019.

Vancouver:

Pacheco Do Espirito Silva C. Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos. [Internet] [Doctoral dissertation]. La Rochelle; 2017. [cited 2019 Jun 25]. Available from: http://www.theses.fr/2017LAROS005.

Council of Science Editors:

Pacheco Do Espirito Silva C. Feature extraction and selection for background modeling and foreground detection : Extraction et sélection de caractéristiques pour la détection d’objets mobiles dans des vidéos. [Doctoral Dissertation]. La Rochelle; 2017. Available from: http://www.theses.fr/2017LAROS005

15. Suárez, Alberto. An analysis of ensemble pruning techniques based on ordered aggregation.

Degree: 2018, IEEE

Subjects/Keywords: Bagging; Decision trees; Ensemble Pruning; Ensemble Selection; Ensembles of classifiers; Ordered Aggregation; Informática

APA (6th Edition):

Suárez, A. (2018). An analysis of ensemble pruning techniques based on ordered aggregation. (Thesis). IEEE. Retrieved from http://hdl.handle.net/10486/664049

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Suárez, Alberto. “An analysis of ensemble pruning techniques based on ordered aggregation.” 2018. Thesis, IEEE. Accessed June 25, 2019. http://hdl.handle.net/10486/664049.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Suárez, Alberto. “An analysis of ensemble pruning techniques based on ordered aggregation.” 2018. Web. 25 Jun 2019.

Vancouver:

Suárez A. An analysis of ensemble pruning techniques based on ordered aggregation. [Internet] [Thesis]. IEEE; 2018. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10486/664049.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Suárez A. An analysis of ensemble pruning techniques based on ordered aggregation. [Thesis]. IEEE; 2018. Available from: http://hdl.handle.net/10486/664049

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Carnegie Mellon University

16. Fiterau, Madalina. Discovering Compact and Informative Structures through Data Partitioning.

Degree: 2015, Carnegie Mellon University

 In many practical scenarios, prediction for high-dimensional observations can be accurately performed using only a fraction of the existing features. However, the set of relevant… (more)

Subjects/Keywords: informative projection recovery; cost-based feature selection; ensemble methods; data partitioning; active learning; clinical data analysis

APA (6th Edition):

Fiterau, M. (2015). Discovering Compact and Informative Structures through Data Partitioning. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/792

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Fiterau, Madalina. “Discovering Compact and Informative Structures through Data Partitioning.” 2015. Thesis, Carnegie Mellon University. Accessed June 25, 2019. http://repository.cmu.edu/dissertations/792.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Fiterau, Madalina. “Discovering Compact and Informative Structures through Data Partitioning.” 2015. Web. 25 Jun 2019.

Vancouver:

Fiterau M. Discovering Compact and Informative Structures through Data Partitioning. [Internet] [Thesis]. Carnegie Mellon University; 2015. [cited 2019 Jun 25]. Available from: http://repository.cmu.edu/dissertations/792.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Fiterau M. Discovering Compact and Informative Structures through Data Partitioning. [Thesis]. Carnegie Mellon University; 2015. Available from: http://repository.cmu.edu/dissertations/792

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Universidade do Rio Grande do Sul

17. Padilha, Carlos Alberto de Araújo. Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM.

Degree: 2018, Universidade do Rio Grande do Sul

For many years, committee systems have proven to be an efficient method for increasing the accuracy and stability of learning algorithms in… (more)

Subjects/Keywords: Algoritmos geneticos; Ensemble Systems; Deep Learning; Aprendizagem : Maquina; Diversity; Feature Selection; Least Squares Support Vector Machines; Genetic Algorithms

APA (6th Edition):

Padilha, C. A. d. A. (2018). Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM. (Thesis). Universidade do Rio Grande do Sul. Retrieved from http://hdl.handle.net/10183/174541

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Padilha, Carlos Alberto de Araújo. “Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM.” 2018. Thesis, Universidade do Rio Grande do Sul. Accessed June 25, 2019. http://hdl.handle.net/10183/174541.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Padilha, Carlos Alberto de Araújo. “Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM.” 2018. Web. 25 Jun 2019.

Vancouver:

Padilha CAdA. Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM. [Internet] [Thesis]. Universidade do Rio Grande do Sul; 2018. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10183/174541.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Padilha CAdA. Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM. [Thesis]. Universidade do Rio Grande do Sul; 2018. Available from: http://hdl.handle.net/10183/174541

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Victoria University of Wellington

18. Tran, Cao Truong. Evolutionary Machine Learning for Classification with Incomplete Data.

Degree: 2018, Victoria University of Wellington

 Classification is a major task in machine learning and data mining. Many real-world datasets suffer from the unavoidable issue of missing values. Classification with incomplete… (more)

Subjects/Keywords: Incomplete data; Missing data; Classification; Machine learning; Evolutionary computation; Genetic programming; Ensemble learning; Feature selection; Feature construction

APA (6th Edition):

Tran, C. T. (2018). Evolutionary Machine Learning for Classification with Incomplete Data. (Doctoral Dissertation). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/7639

Chicago Manual of Style (16th Edition):

Tran, Cao Truong. “Evolutionary Machine Learning for Classification with Incomplete Data.” 2018. Doctoral Dissertation, Victoria University of Wellington. Accessed June 25, 2019. http://hdl.handle.net/10063/7639.

MLA Handbook (7th Edition):

Tran, Cao Truong. “Evolutionary Machine Learning for Classification with Incomplete Data.” 2018. Web. 25 Jun 2019.

Vancouver:

Tran CT. Evolutionary Machine Learning for Classification with Incomplete Data. [Internet] [Doctoral dissertation]. Victoria University of Wellington; 2018. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10063/7639.

Council of Science Editors:

Tran CT. Evolutionary Machine Learning for Classification with Incomplete Data. [Doctoral Dissertation]. Victoria University of Wellington; 2018. Available from: http://hdl.handle.net/10063/7639

19. Sun, Quan. Meta-Learning and the Full Model Selection Problem .

Degree: 2014, University of Waikato

 When working as a data analyst, one of my daily tasks is to select appropriate tools from a set of existing data analysis techniques in… (more)

Subjects/Keywords: meta-learning; ranking; ensemble learning; model selection

APA (6th Edition):

Sun, Q. (2014). Meta-Learning and the Full Model Selection Problem . (Doctoral Dissertation). University of Waikato. Retrieved from http://hdl.handle.net/10289/8520

Chicago Manual of Style (16th Edition):

Sun, Quan. “Meta-Learning and the Full Model Selection Problem .” 2014. Doctoral Dissertation, University of Waikato. Accessed June 25, 2019. http://hdl.handle.net/10289/8520.

MLA Handbook (7th Edition):

Sun, Quan. “Meta-Learning and the Full Model Selection Problem .” 2014. Web. 25 Jun 2019.

Vancouver:

Sun Q. Meta-Learning and the Full Model Selection Problem . [Internet] [Doctoral dissertation]. University of Waikato; 2014. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/10289/8520.

Council of Science Editors:

Sun Q. Meta-Learning and the Full Model Selection Problem . [Doctoral Dissertation]. University of Waikato; 2014. Available from: http://hdl.handle.net/10289/8520


University of Stirling

20. Ali, Rozniza. Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea).

Degree: PhD, 2014, University of Stirling

 This thesis presents an investigation into Gyrodactylus species recognition, making use of machine learning classification and feature selection techniques, and explores image feature extraction to… (more)

Subjects/Keywords: Gyrodactylus; machine learning; feature selection; Active Shape Model; ensemble classification; Complex Network; Machine learning; Fishes Parasites; Gyrodactylus

APA (6th Edition):

Ali, R. (2014). Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea). (Doctoral Dissertation). University of Stirling. Retrieved from http://hdl.handle.net/1893/21734

Chicago Manual of Style (16th Edition):

Ali, Rozniza. “Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea).” 2014. Doctoral Dissertation, University of Stirling. Accessed June 25, 2019. http://hdl.handle.net/1893/21734.

MLA Handbook (7th Edition):

Ali, Rozniza. “Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea).” 2014. Web. 25 Jun 2019.

Vancouver:

Ali R. Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea). [Internet] [Doctoral dissertation]. University of Stirling; 2014. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/1893/21734.

Council of Science Editors:

Ali R. Ensemble classification and signal image processing for genus Gyrodactylus (Monogenea). [Doctoral Dissertation]. University of Stirling; 2014. Available from: http://hdl.handle.net/1893/21734


RMIT University

21. Song, H. Evolutionary multivariate time series prediction.

Degree: 2019, RMIT University

 Multivariate time series (MTS) prediction plays a significant role in many practical data mining applications, such as finance, energy supply, and medical care domains. Over… (more)

Subjects/Keywords: Fields of Research; Multivariate time series prediction; Evolutionary algorithm; Ensemble learning; Feature extraction; Feature selection; Single-objective optimization; Multi-objective optimization

APA (6th Edition):

Song, H. (2019). Evolutionary multivariate time series prediction. (Thesis). RMIT University. Retrieved from http://researchbank.rmit.edu.au/view/rmit:162681

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Song, H. “Evolutionary multivariate time series prediction.” 2019. Thesis, RMIT University. Accessed June 25, 2019. http://researchbank.rmit.edu.au/view/rmit:162681.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Song, H. “Evolutionary multivariate time series prediction.” 2019. Web. 25 Jun 2019.

Vancouver:

Song H. Evolutionary multivariate time series prediction. [Internet] [Thesis]. RMIT University; 2019. [cited 2019 Jun 25]. Available from: http://researchbank.rmit.edu.au/view/rmit:162681.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Song H. Evolutionary multivariate time series prediction. [Thesis]. RMIT University; 2019. Available from: http://researchbank.rmit.edu.au/view/rmit:162681

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

22. Gharroudi, Ouadie. Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé.

Degree: Docteur es, Informatique, 2017, Lyon

L'apprentissage multi-label est un problème d'apprentissage supervisé où chaque instance peut être associée à plusieurs labels cibles simultanément. Il est omniprésent dans l'apprentissage automatique et… (more)

Subjects/Keywords: Classification multi-label; Apprentissage supervisé; Apprentissage semi-supervisé; Multi-label classification; Ensemble models; Semi-supervised learning; Feature selection; 004

APA (6th Edition):

Gharroudi, O. (2017). Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé. (Doctoral Dissertation). Lyon. Retrieved from http://www.theses.fr/2017LYSE1333

Chicago Manual of Style (16th Edition):

Gharroudi, Ouadie. “Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé.” 2017. Doctoral Dissertation, Lyon. Accessed June 25, 2019. http://www.theses.fr/2017LYSE1333.

MLA Handbook (7th Edition):

Gharroudi, Ouadie. “Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé.” 2017. Web. 25 Jun 2019.

Vancouver:

Gharroudi O. Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé. [Internet] [Doctoral dissertation]. Lyon; 2017. [cited 2019 Jun 25]. Available from: http://www.theses.fr/2017LYSE1333.

Council of Science Editors:

Gharroudi O. Ensemble multi-label learning in supervised and semi-supervised settings : Apprentissage multi-label ensembliste dans le context supervisé et semi-supervisé. [Doctoral Dissertation]. Lyon; 2017. Available from: http://www.theses.fr/2017LYSE1333


University of Pretoria

23. Lutu, P.E.N. (Patricia Elizabeth Nalwoga). Dataset selection for aggregate model implementation in predictive data mining.

Degree: Computer Science, 2010, University of Pretoria

 Data mining has become a commonly used method for the analysis of organisational data, for purposes of summarizing data in useful ways and identifying non-trivial… (more)

Subjects/Keywords: Dataset partitioning; Data mining; Bias reduction; Predictive modeling; Classification; Model aggregation; Ensemble classifiers; Ova classification; Pvn classification; Dataset selection; Feature selection; Variable selection; Large datasets; Variance reduction; Dataset sampling; UCTD

APA (6th Edition):

Lutu, P. E. N. (2010). Dataset selection for aggregate model implementation in predictive data mining. (Doctoral Dissertation). University of Pretoria. Retrieved from http://hdl.handle.net/2263/29486

Chicago Manual of Style (16th Edition):

Lutu, P. E. N. (Patricia Elizabeth Nalwoga). “Dataset selection for aggregate model implementation in predictive data mining.” 2010. Doctoral Dissertation, University of Pretoria. Accessed June 25, 2019. http://hdl.handle.net/2263/29486.

MLA Handbook (7th Edition):

Lutu, P. E. N. (Patricia Elizabeth Nalwoga). “Dataset selection for aggregate model implementation in predictive data mining.” 2010. Web. 25 Jun 2019.

Vancouver:

Lutu PEN. Dataset selection for aggregate model implementation in predictive data mining. [Internet] [Doctoral dissertation]. University of Pretoria; 2010. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/2263/29486.

Council of Science Editors:

Lutu PEN. Dataset selection for aggregate model implementation in predictive data mining. [Doctoral Dissertation]. University of Pretoria; 2010. Available from: http://hdl.handle.net/2263/29486


University of Pretoria

24. Lutu, P.E.N. (Patricia Elizabeth Nalwoga). Dataset selection for aggregate model implementation in predictive data mining .

Degree: 2010, University of Pretoria

 Data mining has become a commonly used method for the analysis of organisational data, for purposes of summarizing data in useful ways and identifying non-trivial… (more)

Subjects/Keywords: Dataset partitioning; Data mining; Bias reduction; Predictive modeling; Classification; Model aggregation; Ensemble classifiers; Ova classification; Pvn classification; Dataset selection; Feature selection; Variable selection; Large datasets; Variance reduction; Dataset sampling; UCTD

APA (6th Edition):

Lutu, P. E. N. (2010). Dataset selection for aggregate model implementation in predictive data mining . (Doctoral Dissertation). University of Pretoria. Retrieved from http://upetd.up.ac.za/thesis/available/etd-11152010-203041/

Chicago Manual of Style (16th Edition):

Lutu, P. E. N. (Patricia Elizabeth Nalwoga). “Dataset selection for aggregate model implementation in predictive data mining .” 2010. Doctoral Dissertation, University of Pretoria. Accessed June 25, 2019. http://upetd.up.ac.za/thesis/available/etd-11152010-203041/.

MLA Handbook (7th Edition):

Lutu, P. E. N. (Patricia Elizabeth Nalwoga). “Dataset selection for aggregate model implementation in predictive data mining .” 2010. Web. 25 Jun 2019.

Vancouver:

Lutu PEN. Dataset selection for aggregate model implementation in predictive data mining . [Internet] [Doctoral dissertation]. University of Pretoria; 2010. [cited 2019 Jun 25]. Available from: http://upetd.up.ac.za/thesis/available/etd-11152010-203041/.

Council of Science Editors:

Lutu PEN. Dataset selection for aggregate model implementation in predictive data mining . [Doctoral Dissertation]. University of Pretoria; 2010. Available from: http://upetd.up.ac.za/thesis/available/etd-11152010-203041/


University of New South Wales

25. Milne, Linda. Machine learning for automatic classification of remotely sensed data.

Degree: Computer Science & Engineering, 2008, University of New South Wales

 As more and more remotely sensed data becomes available, it is becoming increasingly harder to analyse it with the more traditional labour-intensive, manual methods. The commonly… (more)

Subjects/Keywords: contribution analysis; ensemble classifiers; multi-strategy classification; attribute selection; feature selection

APA (6th Edition):

Milne, L. (2008). Machine learning for automatic classification of remotely sensed data. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/41322 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:2634/SOURCE2?view=true

Chicago Manual of Style (16th Edition):

Milne, Linda. “Machine learning for automatic classification of remotely sensed data.” 2008. Doctoral Dissertation, University of New South Wales. Accessed June 25, 2019. http://handle.unsw.edu.au/1959.4/41322 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:2634/SOURCE2?view=true.

MLA Handbook (7th Edition):

Milne, Linda. “Machine learning for automatic classification of remotely sensed data.” 2008. Web. 25 Jun 2019.

Vancouver:

Milne L. Machine learning for automatic classification of remotely sensed data. [Internet] [Doctoral dissertation]. University of New South Wales; 2008. [cited 2019 Jun 25]. Available from: http://handle.unsw.edu.au/1959.4/41322 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:2634/SOURCE2?view=true.

Council of Science Editors:

Milne L. Machine learning for automatic classification of remotely sensed data. [Doctoral Dissertation]. University of New South Wales; 2008. Available from: http://handle.unsw.edu.au/1959.4/41322 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:2634/SOURCE2?view=true

26. Ferreira, Ednaldo José. "Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica".

Degree: Mestrado, Ciências de Computação e Matemática Computacional, 2005, University of São Paulo

Irrelevant features, present in databases from many domains, degrade the prediction accuracy of classifiers induced by machine learning algorithms… (more)

Subjects/Keywords: algoritmo genético; ensemble; ensemble; feature subset selection; genetic algorithm; neural networks; redes neurais artificiais; seleção de características

APA (6th Edition):

Ferreira, E. J. (2005). "Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica". (Masters Thesis). University of São Paulo. Retrieved from http://www.teses.usp.br/teses/disponiveis/55/55134/tde-18052006-143603/ ;

Chicago Manual of Style (16th Edition):

Ferreira, Ednaldo José. “"Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica".” 2005. Masters Thesis, University of São Paulo. Accessed June 25, 2019. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-18052006-143603/ ;.

MLA Handbook (7th Edition):

Ferreira, Ednaldo José. “"Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica".” 2005. Web. 25 Jun 2019.

Vancouver:

Ferreira EJ. "Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica". [Internet] [Masters thesis]. University of São Paulo; 2005. [cited 2019 Jun 25]. Available from: http://www.teses.usp.br/teses/disponiveis/55/55134/tde-18052006-143603/ ;.

Council of Science Editors:

Ferreira EJ. "Abordagem genética para seleção de um conjunto reduzido de características para construção de ensembles de redes neurais: aplicação à língua eletrônica". [Masters Thesis]. University of São Paulo; 2005. Available from: http://www.teses.usp.br/teses/disponiveis/55/55134/tde-18052006-143603/ ;

27. Haning, Jacob M. Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data.

Degree: MS, Engineering and Applied Science: Electrical Engineering, 2014, University of Cincinnati

 There are many feature selection algorithms and many classification methods available to choose from in order to successfully and accurately learn a data set. This… (more)

Subjects/Keywords: Artificial Intelligence; Feature Selection; Ensemble; Classification; Relief-f; CART; ANOVA

APA (6th Edition):

Haning, J. M. (2014). Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data. (Masters Thesis). University of Cincinnati. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406810947

Chicago Manual of Style (16th Edition):

Haning, Jacob M. “Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data.” 2014. Masters Thesis, University of Cincinnati. Accessed June 25, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406810947.

MLA Handbook (7th Edition):

Haning, Jacob M. “Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data.” 2014. Web. 25 Jun 2019.

Vancouver:

Haning JM. Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data. [Internet] [Masters thesis]. University of Cincinnati; 2014. [cited 2019 Jun 25]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406810947.

Council of Science Editors:

Haning JM. Feature Selection for High-Dimensional Individual and Ensemble Classifiers with Limited Data. [Masters Thesis]. University of Cincinnati; 2014. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=ucin1406810947


Washington University in St. Louis

28. Schiller, Todd. Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk.

Degree: MA, Computer Science and Engineering, 2009, Washington University in St. Louis

 Patients undergoing radiation therapy can develop a potentially fatal inflammation of the lungs known as radiation pneumonitis (RP). In practice, modeling RP factors is difficult… (more)

Subjects/Keywords: support vector machine; radiation pneumonitis; feature selection; ensemble classifier; data imbalance

APA (6th Edition):

Schiller, T. (2009). Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk. (Thesis). Washington University in St. Louis. Retrieved from https://openscholarship.wustl.edu/etd/932

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Schiller, Todd. “Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk.” 2009. Thesis, Washington University in St. Louis. Accessed June 25, 2019. https://openscholarship.wustl.edu/etd/932.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Schiller, Todd. “Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk.” 2009. Web. 25 Jun 2019.

Vancouver:

Schiller T. Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk. [Internet] [Thesis]. Washington University in St. Louis; 2009. [cited 2019 Jun 25]. Available from: https://openscholarship.wustl.edu/etd/932.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Schiller T. Ensemble Support Vector Machine Models of Radiation-Induced Lung Injury Risk. [Thesis]. Washington University in St. Louis; 2009. Available from: https://openscholarship.wustl.edu/etd/932

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Cambridge

29. Wang, Fan. Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning .

Degree: University of Cambridge

 In a wide range of applications, datasets are generated for which the number of variables p exceeds the sample size n. Penalised likelihood methods are… (more)

Subjects/Keywords: Penalised regression; Lasso; Ensemble learning; Variable selection; High-dimensional data

APA (6th Edition):

Wang, F. (n.d.). Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning . (Thesis). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/289419

Note: this citation may be lacking information needed for this citation format:
No year of publication.
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Fan. “Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning .” Thesis, University of Cambridge. Accessed June 25, 2019. https://www.repository.cam.ac.uk/handle/1810/289419.

Note: this citation may be lacking information needed for this citation format:
No year of publication.
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Fan. “Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning .” Web. 25 Jun 2019.

Note: this citation may be lacking information needed for this citation format:
No year of publication.

Vancouver:

Wang F. Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning . [Internet] [Thesis]. University of Cambridge; [cited 2019 Jun 25]. Available from: https://www.repository.cam.ac.uk/handle/1810/289419.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
No year of publication.

Council of Science Editors:

Wang F. Penalised regression for high-dimensional data: an empirical investigation and improvements via ensemble learning . [Thesis]. University of Cambridge; Available from: https://www.repository.cam.ac.uk/handle/1810/289419

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
No year of publication.


Texas A&M University

30. Tracy, James L. Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly.

Degree: PhD, Entomology, 2018, Texas A&M University

 Correlative ecological niche models (ENMs) are essential for investigating distributions of species and natural phenomena via environmental correlates across broad fields, including entomology and pyrogeography… (more)

Subjects/Keywords: Random Feature Selection; Feature Subset Ensemble; Species Distribution Model; Pyrogeography; Danaus plexippus; Migratory Niche Model; Kernel Density Estimation Migratory Model; Insect Roadkill Niche Model

APA (6th Edition):

Tracy, J. L. (2018). Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly. (Doctoral Dissertation). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/174342

Chicago Manual of Style (16th Edition):

Tracy, James L. “Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly.” 2018. Doctoral Dissertation, Texas A&M University. Accessed June 25, 2019. http://hdl.handle.net/1969.1/174342.

MLA Handbook (7th Edition):

Tracy, James L. “Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly.” 2018. Web. 25 Jun 2019.

Vancouver:

Tracy JL. Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly. [Internet] [Doctoral dissertation]. Texas A&M University; 2018. [cited 2019 Jun 25]. Available from: http://hdl.handle.net/1969.1/174342.

Council of Science Editors:

Tracy JL. Random Subset Feature Selection for Ecological Niche Modeling of Wildfire Activity and the Monarch Butterfly. [Doctoral Dissertation]. Texas A&M University; 2018. Available from: http://hdl.handle.net/1969.1/174342
