
You searched for subject:(Kernel functions). Showing records 1 – 30 of 97 total matches.



Oregon State University

1. Borer, David. Approximate solutions of Fredholm integral equations of the second kind with singular kernels.

Degree: MS, Mathematics, 1977, Oregon State University

 The kernel subtraction method of Kantorovich and Krylov is studied in the setting of "Collectively Compact Operator Approximation Theory." Fredholm integral equations of the second… (more)
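
For context (background only; the thesis itself studies the method within collectively compact operator approximation theory), a Fredholm integral equation of the second kind has the general form

\[
\varphi(s) = f(s) + \lambda \int_a^b K(s,t)\,\varphi(t)\,dt ,
\]

and the Kantorovich–Krylov kernel subtraction idea, in its usual textbook form, rewrites the integral so that the singular behaviour of K at t = s is isolated:

\[
\int_a^b K(s,t)\,\varphi(t)\,dt \;=\; \int_a^b K(s,t)\,\big[\varphi(t)-\varphi(s)\big]\,dt \;+\; \varphi(s)\int_a^b K(s,t)\,dt .
\]

The first integrand is smoother near t = s, and the second integral can often be evaluated analytically, which makes quadrature-based approximation tractable.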

Subjects/Keywords: Kernel functions


APA (6th Edition):

Borer, D. (1977). Approximate solutions of Fredholm integral equations of the second kind with singular kernels. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/43555

Chicago Manual of Style (16th Edition):

Borer, David. “Approximate solutions of Fredholm integral equations of the second kind with singular kernels.” 1977. Masters Thesis, Oregon State University. Accessed November 25, 2020. http://hdl.handle.net/1957/43555.

MLA Handbook (7th Edition):

Borer, David. “Approximate solutions of Fredholm integral equations of the second kind with singular kernels.” 1977. Web. 25 Nov 2020.

Vancouver:

Borer D. Approximate solutions of Fredholm integral equations of the second kind with singular kernels. [Internet] [Masters thesis]. Oregon State University; 1977. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/1957/43555.

Council of Science Editors:

Borer D. Approximate solutions of Fredholm integral equations of the second kind with singular kernels. [Masters Thesis]. Oregon State University; 1977. Available from: http://hdl.handle.net/1957/43555


Rutgers University

2. Tonde, Chetan J., 1985-. Supervised feature learning via dependency maximization.

Degree: PhD, Computer Science, 2016, Rutgers University

A key challenge in machine learning is to automatically extract relevant feature representations of data for a given task. This becomes an especially formidable task for… (more)
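
For orientation only (the dissertation's own formulation may differ), a widely used kernel measure of statistical dependence between inputs X and labels Y is the Hilbert–Schmidt Independence Criterion, whose biased empirical estimate from n paired samples is

\[
\widehat{\mathrm{HSIC}}(X,Y) \;=\; \frac{1}{(n-1)^2}\,\operatorname{tr}\!\big(K H L H\big),
\qquad H = I_n - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top},
\]

where K and L are kernel matrices computed on the inputs and on the labels. Maximizing such a dependence measure over candidate feature maps is one concrete way to read "dependency maximization".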

Subjects/Keywords: Machine learning; Kernel functions


APA (6th Edition):

Tonde, C. J. (2016). Supervised feature learning via dependency maximization. (Doctoral Dissertation). Rutgers University. Retrieved from https://rucore.libraries.rutgers.edu/rutgers-lib/50200/

Chicago Manual of Style (16th Edition):

Tonde, Chetan J., 1985-. “Supervised feature learning via dependency maximization.” 2016. Doctoral Dissertation, Rutgers University. Accessed November 25, 2020. https://rucore.libraries.rutgers.edu/rutgers-lib/50200/.

MLA Handbook (7th Edition):

Tonde, Chetan J., 1985-. “Supervised feature learning via dependency maximization.” 2016. Web. 25 Nov 2020.

Vancouver:

Tonde CJ. Supervised feature learning via dependency maximization. [Internet] [Doctoral dissertation]. Rutgers University; 2016. [cited 2020 Nov 25]. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/50200/.

Council of Science Editors:

Tonde CJ. Supervised feature learning via dependency maximization. [Doctoral Dissertation]. Rutgers University; 2016. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/50200/


Michigan State University

3. Yi, Heung Su. Harmonic Bergman functions on half-spaces.

Degree: PhD, Department of Mathematics, 1994, Michigan State University

Subjects/Keywords: Bergman kernel functions; Harmonic functions


APA (6th Edition):

Yi, H. S. (1994). Harmonic Bergman functions on half-spaces. (Doctoral Dissertation). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:23818

Chicago Manual of Style (16th Edition):

Yi, Heung Su. “Harmonic Bergman functions on half-spaces.” 1994. Doctoral Dissertation, Michigan State University. Accessed November 25, 2020. http://etd.lib.msu.edu/islandora/object/etd:23818.

MLA Handbook (7th Edition):

Yi, Heung Su. “Harmonic Bergman functions on half-spaces.” 1994. Web. 25 Nov 2020.

Vancouver:

Yi HS. Harmonic Bergman functions on half-spaces. [Internet] [Doctoral dissertation]. Michigan State University; 1994. [cited 2020 Nov 25]. Available from: http://etd.lib.msu.edu/islandora/object/etd:23818.

Council of Science Editors:

Yi HS. Harmonic Bergman functions on half-spaces. [Doctoral Dissertation]. Michigan State University; 1994. Available from: http://etd.lib.msu.edu/islandora/object/etd:23818


Texas Christian University

4. Bolen, James Cordell. A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen.

Degree: 1968, Texas Christian University

Subjects/Keywords: Analytic functions; Kernel functions; Convergence


APA (6th Edition):

Bolen, J. C. (1968). A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen. (Thesis). Texas Christian University. Retrieved from https://repository.tcu.edu/handle/116099117/33794

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Bolen, James Cordell. “A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen.” 1968. Thesis, Texas Christian University. Accessed November 25, 2020. https://repository.tcu.edu/handle/116099117/33794.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Bolen, James Cordell. “A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen.” 1968. Web. 25 Nov 2020.

Vancouver:

Bolen JC. A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen. [Internet] [Thesis]. Texas Christian University; 1968. [cited 2020 Nov 25]. Available from: https://repository.tcu.edu/handle/116099117/33794.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Bolen JC. A reproducing kernel function and convergence properties for discrete analytic functions / by James Cordell Bolen. [Thesis]. Texas Christian University; 1968. Available from: https://repository.tcu.edu/handle/116099117/33794

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

5. Kingravi, Hassan. Reduced-set models for improving the training and execution speed of kernel methods.

Degree: PhD, Electrical and Computer Engineering, 2014, Georgia Tech

 This thesis aims to contribute to the area of kernel methods, which are a class of machine learning methods known for their wide applicability and… (more)
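
As background (a sketch of the classical reduced-set idea, not necessarily the models developed in this thesis): a trained kernel machine is an expansion over all n training points,

\[
\Psi \;=\; \sum_{i=1}^{n} \alpha_i\,\phi(x_i),
\]

and a reduced-set approximation replaces it with a much shorter expansion over m \ll n points z_1,\dots,z_m,

\[
\Psi' \;=\; \sum_{j=1}^{m} \beta_j\,\phi(z_j),
\qquad \text{with } \beta, z \text{ chosen to make } \|\Psi-\Psi'\|^2 \text{ small},
\]

so that evaluating the model costs m kernel evaluations instead of n.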

Subjects/Keywords: Machine learning; Kernel methods; Reproducing kernel Hilbert spaces; Adaptive control; Manifold learning; Algorithms; Computer algorithms; Kernel functions; Support vector machines


APA (6th Edition):

Kingravi, H. (2014). Reduced-set models for improving the training and execution speed of kernel methods. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/51799

Chicago Manual of Style (16th Edition):

Kingravi, Hassan. “Reduced-set models for improving the training and execution speed of kernel methods.” 2014. Doctoral Dissertation, Georgia Tech. Accessed November 25, 2020. http://hdl.handle.net/1853/51799.

MLA Handbook (7th Edition):

Kingravi, Hassan. “Reduced-set models for improving the training and execution speed of kernel methods.” 2014. Web. 25 Nov 2020.

Vancouver:

Kingravi H. Reduced-set models for improving the training and execution speed of kernel methods. [Internet] [Doctoral dissertation]. Georgia Tech; 2014. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/1853/51799.

Council of Science Editors:

Kingravi H. Reduced-set models for improving the training and execution speed of kernel methods. [Doctoral Dissertation]. Georgia Tech; 2014. Available from: http://hdl.handle.net/1853/51799


Ryerson University

6. Hosseinizadeh, Pouyan. Predicting system collapse : application of kernel-based machine learning and inclination analysis.

Degree: 2009, Ryerson University

 While many modelling methods have been developed and introduced to predict the actual state of a system at the next point of time, the purpose… (more)

Subjects/Keywords: Finance  – Mathematical models; Investments  – Mathematics; Economic forecasting  – Econometric models; Kernel functions


APA (6th Edition):

Hosseinizadeh, P. (2009). Predicting system collapse : application of kernel-based machine learning and inclination analysis. (Thesis). Ryerson University. Retrieved from https://digital.library.ryerson.ca/islandora/object/RULA%3A1314

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hosseinizadeh, Pouyan. “Predicting system collapse : application of kernel-based machine learning and inclination analysis.” 2009. Thesis, Ryerson University. Accessed November 25, 2020. https://digital.library.ryerson.ca/islandora/object/RULA%3A1314.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hosseinizadeh, Pouyan. “Predicting system collapse : application of kernel-based machine learning and inclination analysis.” 2009. Web. 25 Nov 2020.

Vancouver:

Hosseinizadeh P. Predicting system collapse : application of kernel-based machine learning and inclination analysis. [Internet] [Thesis]. Ryerson University; 2009. [cited 2020 Nov 25]. Available from: https://digital.library.ryerson.ca/islandora/object/RULA%3A1314.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hosseinizadeh P. Predicting system collapse : application of kernel-based machine learning and inclination analysis. [Thesis]. Ryerson University; 2009. Available from: https://digital.library.ryerson.ca/islandora/object/RULA%3A1314

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

7. Leonard, Andrew. Probabilistic methods for decision making in precision airdrop.

Degree: PhD, Mechanical Engineering, 2019, Georgia Tech

 It is inconvenient and often difficult to consider systems, or environments, that are not completely known. However, with the proliferation of autonomous systems, where the… (more)

Subjects/Keywords: Koopman; Frobenius-Perron; Airdrop; Probabilistic; Lobachevsky; Kernel-based; Radial basis functions


APA (6th Edition):

Leonard, A. (2019). Probabilistic methods for decision making in precision airdrop. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/62090

Chicago Manual of Style (16th Edition):

Leonard, Andrew. “Probabilistic methods for decision making in precision airdrop.” 2019. Doctoral Dissertation, Georgia Tech. Accessed November 25, 2020. http://hdl.handle.net/1853/62090.

MLA Handbook (7th Edition):

Leonard, Andrew. “Probabilistic methods for decision making in precision airdrop.” 2019. Web. 25 Nov 2020.

Vancouver:

Leonard A. Probabilistic methods for decision making in precision airdrop. [Internet] [Doctoral dissertation]. Georgia Tech; 2019. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/1853/62090.

Council of Science Editors:

Leonard A. Probabilistic methods for decision making in precision airdrop. [Doctoral Dissertation]. Georgia Tech; 2019. Available from: http://hdl.handle.net/1853/62090


Stellenbosch University

8. Van der Westhuizen, Cornelius Stephanus. Nearest hypersphere classification : a comparison with other classification techniques.

Degree: MCom, Statistics and Actuarial Science, 2014, Stellenbosch University

ENGLISH ABSTRACT: Classification is a widely used statistical procedure to classify objects into two or more classes according to some rule which is based on… (more)

Subjects/Keywords: Statistics and actuarial science; Classification; Machine learning; Kernel functions


APA (6th Edition):

Van der Westhuizen, C. S. (2014). Nearest hypersphere classification : a comparison with other classification techniques. (Thesis). Stellenbosch University. Retrieved from http://hdl.handle.net/10019.1/95839

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Van der Westhuizen, Cornelius Stephanus. “Nearest hypersphere classification : a comparison with other classification techniques.” 2014. Thesis, Stellenbosch University. Accessed November 25, 2020. http://hdl.handle.net/10019.1/95839.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Van der Westhuizen, Cornelius Stephanus. “Nearest hypersphere classification : a comparison with other classification techniques.” 2014. Web. 25 Nov 2020.

Vancouver:

Van der Westhuizen CS. Nearest hypersphere classification : a comparison with other classification techniques. [Internet] [Thesis]. Stellenbosch University; 2014. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10019.1/95839.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Van der Westhuizen CS. Nearest hypersphere classification : a comparison with other classification techniques. [Thesis]. Stellenbosch University; 2014. Available from: http://hdl.handle.net/10019.1/95839

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Columbia University

9. May, Avner. Kernel Approximation Methods for Speech Recognition.

Degree: 2018, Columbia University

 Over the past five years or so, deep learning methods have dramatically improved the state of the art performance in a variety of domains, including… (more)
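
A representative kernel-approximation technique in this literature is the random Fourier feature map of Rahimi and Recht; the Python sketch below assumes the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2) and is illustrative rather than a reproduction of the thesis code.

import numpy as np

def rbf_random_features(X, n_features=500, gamma=1.0, seed=0):
    # Random Fourier feature map: z(x) @ z(y) approximates exp(-gamma * ||x - y||^2).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)                # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

A linear model (for example, an acoustic classifier) trained on z(X) then approximates the corresponding kernel machine at a fraction of its cost.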

Subjects/Keywords: Computer science; Artificial intelligence; Automatic speech recognition; Kernel functions


APA (6th Edition):

May, A. (2018). Kernel Approximation Methods for Speech Recognition. (Doctoral Dissertation). Columbia University. Retrieved from https://doi.org/10.7916/D8D80P9T

Chicago Manual of Style (16th Edition):

May, Avner. “Kernel Approximation Methods for Speech Recognition.” 2018. Doctoral Dissertation, Columbia University. Accessed November 25, 2020. https://doi.org/10.7916/D8D80P9T.

MLA Handbook (7th Edition):

May, Avner. “Kernel Approximation Methods for Speech Recognition.” 2018. Web. 25 Nov 2020.

Vancouver:

May A. Kernel Approximation Methods for Speech Recognition. [Internet] [Doctoral dissertation]. Columbia University; 2018. [cited 2020 Nov 25]. Available from: https://doi.org/10.7916/D8D80P9T.

Council of Science Editors:

May A. Kernel Approximation Methods for Speech Recognition. [Doctoral Dissertation]. Columbia University; 2018. Available from: https://doi.org/10.7916/D8D80P9T


Hong Kong University of Science and Technology

10. Dai, Guang. Feature extraction VIA kernel weighted discriminant analysis methods.

Degree: 2007, Hong Kong University of Science and Technology

In recent years, as the kernel extension of linear discriminant analysis (LDA), kernel discriminant analysis (KDA) has become one of the most popular and powerful tools for… (more)
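
For reference, linear discriminant analysis seeks projection directions maximizing the ratio of between-class to within-class scatter,

\[
J(w) \;=\; \frac{w^{\top} S_B\, w}{w^{\top} S_W\, w},
\]

and kernel discriminant analysis performs the same optimization in the feature space induced by a kernel, writing w = \sum_i \alpha_i\,\phi(x_i) so that the objective depends only on the kernel matrix. (Shown as background; the thesis develops weighted variants of this construction.)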

Subjects/Keywords: Discriminant analysis ; Kernel functions


APA (6th Edition):

Dai, G. (2007). Feature extraction VIA kernel weighted discriminant analysis methods. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-4262 ; https://doi.org/10.14711/thesis-b992476 ; http://repository.ust.hk/ir/bitstream/1783.1-4262/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Dai, Guang. “Feature extraction VIA kernel weighted discriminant analysis methods.” 2007. Thesis, Hong Kong University of Science and Technology. Accessed November 25, 2020. http://repository.ust.hk/ir/Record/1783.1-4262 ; https://doi.org/10.14711/thesis-b992476 ; http://repository.ust.hk/ir/bitstream/1783.1-4262/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Dai, Guang. “Feature extraction VIA kernel weighted discriminant analysis methods.” 2007. Web. 25 Nov 2020.

Vancouver:

Dai G. Feature extraction VIA kernel weighted discriminant analysis methods. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2007. [cited 2020 Nov 25]. Available from: http://repository.ust.hk/ir/Record/1783.1-4262 ; https://doi.org/10.14711/thesis-b992476 ; http://repository.ust.hk/ir/bitstream/1783.1-4262/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Dai G. Feature extraction VIA kernel weighted discriminant analysis methods. [Thesis]. Hong Kong University of Science and Technology; 2007. Available from: http://repository.ust.hk/ir/Record/1783.1-4262 ; https://doi.org/10.14711/thesis-b992476 ; http://repository.ust.hk/ir/bitstream/1783.1-4262/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Hong Kong University of Science and Technology

11. Hsiao, Roger Wend Huu. Kernel eigenspace-based MLLR adaptation.

Degree: 2004, Hong Kong University of Science and Technology

Kernel methods have been applied to improve the performance of existing eigenvoice-based adaptation methods. Several adaptation methods including kernel eigenvoice adaptation (KEV) and embedded kernel(more)

Subjects/Keywords: Automatic speech recognition ; Kernel functions


APA (6th Edition):

Hsiao, R. W. H. (2004). Kernel eigenspace-based MLLR adaptation. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-5805 ; https://doi.org/10.14711/thesis-b849865 ; http://repository.ust.hk/ir/bitstream/1783.1-5805/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Hsiao, Roger Wend Huu. “Kernel eigenspace-based MLLR adaptation.” 2004. Thesis, Hong Kong University of Science and Technology. Accessed November 25, 2020. http://repository.ust.hk/ir/Record/1783.1-5805 ; https://doi.org/10.14711/thesis-b849865 ; http://repository.ust.hk/ir/bitstream/1783.1-5805/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Hsiao, Roger Wend Huu. “Kernel eigenspace-based MLLR adaptation.” 2004. Web. 25 Nov 2020.

Vancouver:

Hsiao RWH. Kernel eigenspace-based MLLR adaptation. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2004. [cited 2020 Nov 25]. Available from: http://repository.ust.hk/ir/Record/1783.1-5805 ; https://doi.org/10.14711/thesis-b849865 ; http://repository.ust.hk/ir/bitstream/1783.1-5805/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Hsiao RWH. Kernel eigenspace-based MLLR adaptation. [Thesis]. Hong Kong University of Science and Technology; 2004. Available from: http://repository.ust.hk/ir/Record/1783.1-5805 ; https://doi.org/10.14711/thesis-b849865 ; http://repository.ust.hk/ir/bitstream/1783.1-5805/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Hong Kong University of Science and Technology

12. Cheung, Pak Ming. Kernel-based multiple-instance learning.

Degree: 2006, Hong Kong University of Science and Technology

 In recent years, the Multiple-Instance Learning (MIL) problem is becoming more and more popular in the machine learning community. Each training object (bag) of the… (more)

Subjects/Keywords: Machine learning ; Kernel functions


APA (6th Edition):

Cheung, P. M. (2006). Kernel-based multiple-instance learning. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-5830 ; https://doi.org/10.14711/thesis-b931200 ; http://repository.ust.hk/ir/bitstream/1783.1-5830/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cheung, Pak Ming. “Kernel-based multiple-instance learning.” 2006. Thesis, Hong Kong University of Science and Technology. Accessed November 25, 2020. http://repository.ust.hk/ir/Record/1783.1-5830 ; https://doi.org/10.14711/thesis-b931200 ; http://repository.ust.hk/ir/bitstream/1783.1-5830/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cheung, Pak Ming. “Kernel-based multiple-instance learning.” 2006. Web. 25 Nov 2020.

Vancouver:

Cheung PM. Kernel-based multiple-instance learning. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2006. [cited 2020 Nov 25]. Available from: http://repository.ust.hk/ir/Record/1783.1-5830 ; https://doi.org/10.14711/thesis-b931200 ; http://repository.ust.hk/ir/bitstream/1783.1-5830/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cheung PM. Kernel-based multiple-instance learning. [Thesis]. Hong Kong University of Science and Technology; 2006. Available from: http://repository.ust.hk/ir/Record/1783.1-5830 ; https://doi.org/10.14711/thesis-b931200 ; http://repository.ust.hk/ir/bitstream/1783.1-5830/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Michigan State University

13. Khan, Hassan Aqeel. Kernel methods for biosensing applications.

Degree: 2015, Michigan State University

Thesis Ph. D. Michigan State University. Electrical Engineering 2015.

This thesis examines the design of noise-robust information retrieval techniques based on kernel methods. Algorithms are presented… (more)

Subjects/Keywords: Biosensors – Mathematical models; Signal processing – Mathematics; Kernel functions; Electrical engineering


APA (6th Edition):

Khan, H. A. (2015). Kernel methods for biosensing applications. (Thesis). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:3452

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Khan, Hassan Aqeel. “Kernel methods for biosensing applications.” 2015. Thesis, Michigan State University. Accessed November 25, 2020. http://etd.lib.msu.edu/islandora/object/etd:3452.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Khan, Hassan Aqeel. “Kernel methods for biosensing applications.” 2015. Web. 25 Nov 2020.

Vancouver:

Khan HA. Kernel methods for biosensing applications. [Internet] [Thesis]. Michigan State University; 2015. [cited 2020 Nov 25]. Available from: http://etd.lib.msu.edu/islandora/object/etd:3452.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Khan HA. Kernel methods for biosensing applications. [Thesis]. Michigan State University; 2015. Available from: http://etd.lib.msu.edu/islandora/object/etd:3452

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Michigan State University

14. Chitta, Radha. Kernel-based clustering of big data.

Degree: 2015, Michigan State University

Thesis Ph. D. Michigan State University. Computer Science 2015.

There has been a rapid increase in the volume of digital data over the recent years.… (more)
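
For background, kernel k-means (a natural baseline for kernel-based clustering, though the algorithms in the thesis are its own) assigns each point to the cluster c with the smallest feature-space distance, computable purely from the kernel matrix K:

\[
\big\|\phi(x_i) - \mu_c\big\|^2 \;=\; K_{ii} \;-\; \frac{2}{|c|}\sum_{j \in c} K_{ij} \;+\; \frac{1}{|c|^2}\sum_{j,l \in c} K_{jl}.
\]

Because this requires the full n x n kernel matrix, memory and computation become the central obstacles at big-data scale, which is the problem such work targets.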

Subjects/Keywords: Cluster analysis; Computer algorithms; Kernel functions; Big data; Computer science


APA (6th Edition):

Chitta, R. (2015). Kernel-based clustering of big data. (Thesis). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:3727

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Chitta, Radha. “Kernel-based clustering of big data.” 2015. Thesis, Michigan State University. Accessed November 25, 2020. http://etd.lib.msu.edu/islandora/object/etd:3727.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Chitta, Radha. “Kernel-based clustering of big data.” 2015. Web. 25 Nov 2020.

Vancouver:

Chitta R. Kernel-based clustering of big data. [Internet] [Thesis]. Michigan State University; 2015. [cited 2020 Nov 25]. Available from: http://etd.lib.msu.edu/islandora/object/etd:3727.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Chitta R. Kernel-based clustering of big data. [Thesis]. Michigan State University; 2015. Available from: http://etd.lib.msu.edu/islandora/object/etd:3727

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Michigan State University

15. He, Tao, Ph. D. Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis.

Degree: 2015, Michigan State University

Thesis Ph. D. Michigan State University. Statistics 2015

The ultimate goal of genome-wide association studies (GWAS) is understanding the underlying relationship between genetic variants and… (more)

Subjects/Keywords: Genetics – Simulation methods; Kernel functions; Nonparametric statistics; Statistics; Biostatistics


APA (6th Edition):

He, T. (2015). Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis. (Thesis). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:3667

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

He, Tao, Ph. D. “Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis.” 2015. Thesis, Michigan State University. Accessed November 25, 2020. http://etd.lib.msu.edu/islandora/object/etd:3667.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

He, Tao, Ph. D. “Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis.” 2015. Web. 25 Nov 2020.

Vancouver:

He T. Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis. [Internet] [Thesis]. Michigan State University; 2015. [cited 2020 Nov 25]. Available from: http://etd.lib.msu.edu/islandora/object/etd:3667.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

He T. Kernel-based nonparametric testing in high-dimensional data with applications to gene set analysis. [Thesis]. Michigan State University; 2015. Available from: http://etd.lib.msu.edu/islandora/object/etd:3667

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Georgia Tech

16. Mumford, Michael Leslie. Applications of reproducing kernels in Hilbert spaces.

Degree: MS, Applied Mathematics, 1972, Georgia Tech

Subjects/Keywords: Kernel functions; Hilbert space


APA (6th Edition):

Mumford, M. L. (1972). Applications of reproducing kernels in Hilbert spaces. (Masters Thesis). Georgia Tech. Retrieved from http://hdl.handle.net/1853/28742

Chicago Manual of Style (16th Edition):

Mumford, Michael Leslie. “Applications of reproducing kernels in Hilbert spaces.” 1972. Masters Thesis, Georgia Tech. Accessed November 25, 2020. http://hdl.handle.net/1853/28742.

MLA Handbook (7th Edition):

Mumford, Michael Leslie. “Applications of reproducing kernels in Hilbert spaces.” 1972. Web. 25 Nov 2020.

Vancouver:

Mumford ML. Applications of reproducing kernels in Hilbert spaces. [Internet] [Masters thesis]. Georgia Tech; 1972. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/1853/28742.

Council of Science Editors:

Mumford ML. Applications of reproducing kernels in Hilbert spaces. [Masters Thesis]. Georgia Tech; 1972. Available from: http://hdl.handle.net/1853/28742


Virginia Tech

17. Guerra Huaman, Moises Daniel. Schur-class of finitely connected planar domains: the test-function approach.

Degree: PhD, Mathematics, 2011, Virginia Tech

 We study the structure of the set of extreme points of the compact convex set of matrix-valued holomorphic functions with positive real part on a… (more)

Subjects/Keywords: completely positive kernel; extreme points; Schur class; test functions


APA (6th Edition):

Guerra Huaman, M. D. (2011). Schur-class of finitely connected planar domains: the test-function approach. (Doctoral Dissertation). Virginia Tech. Retrieved from http://hdl.handle.net/10919/27334

Chicago Manual of Style (16th Edition):

Guerra Huaman, Moises Daniel. “Schur-class of finitely connected planar domains: the test-function approach.” 2011. Doctoral Dissertation, Virginia Tech. Accessed November 25, 2020. http://hdl.handle.net/10919/27334.

MLA Handbook (7th Edition):

Guerra Huaman, Moises Daniel. “Schur-class of finitely connected planar domains: the test-function approach.” 2011. Web. 25 Nov 2020.

Vancouver:

Guerra Huaman MD. Schur-class of finitely connected planar domains: the test-function approach. [Internet] [Doctoral dissertation]. Virginia Tech; 2011. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10919/27334.

Council of Science Editors:

Guerra Huaman MD. Schur-class of finitely connected planar domains: the test-function approach. [Doctoral Dissertation]. Virginia Tech; 2011. Available from: http://hdl.handle.net/10919/27334

18. Kulis, Brian Joseph. Scalable kernel methods for machine learning.

Degree: PhD, Computer Sciences, 2008, University of Texas – Austin

 Machine learning techniques are now essential for a diverse set of applications in computer vision, natural language processing, software analysis, and many other domains. As… (more)
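
As background for the kernel-function terminology that runs through these records: a kernel implicitly defines an inner product in a feature space, k(x, y) = \langle \phi(x), \phi(y) \rangle, with common examples

\[
k_{\mathrm{poly}}(x,y) = \big(x^{\top}y + c\big)^{d},
\qquad
k_{\mathrm{rbf}}(x,y) = \exp\!\big(-\gamma\,\|x-y\|^{2}\big),
\]

so any algorithm written in terms of inner products can operate in the mapped space without ever forming \phi explicitly (the "kernel trick").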

Subjects/Keywords: Machine learning; Kernel functions


APA (6th Edition):

Kulis, B. J. (2008). Scalable kernel methods for machine learning. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/18243

Chicago Manual of Style (16th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Doctoral Dissertation, University of Texas – Austin. Accessed November 25, 2020. http://hdl.handle.net/2152/18243.

MLA Handbook (7th Edition):

Kulis, Brian Joseph. “Scalable kernel methods for machine learning.” 2008. Web. 25 Nov 2020.

Vancouver:

Kulis BJ. Scalable kernel methods for machine learning. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2008. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/2152/18243.

Council of Science Editors:

Kulis BJ. Scalable kernel methods for machine learning. [Doctoral Dissertation]. University of Texas – Austin; 2008. Available from: http://hdl.handle.net/2152/18243


Massey University

19. Almalki, Adel Ahmed. The analysis of fragmentation type equation for special division kernels.

Degree: PhD, Mathematics, 2020, Massey University

 The growth fragmentation equation is a linear integro-differential equation describing the evolution of cohorts that grow, divide and die or disappear in the course of… (more)

Subjects/Keywords: Cells; Growth; Cell division; Mathematical models; Boundary value problems; Kernel functions


APA (6th Edition):

Almalki, A. A. (2020). The analysis of fragmentation type equation for special division kernels. (Doctoral Dissertation). Massey University. Retrieved from http://hdl.handle.net/10179/15716

Chicago Manual of Style (16th Edition):

Almalki, Adel Ahmed. “The analysis of fragmentation type equation for special division kernels.” 2020. Doctoral Dissertation, Massey University. Accessed November 25, 2020. http://hdl.handle.net/10179/15716.

MLA Handbook (7th Edition):

Almalki, Adel Ahmed. “The analysis of fragmentation type equation for special division kernels.” 2020. Web. 25 Nov 2020.

Vancouver:

Almalki AA. The analysis of fragmentation type equation for special division kernels. [Internet] [Doctoral dissertation]. Massey University; 2020. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10179/15716.

Council of Science Editors:

Almalki AA. The analysis of fragmentation type equation for special division kernels. [Doctoral Dissertation]. Massey University; 2020. Available from: http://hdl.handle.net/10179/15716

20. Pereira, Luís Augusto Martins, 1981-. Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel.

Degree: 2018, Universidade Estadual de Campinas

Abstract: Domain shift is a phenomenon observed when two related domains – a source (training set) and a target domain (test set) – have a… (more)
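
One common kernel criterion in domain adaptation, given here only for orientation and not necessarily the one adopted in this thesis, is the maximum mean discrepancy between the source distribution P and the target distribution Q,

\[
\mathrm{MMD}^2(P,Q) \;=\; \mathbb{E}\big[k(x,x')\big] \;-\; 2\,\mathbb{E}\big[k(x,y)\big] \;+\; \mathbb{E}\big[k(y,y')\big],
\qquad x,x' \sim P,\;\; y,y' \sim Q,
\]

and subspace-learning approaches typically look for a projection of the data under which such a discrepancy between the two domains becomes small.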

Subjects/Keywords: Aprendizado de máquina; Reconhecimento de padrões; Kernel, Funções de; Algoritmos; Machine learning; Pattern recognition; Kernel functions; Algorithms


APA (6th Edition):

Pereira, L. A. M. (2018). Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel. (Thesis). Universidade Estadual de Campinas. Retrieved from http://repositorio.unicamp.br/jspui/handle/REPOSIP/335581

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Pereira, Luís Augusto Martins, 1981-. “Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel.” 2018. Thesis, Universidade Estadual de Campinas. Accessed November 25, 2020. http://repositorio.unicamp.br/jspui/handle/REPOSIP/335581.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Pereira, Luís Augusto Martins, 1981-. “Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel.” 2018. Web. 25 Nov 2020.

Vancouver:

Pereira LAM. Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel. [Internet] [Thesis]. Universidade Estadual de Campinas; 2018. [cited 2020 Nov 25]. Available from: http://repositorio.unicamp.br/jspui/handle/REPOSIP/335581.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Pereira LAM. Domain adaptation via subspace learning and kernel methods : Adaptação de domínio via aprendizado de subespaço e métodos de kernel. [Thesis]. Universidade Estadual de Campinas; 2018. Available from: http://repositorio.unicamp.br/jspui/handle/REPOSIP/335581

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Stellenbosch University

21. Melonas, Michail C. Projected naive bayes.

Degree: MCom, Statistics and Actuarial Science, 2020, Stellenbosch University

ENGLISH SUMMARY : Naïve Bayes is a well-known statistical model that is recognised by the Institute of Electrical and Electronics Engineers (IEEE) as being among… (more)
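
For context (the record's keywords pair naive Bayes with kernel density estimation), the kernel-smoothed naive Bayes classifier scores a point x = (x_1, ..., x_p) as

\[
\hat{p}(y \mid x) \;\propto\; \hat{p}(y)\prod_{j=1}^{p}\hat{p}(x_j \mid y),
\qquad
\hat{p}(x_j \mid y) \;=\; \frac{1}{n_y h}\sum_{i:\,y_i = y} K\!\left(\frac{x_j - x_{ij}}{h}\right),
\]

where K is a kernel density (often Gaussian), h a bandwidth, and n_y the number of training cases in class y.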

Subjects/Keywords: Bayesian statistical decision theory; Correspondence analysis (Statistics); Gaussian distribution; Kernel functions; Kernel density estimation; Data mining; Computer algorithms; UCTD


APA (6th Edition):

Melonas, M. C. (2020). Projected naive bayes. (Thesis). Stellenbosch University. Retrieved from http://hdl.handle.net/10019.1/107862

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Melonas, Michail C. “Projected naive bayes.” 2020. Thesis, Stellenbosch University. Accessed November 25, 2020. http://hdl.handle.net/10019.1/107862.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Melonas, Michail C. “Projected naive bayes.” 2020. Web. 25 Nov 2020.

Vancouver:

Melonas MC. Projected naive bayes. [Internet] [Thesis]. Stellenbosch University; 2020. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10019.1/107862.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Melonas MC. Projected naive bayes. [Thesis]. Stellenbosch University; 2020. Available from: http://hdl.handle.net/10019.1/107862

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

22. TABBARA, RAMI. Generalised directed walker models of adsorption and gelation.

Degree: 2015, University of Melbourne

 We outline an approach to constructing and solving models of highly interactive systems of single and multiple homopolymers, focusing on adsorption and gelation effects. In… (more)

Subjects/Keywords: polymers; polymer models; directed walkers; directed walks; kernel method; combinatorics; adsorption; gelation; generating functions; multiple walkers; obstinate kernel method


APA (6th Edition):

TABBARA, R. (2015). Generalised directed walker models of adsorption and gelation. (Doctoral Dissertation). University of Melbourne. Retrieved from http://hdl.handle.net/11343/55359

Chicago Manual of Style (16th Edition):

TABBARA, RAMI. “Generalised directed walker models of adsorption and gelation.” 2015. Doctoral Dissertation, University of Melbourne. Accessed November 25, 2020. http://hdl.handle.net/11343/55359.

MLA Handbook (7th Edition):

TABBARA, RAMI. “Generalised directed walker models of adsorption and gelation.” 2015. Web. 25 Nov 2020.

Vancouver:

TABBARA R. Generalised directed walker models of adsorption and gelation. [Internet] [Doctoral dissertation]. University of Melbourne; 2015. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/11343/55359.

Council of Science Editors:

TABBARA R. Generalised directed walker models of adsorption and gelation. [Doctoral Dissertation]. University of Melbourne; 2015. Available from: http://hdl.handle.net/11343/55359


Ryerson University

23. Athavale, Yashodhan Rajiv. Pattern classification of time-series signals using Fisher kernels and support vector machines.

Degree: 2010, Ryerson University

 The objective of this study is to assess the performance and capability of a kernel-based machine learning method for time-series signal classification. Applying various stages… (more)
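
For reference, the Fisher kernel compares two examples through the gradient of a generative model's log-likelihood,

\[
g_x \;=\; \nabla_{\theta}\log p(x \mid \theta),
\qquad
K(x, x') \;=\; g_x^{\top} F^{-1} g_{x'},
\]

where F is the Fisher information matrix (in practice often approximated, or replaced by the identity); the resulting kernel can be supplied directly to a support vector machine, which is the pairing this thesis evaluates on time-series signals.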

Subjects/Keywords: Kernel functions  – Evaluation; Kernel functions  – Testing; Machine learning; Osteoarthritis  – Diagnosis  – Computer programs; Stock exchanges  – Mathematical models; Signal processing; Time-series analysis  – Computer programs


APA (6th Edition):

Athavale, Y. R. (2010). Pattern classification of time-series signals using Fisher kernels and support vector machines. (Thesis). Ryerson University. Retrieved from https://digital.library.ryerson.ca/islandora/object/RULA%3A1262

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Athavale, Yashodhan Rajiv. “Pattern classification of time-series signals using Fisher kernels and support vector machines.” 2010. Thesis, Ryerson University. Accessed November 25, 2020. https://digital.library.ryerson.ca/islandora/object/RULA%3A1262.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Athavale, Yashodhan Rajiv. “Pattern classification of time-series signals using Fisher kernels and support vector machines.” 2010. Web. 25 Nov 2020.

Vancouver:

Athavale YR. Pattern classification of time-series signals using Fisher kernels and support vector machines. [Internet] [Thesis]. Ryerson University; 2010. [cited 2020 Nov 25]. Available from: https://digital.library.ryerson.ca/islandora/object/RULA%3A1262.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Athavale YR. Pattern classification of time-series signals using Fisher kernels and support vector machines. [Thesis]. Ryerson University; 2010. Available from: https://digital.library.ryerson.ca/islandora/object/RULA%3A1262

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Florida

24. Luery, Kristin E. Composition Operators on Hardy Spaces of the Disk and Half-Plane.

Degree: PhD, Mathematics, 2013, University of Florida

 In his work, J. H. Shapiro provided an integral formula for the Nevanlinna counting function and used it to prove many results for composition operators… (more)

Subjects/Keywords: Analytic functions; Analytics; Automorphisms; Hilbert spaces; Juries; Kernel functions; Log integral function; Mathematical theorems; Mathematics; Property composition; composition  – operator


APA (6th Edition):

Luery, K. E. (2013). Composition Operators on Hardy Spaces of the Disk and Half-Plane. (Doctoral Dissertation). University of Florida. Retrieved from https://ufdc.ufl.edu/UFE0045410

Chicago Manual of Style (16th Edition):

Luery, Kristin E. “Composition Operators on Hardy Spaces of the Disk and Half-Plane.” 2013. Doctoral Dissertation, University of Florida. Accessed November 25, 2020. https://ufdc.ufl.edu/UFE0045410.

MLA Handbook (7th Edition):

Luery, Kristin E. “Composition Operators on Hardy Spaces of the Disk and Half-Plane.” 2013. Web. 25 Nov 2020.

Vancouver:

Luery KE. Composition Operators on Hardy Spaces of the Disk and Half-Plane. [Internet] [Doctoral dissertation]. University of Florida; 2013. [cited 2020 Nov 25]. Available from: https://ufdc.ufl.edu/UFE0045410.

Council of Science Editors:

Luery KE. Composition Operators on Hardy Spaces of the Disk and Half-Plane. [Doctoral Dissertation]. University of Florida; 2013. Available from: https://ufdc.ufl.edu/UFE0045410


Brno University of Technology

25. Homoliak, Ivan. Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions.

Degree: 2019, Brno University of Technology

This publication explores how text-classification accuracy depends on the choice of string kernel function. String kernel functions are used here to measure the rate of similarity between… (more)
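
libSVM accepts precomputed kernel matrices, so a string kernel only needs to supply pairwise similarities between documents. A minimal Python sketch of one simple string kernel, the p-spectrum (k-mer) kernel, is shown below; the specific string kernel functions evaluated in the publication may well differ.

from collections import Counter

def spectrum_kernel(s, t, k=3):
    # p-spectrum string kernel: counts matching length-k substrings of s and t.
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[g] * ct[g] for g in cs.keys() & ct.keys())

Filling an n x n matrix with such values and (optionally) normalizing it yields a kernel that libSVM can consume in its precomputed-kernel mode.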

Subjects/Keywords: řetězcové funkce; libSVM; klasifikace; umělá inteligence; string kernel functions; libSVM; classification; artificial intelligence


APA (6th Edition):

Homoliak, I. (2019). Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions. (Thesis). Brno University of Technology. Retrieved from http://hdl.handle.net/11012/55997

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Homoliak, Ivan. “Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions.” 2019. Thesis, Brno University of Technology. Accessed November 25, 2020. http://hdl.handle.net/11012/55997.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Homoliak, Ivan. “Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions.” 2019. Web. 25 Nov 2020.

Vancouver:

Homoliak I. Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions. [Internet] [Thesis]. Brno University of Technology; 2019. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/11012/55997.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Homoliak I. Zvýšení úspěšnosti klasifikace v libSVM s použitím řetězcových fukcí: Increasing Classification Accuracy in libSVM Using String Kernel Functions. [Thesis]. Brno University of Technology; 2019. Available from: http://hdl.handle.net/11012/55997

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Delaware

26. Xu, Lifan. Android malware classification using parallelized machine learning methods.

Degree: PhD, Department of Computer and Information Sciences, 2016, University of Delaware

 Android is the most popular mobile operating system with a market share of over 80%. Due to its popularity and also its open source nature,… (more)

Subjects/Keywords: Smartphones.; Machine learning.; Malware (Computer software); Neural networks (Computer science); Kernel functions.


APA (6th Edition):

Xu, L. (2016). Android malware classification using parallelized machine learning methods. (Doctoral Dissertation). University of Delaware. Retrieved from http://udspace.udel.edu/handle/19716/19893

Chicago Manual of Style (16th Edition):

Xu, Lifan. “Android malware classification using parallelized machine learning methods.” 2016. Doctoral Dissertation, University of Delaware. Accessed November 25, 2020. http://udspace.udel.edu/handle/19716/19893.

MLA Handbook (7th Edition):

Xu, Lifan. “Android malware classification using parallelized machine learning methods.” 2016. Web. 25 Nov 2020.

Vancouver:

Xu L. Android malware classification using parallelized machine learning methods. [Internet] [Doctoral dissertation]. University of Delaware; 2016. [cited 2020 Nov 25]. Available from: http://udspace.udel.edu/handle/19716/19893.

Council of Science Editors:

Xu L. Android malware classification using parallelized machine learning methods. [Doctoral Dissertation]. University of Delaware; 2016. Available from: http://udspace.udel.edu/handle/19716/19893


Hong Kong University of Science and Technology

27. Tsang, Wai-Hung. Scaling up support vector machines.

Degree: 2007, Hong Kong University of Science and Technology

Kernel methods, such as support vector machines (SVMs), have been successfully used in various aspects of machine learning problems, such as classification, regression, and ranking… (more)
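
The scaling difficulty alluded to here comes from the standard soft-margin SVM training problem,

\[
\min_{w,\,b,\,\xi}\;\; \tfrac{1}{2}\|w\|^{2} + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad
y_i\big(w^{\top}\phi(x_i) + b\big) \ge 1 - \xi_i,\;\; \xi_i \ge 0,
\]

whose kernelized dual involves the full n x n kernel matrix; exact solvers therefore scale poorly with the number of training points, which is what motivates approximate and scalable training schemes of the kind studied in this thesis.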

Subjects/Keywords: Machine learning ; Kernel functions ; Mathematical optimization


APA (6th Edition):

Tsang, W. (2007). Scaling up support vector machines. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-3076 ; https://doi.org/10.14711/thesis-b987563 ; http://repository.ust.hk/ir/bitstream/1783.1-3076/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tsang, Wai-Hung. “Scaling up support vector machines.” 2007. Thesis, Hong Kong University of Science and Technology. Accessed November 25, 2020. http://repository.ust.hk/ir/Record/1783.1-3076 ; https://doi.org/10.14711/thesis-b987563 ; http://repository.ust.hk/ir/bitstream/1783.1-3076/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tsang, Wai-Hung. “Scaling up support vector machines.” 2007. Web. 25 Nov 2020.

Vancouver:

Tsang W. Scaling up support vector machines. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2007. [cited 2020 Nov 25]. Available from: http://repository.ust.hk/ir/Record/1783.1-3076 ; https://doi.org/10.14711/thesis-b987563 ; http://repository.ust.hk/ir/bitstream/1783.1-3076/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tsang W. Scaling up support vector machines. [Thesis]. Hong Kong University of Science and Technology; 2007. Available from: http://repository.ust.hk/ir/Record/1783.1-3076 ; https://doi.org/10.14711/thesis-b987563 ; http://repository.ust.hk/ir/bitstream/1783.1-3076/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Hong Kong University of Science and Technology

28. Zhang, Kai. Kernel-based clustering and low rank approximation.

Degree: 2008, Hong Kong University of Science and Technology

 Clustering is an unsupervised data exploration scenario that is of fundamental importance to pattern recognition and machine learning. This thesis involves two types of clustering… (more)
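
A standard low-rank device in this setting, offered as background rather than as the thesis's own construction, is the Nyström approximation: sample m columns of the kernel matrix and set

\[
K \;\approx\; C\, W^{\dagger} C^{\top},
\]

where C \in \mathbb{R}^{n \times m} holds the sampled columns, W \in \mathbb{R}^{m \times m} is their intersection with the corresponding rows, and W^{\dagger} is its pseudoinverse; this cuts storage from O(n^2) to O(nm) and makes kernel clustering feasible at larger scale.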

Subjects/Keywords: Cluster analysis ; Kernel functions ; Approximation theory


APA (6th Edition):

Zhang, K. (2008). Kernel-based clustering and low rank approximation. (Thesis). Hong Kong University of Science and Technology. Retrieved from http://repository.ust.hk/ir/Record/1783.1-3572 ; https://doi.org/10.14711/thesis-b1029252 ; http://repository.ust.hk/ir/bitstream/1783.1-3572/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Kai. “Kernel-based clustering and low rank approximation.” 2008. Thesis, Hong Kong University of Science and Technology. Accessed November 25, 2020. http://repository.ust.hk/ir/Record/1783.1-3572 ; https://doi.org/10.14711/thesis-b1029252 ; http://repository.ust.hk/ir/bitstream/1783.1-3572/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Kai. “Kernel-based clustering and low rank approximation.” 2008. Web. 25 Nov 2020.

Vancouver:

Zhang K. Kernel-based clustering and low rank approximation. [Internet] [Thesis]. Hong Kong University of Science and Technology; 2008. [cited 2020 Nov 25]. Available from: http://repository.ust.hk/ir/Record/1783.1-3572 ; https://doi.org/10.14711/thesis-b1029252 ; http://repository.ust.hk/ir/bitstream/1783.1-3572/1/th_redirect.html.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang K. Kernel-based clustering and low rank approximation. [Thesis]. Hong Kong University of Science and Technology; 2008. Available from: http://repository.ust.hk/ir/Record/1783.1-3572 ; https://doi.org/10.14711/thesis-b1029252 ; http://repository.ust.hk/ir/bitstream/1783.1-3572/1/th_redirect.html

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Arizona

29. Long, Andrew Edmund. Cokriging, kernels, and the SVD: Toward better geostatistical analysis.

Degree: 1994, University of Arizona

 Three forms of multivariate analysis, one very classical and the other two relatively new and little-known, are showcased and enhanced: the first is the Singular… (more)
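
For reference, the singular value decomposition named in the abstract factors any real n x p matrix as

\[
A \;=\; U\,\Sigma\,V^{\top},
\]

with orthogonal U and V and a nonnegative diagonal \Sigma; truncating to the leading singular values gives the best low-rank approximation in the Frobenius norm (the Eckart–Young theorem), one reason it recurs throughout multivariate analysis.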

Subjects/Keywords: Geology  – Statistical methods.; Kriging.; Kernel functions.


APA (6th Edition):

Long, A. E. (1994). Cokriging, kernels, and the SVD: Toward better geostatistical analysis. (Doctoral Dissertation). University of Arizona. Retrieved from http://hdl.handle.net/10150/186892

Chicago Manual of Style (16th Edition):

Long, Andrew Edmund. “Cokriging, kernels, and the SVD: Toward better geostatistical analysis. ” 1994. Doctoral Dissertation, University of Arizona. Accessed November 25, 2020. http://hdl.handle.net/10150/186892.

MLA Handbook (7th Edition):

Long, Andrew Edmund. “Cokriging, kernels, and the SVD: Toward better geostatistical analysis. ” 1994. Web. 25 Nov 2020.

Vancouver:

Long AE. Cokriging, kernels, and the SVD: Toward better geostatistical analysis. [Internet] [Doctoral dissertation]. University of Arizona; 1994. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10150/186892.

Council of Science Editors:

Long AE. Cokriging, kernels, and the SVD: Toward better geostatistical analysis. [Doctoral Dissertation]. University of Arizona; 1994. Available from: http://hdl.handle.net/10150/186892


Uniwersytet im. Adama Mickiewicza w Poznaniu

30. Deręgowski, Karol. Nieklasyczne metody konstrukcji składowych głównych .

Degree: 2011, Uniwersytet im. Adama Mickiewicza w Poznaniu

The dissertation is devoted to non-classical methods of constructing principal components. Chapter 1 presents the classical construction of principal components and their basic properties. Chapter 2… (more)
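
For reference, classical principal components are eigenvectors of the sample covariance matrix, while kernel PCA, one standard non-classical construction, extracts nonlinear components from the eigendecomposition of the centred Gram matrix:

\[
n\lambda\,\alpha \;=\; \tilde{K}\alpha,
\qquad
\tilde{K} = HKH,\;\; H = I_n - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top},
\]

with the projection of a new point obtained through kernel evaluations against the training sample. (Background only; the dissertation's constructions, including the functional-PCA variants in its keywords, are its own.)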

Subjects/Keywords: Nieliniowe składowe główne; Nonlinear PCA; Funkcje jądrowe; Functional PCA; Funkcjonalne składowe główne; Kernel functions


APA (6th Edition):

Deręgowski, K. (2011). Nieklasyczne metody konstrukcji składowych głównych . (Doctoral Dissertation). Uniwersytet im. Adama Mickiewicza w Poznaniu. Retrieved from http://hdl.handle.net/10593/1495

Chicago Manual of Style (16th Edition):

Deręgowski, Karol. “Nieklasyczne metody konstrukcji składowych głównych .” 2011. Doctoral Dissertation, Uniwersytet im. Adama Mickiewicza w Poznaniu. Accessed November 25, 2020. http://hdl.handle.net/10593/1495.

MLA Handbook (7th Edition):

Deręgowski, Karol. “Nieklasyczne metody konstrukcji składowych głównych .” 2011. Web. 25 Nov 2020.

Vancouver:

Deręgowski K. Nieklasyczne metody konstrukcji składowych głównych . [Internet] [Doctoral dissertation]. Uniwersytet im. Adama Mickiewicza w Poznaniu; 2011. [cited 2020 Nov 25]. Available from: http://hdl.handle.net/10593/1495.

Council of Science Editors:

Deręgowski K. Nieklasyczne metody konstrukcji składowych głównych . [Doctoral Dissertation]. Uniwersytet im. Adama Mickiewicza w Poznaniu; 2011. Available from: http://hdl.handle.net/10593/1495
