You searched for +publisher:"Cornell University" +contributor:("Wilson, Andrew G"). Showing records 1 – 2 of 2 total matches.

Cornell University

1. Wang, Ke Alexander. Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors.

Degree: M.S., Computer Science, Computer Science, 2020, Cornell University

Intelligent systems that interact with the physical world must model the underlying dynamics accurately in order to make informed decisions. This requires dynamics models that are scalable enough to learn from large amounts of data, robust enough to handle noisy or scarce data, and flexible enough to capture the true dynamics of arbitrary systems. Gaussian processes and neural networks each have desirable properties that make them candidates for this task, but neither meets all of the above criteria: Gaussian processes do not scale well computationally to large datasets, and current neural networks do not generalize well to complex physical systems. In this thesis, we present two methods that help address these shortcomings. First, we present a practical method that scales exact Gaussian process inference to over a million data points using GPU parallelism, a hundred times more than previous methods. In addition, our method outperforms other scalable Gaussian processes while maintaining similar or faster training times. Second, we present a method that lowers the burden of learning physical systems for neural networks by representing constraints explicitly and by using coordinate systems that simplify the functions to be learned. Our method yields models that are a hundred times more accurate than competing baselines while being a hundred times more data-efficient. Advisors/Committee Members: Wilson, Andrew G (chair), Kleinberg, Robert D (committee member).
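The computational idea behind scaling exact GP inference, as described in the abstract, is to replace the O(n^3) Cholesky factorization with iterative solvers that need only matrix-vector products, the operation that parallelizes well on GPUs. The following is a minimal NumPy sketch of that idea, not the thesis code; the kernel, data, and tolerances are illustrative choices:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between two sets of points.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def cg_solve(matvec, b, tol=1e-6, max_iter=1000):
    # Conjugate gradients: solves A x = b using only matrix-vector products,
    # so A is never factorized or even stored in factored form.
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

noise = 0.1                                    # observation noise variance
K = rbf_kernel(X, X) + noise * np.eye(len(X))
alpha = cg_solve(lambda v: K @ v, y)           # alpha = K^{-1} y, no Cholesky

X_test = np.linspace(-3.0, 3.0, 50)[:, None]
mean = rbf_kernel(X_test, X) @ alpha           # GP posterior mean at test points
```

At scale, the matrix-vector products are batched across GPUs rather than computed densely as here; the solver structure is the same.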

Subjects/Keywords: exact inference; Gaussian process; hamiltonian; lagrangian; neural networks; physics priors
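The second contribution, keeping constraints explicit in a simple coordinate system rather than eliminating them via generalized coordinates, can be illustrated on a pendulum simulated directly in Cartesian coordinates. This is only an analogy for the approach (the thesis learns the dynamics with neural networks; here the physics is hard-coded), with the constraint |x| = L enforced through a Lagrange multiplier:

```python
import numpy as np

# Pendulum of length L simulated directly in Cartesian coordinates (x in R^2),
# with the holonomic constraint |x| = L kept explicit via a Lagrange multiplier
# instead of being eliminated by switching to the angle coordinate.
m, L, g = 1.0, 1.0, 9.81
dt, steps = 1e-4, 10000

x = np.array([L, 0.0])              # released from rest, horizontal
v = np.array([0.0, 0.0])
f_grav = np.array([0.0, -m * g])

for _ in range(steps):
    # Differentiating |x|^2 = L^2 twice gives x . a = -|v|^2; combined with
    # m a = f_grav + lam * x, this fixes the multiplier lam in closed form.
    lam = -(m * (v @ v) + x @ f_grav) / (x @ x)
    a = (f_grav + lam * x) / m
    v = v + dt * a                  # semi-implicit Euler keeps drift small
    x = x + dt * v
```

The dynamics stay on the constraint surface and conserve energy to good accuracy, even though the state lives in plain Euclidean coordinates where the forces are simple functions.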



APA (6th Edition):

Wang, K. A. (2020). Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors. (Masters Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/70285

Chicago Manual of Style (16th Edition):

Wang, Ke Alexander. “Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors.” 2020. Masters Thesis, Cornell University. Accessed May 11, 2021. http://hdl.handle.net/1813/70285.

MLA Handbook (7th Edition):

Wang, Ke Alexander. “Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors.” 2020. Web. 11 May 2021.

Vancouver:

Wang KA. Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors. [Internet] [Masters thesis]. Cornell University; 2020. [cited 2021 May 11]. Available from: http://hdl.handle.net/1813/70285.

Council of Science Editors:

Wang KA. Large Scale Exact Gaussian Processes Inference and Euclidean Constrained Neural Networks with Physics Priors. [Masters Thesis]. Cornell University; 2020. Available from: http://hdl.handle.net/1813/70285


Cornell University

2. Delbridge, Ian Andrew. Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes.

Degree: M.S., Computer Science, Computer Science, 2020, Cornell University

Gaussian processes are powerful Bayesian non-parametric models prized for their closed-form posteriors and marginal likelihoods. Yet traditional methods for computing these quantities do not scale to large data sets, and traditional kernels often lack the inductive biases that allow sample-efficient learning. Finding and exploiting structure in Gaussian processes builds a path towards tackling both problems, but this has traditionally been accomplished through burdensome learning procedures, often by maximizing the marginal likelihood or variational objectives. In this thesis, we present methods for creating and exploiting structure in Gaussian processes directly. We first present randomly projected additive Gaussian processes, a class of Gaussian processes whose kernels operate additively over a set of random projections of the data. We prove that these kernels converge almost surely to a limiting kernel, which can be derived analytically in certain cases, and we derive error bounds that characterize the convergence rate. We also propose modifications to randomly projected additive Gaussian processes that improve their empirical convergence rate and regression performance. Next, we present algorithms for efficient online Gaussian process inference: the marginal likelihood, score function, and predictive distributions can each be computed in constant time with respect to the number of observed data points, by combining kernel interpolation approximations with algebraic manipulations via the Woodbury matrix identity. Advisors/Committee Members: Wilson, Andrew G (chair), Samorodnitsky, Gennady (committee member).
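A randomly projected additive kernel of the kind the abstract describes can be sketched in a few lines: each additive component is an ordinary RBF kernel applied to a random one-dimensional projection of the inputs. The particular directions, weights, and scales below are illustrative choices, not the exact construction analyzed in the thesis:

```python
import numpy as np

def rbf_1d(a, b, lengthscale=1.0):
    # One-dimensional RBF kernel matrix between two vectors of scalars.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def rpa_kernel(X1, X2, projections):
    # Additive kernel over random 1-D projections:
    #   k(x, x') = (1/J) * sum_j k_rbf(w_j . x, w_j . x')
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for w in projections:
        K += rbf_1d(X1 @ w, X2 @ w)
    return K / len(projections)

rng = np.random.default_rng(0)
d, J = 5, 20
W = rng.standard_normal((J, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)   # unit-norm directions

X = rng.standard_normal((30, d))
K = rpa_kernel(X, X, W)
```

Because each component is a valid kernel on a projection of the data, the average is itself a valid (positive semi-definite) kernel, and each component only ever sees one-dimensional inputs.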

Subjects/Keywords: additive; Gaussian Process; online; projected; streaming
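The streaming updates mentioned in the abstract rest on the Woodbury matrix identity; its rank-one special case, the Sherman-Morrison formula, already shows the core idea of updating an inverse incrementally instead of recomputing it. A standalone sketch follows (the thesis combines this with kernel interpolation structure, which is not shown here):

```python
import numpy as np

def sherman_morrison(A_inv, u):
    # Rank-one Woodbury update: (A + u u^T)^{-1} from A^{-1} in O(n^2)
    # arithmetic, versus O(n^3) for recomputing the inverse from scratch.
    Au = A_inv @ u
    return A_inv - np.outer(Au, Au) / (1.0 + u @ Au)

rng = np.random.default_rng(0)
n = 8
B = rng.standard_normal((n, n))
A = np.eye(n) + B @ B.T     # SPD, so 1 + u^T A^{-1} u > 0: update well-posed
A_inv = np.linalg.inv(A)
u = rng.standard_normal(n)

updated = sherman_morrison(A_inv, u)
```

Applied to structured kernel approximations, updates of this form let each new observation be absorbed without revisiting the previously seen data.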



APA (6th Edition):

Delbridge, I. A. (2020). Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes. (Masters Thesis). Cornell University. Retrieved from http://hdl.handle.net/1813/70303

Chicago Manual of Style (16th Edition):

Delbridge, Ian Andrew. “Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes.” 2020. Masters Thesis, Cornell University. Accessed May 11, 2021. http://hdl.handle.net/1813/70303.

MLA Handbook (7th Edition):

Delbridge, Ian Andrew. “Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes.” 2020. Web. 11 May 2021.

Vancouver:

Delbridge IA. Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes. [Internet] [Masters thesis]. Cornell University; 2020. [cited 2021 May 11]. Available from: http://hdl.handle.net/1813/70303.

Council of Science Editors:

Delbridge IA. Randomly Projected Additive Gaussian Processes and Fast Streaming Gaussian Processes. [Masters Thesis]. Cornell University; 2020. Available from: http://hdl.handle.net/1813/70303
