Advanced search options

You searched for +publisher:"University of Illinois – Urbana-Champaign" +contributor:("Poole, David"). One record found.

No search limiters apply to these results.


University of Illinois – Urbana-Champaign

1. Choi, Jaesik. Lifted Inference for Relational Hybrid Models.

Degree: PhD, 0112, 2012, University of Illinois – Urbana-Champaign

Probabilistic Graphical Models (PGMs) promise to play a prominent role in many complex real-world systems. Probabilistic Relational Graphical Models (PRGMs) scale up the representation and learning of PGMs. Answering questions with PRGMs enables many current and future applications, such as medical informatics, environmental engineering, financial forecasting, and robot localization. Scaling inference algorithms to large models is a key challenge both for current applications and for enabling future ones. This thesis presents new insights into large-scale probabilistic graphical models and fresh ideas for maintaining a compact structure when answering queries (performing inference) over large, continuous models. These insights lead to a key contribution, the Lifted Relational Kalman filter (LRKF), an efficient estimation algorithm for large-scale linear dynamic systems. The thesis shows that the relational Kalman filter scales exact Kalman filtering from 1,000 to 1,000,000,000 variables. Another key contribution is a proof that commonly used probabilistic first-order languages, including Markov Logic Networks (MLNs) and First-Order Probabilistic Models (FOPMs), can be reduced to compact probabilistic graphical representations under reasonable conditions. Specifically, the thesis shows that aggregate operators and existential quantification in these languages are accurately approximated by linear constraints in the Gaussian distribution. In general, probabilistic first-order languages are transformed into nonparametric variational models in which lifted inference algorithms can solve inference problems efficiently.

Advisors/Committee Members: Amir, Eyal (advisor), Amir, Eyal (committee chair), Roth, Dan (committee member), LaValle, Steven M. (committee member), Poole, David (committee member).
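For orientation, the sketch below shows a single predict/update step of a standard (propositional) Kalman filter, the exact filter that the LRKF lifts. It is a minimal illustration, not code from the thesis: the function kalman_step, the matrices, and all numerical values are assumed for the example. Because each step multiplies full covariance matrices and inverts the innovation covariance, the cost grows roughly cubically with the number of state variables; per the abstract, the LRKF sidesteps this by updating groups of interchangeable relational variables jointly, which is how it reaches the 1,000,000,000-variable scale.

```python
# Minimal sketch (assumed, not from the thesis): one step of a standard
# propositional Kalman filter for a linear-Gaussian dynamic system.
import numpy as np

def kalman_step(x, P, A, Q, H, R, z):
    """Predict with dynamics (A, Q), then update with observation z under (H, R)."""
    # Predict: propagate the state mean and covariance through the linear dynamics.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q

    # Update: fold the observation in via the Kalman gain.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: two state variables, one noisy observation of the first.
x0 = np.zeros(2)
P0 = np.eye(2)
A  = np.array([[1.0, 0.1],
               [0.0, 1.0]])
Q  = 0.01 * np.eye(2)
H  = np.array([[1.0, 0.0]])
R  = np.array([[0.5]])
x1, P1 = kalman_step(x0, P0, A, Q, H, R, z=np.array([1.2]))
```

In a relational model many such state variables are interchangeable (for example, identical sensors), so their rows and columns of the covariance matrix carry identical values; exploiting that symmetry so the update is computed once per group rather than once per variable is the essence of lifted inference.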

Subjects/Keywords: Probabilistic Graphical Models; Relational Hybrid Models; Lifted Inference; First-Order Probabilistic Models; Probabilistic Logic; Kalman filter; Relational Kalman filter; Variational Learning; Markov Logic Networks

APA (6th Edition):

Choi, J. (2012). Lifted Inference for Relational Hybrid Models. (Doctoral dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/32004

Chicago Manual of Style (16th Edition):

Choi, Jaesik. “Lifted Inference for Relational Hybrid Models.” 2012. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed December 09, 2019. http://hdl.handle.net/2142/32004.

MLA Handbook (7th Edition):

Choi, Jaesik. “Lifted Inference for Relational Hybrid Models.” 2012. Web. 09 Dec 2019.

Vancouver:

Choi J. Lifted Inference for Relational Hybrid Models. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2012. [cited 2019 Dec 09]. Available from: http://hdl.handle.net/2142/32004.

Council of Science Editors:

Choi J. Lifted Inference for Relational Hybrid Models. [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2012. Available from: http://hdl.handle.net/2142/32004
