
You searched for subject:(Relational Hybrid Models). One record found.


1. Choi, Jaesik. Lifted Inference for Relational Hybrid Models.

Degree: PhD, 0112, 2012, University of Illinois – Urbana-Champaign

Probabilistic Graphical Models (PGMs) promise to play a prominent role in many complex real-world systems. Probabilistic Relational Graphical Models (PRGMs) scale up the representation and learning of PGMs. Answering questions using PRGMs enables many current and future applications, such as medical informatics, environmental engineering, financial forecasting, and robot localization. Scaling inference algorithms to large models is a key challenge for scaling up current applications and enabling future ones. This thesis presents new insights into large-scale probabilistic graphical models and fresh ideas for maintaining a compact structure when answering queries over large, continuous models. These insights lead to a key contribution, the Lifted Relational Kalman filter (LRKF), an efficient estimation algorithm for large-scale linear dynamic systems, which scales the exact vanilla Kalman filter from 1,000 to 1,000,000,000 variables. Another key contribution is a proof that commonly used probabilistic first-order languages, including Markov Logic Networks (MLNs) and First-Order Probabilistic Models (FOPMs), can be reduced to compact probabilistic graphical representations under reasonable conditions. Specifically, the thesis shows that aggregate operators and existential quantification in these languages are accurately approximated by linear constraints in the Gaussian distribution. In general, probabilistic first-order languages are transformed into nonparametric variational models in which lifted inference algorithms can efficiently solve inference problems.

Advisors/Committee Members: Amir, Eyal (advisor, committee chair); Roth, Dan (committee member); LaValle, Steven M. (committee member); Poole, David (committee member).
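The scaling claim in the abstract rests on the lifted-inference idea that exchangeable variables can share one computation. A toy sketch of that idea (Python; hypothetical illustration only, not the thesis's LRKF, which handles full relational models rather than this scalar case):

```python
# Illustrative sketch (hypothetical, not the LRKF itself): when N state
# variables are exchangeable -- identical dynamics, noise, and observation
# models -- one Kalman update computed for a single representative applies
# to the whole block, so per-step cost is independent of N.

def kalman_step(mean, var, z, q=0.1, r=0.5):
    """One predict/update cycle of a scalar Kalman filter (random-walk model).

    q is the process-noise variance, r the observation-noise variance.
    """
    # Predict: x_t = x_{t-1} + process noise
    pred_mean, pred_var = mean, var + q
    # Update with observation z (identity observation model)
    k = pred_var / (pred_var + r)              # Kalman gain
    return pred_mean + k * (z - pred_mean), (1.0 - k) * pred_var

# Ground filtering repeats this once per variable; lifted filtering
# performs it once per exchangeable block, however large the block is.
n_vars = 1_000_000                             # size of one exchangeable block
mean, var = kalman_step(0.0, 1.0, z=2.0)       # one update covers all n_vars
```

The posterior (mean, var) pair is identical for every variable in the block, which is why the block can be represented once rather than a million times.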

Subjects/Keywords: Probabilistic Graphical Models; Relational Hybrid Models; Lifted Inference; First-Order Probabilistic Models; Probabilistic Logic; Kalman filter; Relational Kalman filter; Variational Learning; Markov Logic Networks



APA (6th Edition):

Choi, J. (2012). Lifted Inference for Relational Hybrid Models. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from

Chicago Manual of Style (16th Edition):

Choi, Jaesik. “Lifted Inference for Relational Hybrid Models.” 2012. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed March 28, 2020.

MLA Handbook (7th Edition):

Choi, Jaesik. “Lifted Inference for Relational Hybrid Models.” 2012. Web. 28 Mar 2020.


Vancouver:

Choi J. Lifted Inference for Relational Hybrid Models. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2012. [cited 2020 Mar 28]. Available from:

Council of Science Editors:

Choi J. Lifted Inference for Relational Hybrid Models. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2012. Available from: