You searched for id:"handle:1773/44667". One record found.

University of Washington

1. Powers, Thomas. Differentiable and Robust Optimization Algorithms.

Degree: PhD, 2019, University of Washington

Imposing appropriate structure or constraints on optimization problems is often the key to deriving guarantees or improving performance on aspects like generalization or interpretability. The main contribution of this dissertation is developing algorithms that can leverage underlying submodular or sparse structure to do robust optimization: unfolded discrete and continuous optimization algorithms and robust submodular optimization.

While deep neural networks (DNNs) continue to advance the state of the art over traditional statistical generative models for many tasks in fields such as speech and audio processing, natural language processing, and computer vision, many of the most popular architectures are difficult to analyze and require large amounts of data to achieve such good performance. The choice between a DNN and a generative model need not be binary, however. Using a technique called deep unfolding, inference algorithms for generative models can be turned into DNNs and trained discriminatively. Such models can also leverage sparsity, submodularity, or other regularization frameworks while still making use of domain knowledge integrated into the underlying model.

Subset selection problems are important for many applications in machine learning, such as feature and data selection, dictionary learning, compression, and sparse recovery. While these problems are generally NP-hard, if the objective function is submodular (or in some cases has submodular structure), a near-optimal solution can be found in polynomial time. Deep learning and subset selection overlap when sparse models are used: for DNNs and other continuous models, sparsity is typically induced via a penalty function such as the ℓ1 norm, whereas for subset selection algorithms it is induced via a hard cardinality constraint.

This dissertation explores algorithms around the intersection of subset selection and deep learning. Broadly, two sets of algorithms are presented. In the first, algorithms are designed to approximately solve both submodular and non-submodular optimization problems; these are applied to sensor scheduling and placement problems. In the second, deep unfolding is used to turn inference algorithms into DNNs. These unfolded models have a number of advantages over conventional DNN architectures: they show competitive or improved performance on a variety of tasks, admit principled initializations, and tend to need less data. Advisors/Committee Members: Atlas, Les (advisor).
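The dissertation itself is not excerpted here, but two of the techniques the abstract names can be made concrete with short sketches. First, a minimal greedy routine for maximizing a monotone submodular function under a cardinality constraint, the classic approach behind the near-optimal subset selection the abstract mentions. The coverage objective and its data are hypothetical, chosen only to echo the sensor-placement application:

```python
def greedy_max(f, ground_set, k):
    """Greedily maximize a set function f subject to |S| <= k.

    If f is monotone and submodular, the greedy solution is within a
    factor (1 - 1/e) of optimal (Nemhauser, Wolsey & Fisher, 1978),
    which is the polynomial-time near-optimality guarantee referred
    to in the abstract.
    """
    S = []
    for _ in range(k):
        # Pick the element with the largest marginal gain f(S + e) - f(S).
        gain, best = max((f(S + [e]) - f(S), e)
                         for e in ground_set if e not in S)
        if gain <= 0:          # no remaining element helps
            break
        S.append(best)
    return S

# Toy sensor-placement-style objective (hypothetical data): each sensor
# covers a set of targets, and f counts how many targets are covered.
coverage = {0: {1, 2}, 1: {2, 3, 4}, 2: {4, 5}, 3: {0, 1, 5}}
f = lambda S: len(set().union(*(coverage[i] for i in S))) if S else 0
print(greedy_max(f, list(coverage), k=2))   # -> [3, 1], covering all six targets
```

Second, a sketch of deep unfolding in its best-known instance: the ISTA iteration for ℓ1-regularized least squares, unrolled to a fixed depth so that each iteration reads as a network layer. In LISTA-style training the per-layer weights and thresholds would be learned discriminatively; this sketch keeps them at their ISTA-derived values, so it shows only the forward pass of the untrained network:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unfolded_ista(A, y, depth, lam=0.1):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1, unrolled so each
    iteration becomes a layer:  x <- soft_threshold(W2 @ x + W1 @ y, theta).

    Deep unfolding (e.g. LISTA) would make W1, W2, theta trainable
    per-layer parameters; here they keep their ISTA-derived values.
    """
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
    W1 = A.T / L                               # "input" weight
    W2 = np.eye(A.shape[1]) - A.T @ A / L      # "recurrent" weight
    theta = lam / L                            # threshold (learnable in LISTA)
    x = np.zeros(A.shape[1])
    for _ in range(depth):                     # one iteration == one layer
        x = soft_threshold(W2 @ x + W1 @ y, theta)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [1.0, -2.0, 1.5]
x_hat = unfolded_ista(A, y=A @ x_true, depth=200)
print(np.argsort(-np.abs(x_hat))[:3])          # expected to recover the support {3, 17, 42}
```

Once unrolled this way, the iteration count becomes a depth hyperparameter and the matrices become ordinary weights, which is what lets domain knowledge from the generative model coexist with discriminative training.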

Subjects/Keywords: deep unfolding; sparsity; submodularity; Electrical engineering; Artificial intelligence


APA · Chicago · MLA · Vancouver · CSE

APA (6th Edition):

Powers, T. (2019). Differentiable and Robust Optimization Algorithms. (Doctoral Dissertation). University of Washington. Retrieved from http://hdl.handle.net/1773/44667

Chicago Manual of Style (16th Edition):

Powers, Thomas. “Differentiable and Robust Optimization Algorithms.” 2019. Doctoral Dissertation, University of Washington. Accessed December 08, 2019. http://hdl.handle.net/1773/44667.

MLA Handbook (7th Edition):

Powers, Thomas. “Differentiable and Robust Optimization Algorithms.” 2019. Web. 08 Dec 2019.

Vancouver:

Powers T. Differentiable and Robust Optimization Algorithms. [Internet] [Doctoral dissertation]. University of Washington; 2019. [cited 2019 Dec 08]. Available from: http://hdl.handle.net/1773/44667.

Council of Science Editors:

Powers T. Differentiable and Robust Optimization Algorithms. [Doctoral Dissertation]. University of Washington; 2019. Available from: http://hdl.handle.net/1773/44667
