
You searched for `subject:(non convex optimization)`. Showing records 1 – 30 of 93 total matches.


1. Uthayakumar, R. Study on convergence of optimization problems;.

Degree: 2014, INFLIBNET

URL: http://shodhganga.inflibnet.ac.in/handle/10603/17964


In this thesis, various notions of convergence of sequence of sets and functions and their applications in the convergence of the optimal values under the… (more)

Subjects/Keywords: Convergence; Convex; Functions; Non-convex; Optimization; Sets

APA (6th Edition):

Uthayakumar, R. (2014). Study on convergence of optimization problems;. (Thesis). INFLIBNET. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/17964

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Thesis, INFLIBNET. Accessed April 17, 2021. http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Web. 17 Apr 2021.

Vancouver:

Uthayakumar R. Study on convergence of optimization problems;. [Internet] [Thesis]. INFLIBNET; 2014. [cited 2021 Apr 17]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Uthayakumar R. Study on convergence of optimization problems;. [Thesis]. INFLIBNET; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964

Not specified: Masters Thesis or Doctoral Dissertation

University of Texas – Austin

2. Park, Dohyung. Efficient non-convex algorithms for large-scale learning problems.

Degree: PhD, Electrical and Computer Engineering, 2016, University of Texas – Austin

URL: http://hdl.handle.net/2152/46581

The emergence of modern large-scale datasets has led to a huge interest in the problem of learning hidden complex structures. Not only can models from… (more)

Subjects/Keywords: Machine learning; Non-convex optimization

APA (6th Edition):

Park, D. (2016). Efficient non-convex algorithms for large-scale learning problems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/46581

Chicago Manual of Style (16th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 17, 2021. http://hdl.handle.net/2152/46581.

MLA Handbook (7th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Web. 17 Apr 2021.

Vancouver:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2152/46581.

Council of Science Editors:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/46581

Princeton University

3. Bullins, Brian Anderson. Efficient Higher-Order Optimization for Machine Learning.

Degree: PhD, 2019, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

In recent years, stochastic gradient descent (SGD) has taken center stage for training large-scale models in machine learning. Although some higher-order methods have improved iteration… (more)

Subjects/Keywords: convex optimization; higher-order; machine learning; non-convex optimization; second-order

APA (6th Edition):

Bullins, B. A. (2019). Efficient Higher-Order Optimization for Machine Learning. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

Chicago Manual of Style (16th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning.” 2019. Doctoral Dissertation, Princeton University. Accessed April 17, 2021. http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

MLA Handbook (7th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning.” 2019. Web. 17 Apr 2021.

Vancouver:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning. [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2021 Apr 17]. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

Council of Science Editors:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning. [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

Carnegie Mellon University

4. Xiong, Xuehan. Supervised Descent Method.

Degree: 2015, Carnegie Mellon University

URL: http://repository.cmu.edu/dissertations/652

In this dissertation, we focus on solving Nonlinear Least Squares problems using a supervised approach. In particular, we developed a Supervised Descent Method (SDM), performed… (more)

Subjects/Keywords: nonlinear optimization; global optimization; non-convex optimization; nonlinear least squares; face alignment; facial feature tracking

APA (6th Edition):

Xiong, X. (2015). Supervised Descent Method. (Thesis). Carnegie Mellon University. Retrieved from http://repository.cmu.edu/dissertations/652

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Xiong, Xuehan. “Supervised Descent Method.” 2015. Thesis, Carnegie Mellon University. Accessed April 17, 2021. http://repository.cmu.edu/dissertations/652.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Xiong, Xuehan. “Supervised Descent Method.” 2015. Web. 17 Apr 2021.

Vancouver:

Xiong X. Supervised Descent Method. [Internet] [Thesis]. Carnegie Mellon University; 2015. [cited 2021 Apr 17]. Available from: http://repository.cmu.edu/dissertations/652.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Xiong X. Supervised Descent Method. [Thesis]. Carnegie Mellon University; 2015. Available from: http://repository.cmu.edu/dissertations/652

Not specified: Masters Thesis or Doctoral Dissertation

University of Minnesota

5. Asiaeetaheri, Amir. High Dimensional Learning with Structure Inducing Constraints and Regularizers.

Degree: PhD, Computer Science, 2017, University of Minnesota

URL: http://hdl.handle.net/11299/191407

Explosive growth in data generation through science and technology calls for new computational and analytical tools. To the statistical machine learning community, one major challenge… (more)

Subjects/Keywords: Convex Optimization; High Dimensional Learning; Influence Maximization; Non-asymptotic Error Bound

APA (6th Edition):

Asiaeetaheri, A. (2017). High Dimensional Learning with Structure Inducing Constraints and Regularizers. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/191407

Chicago Manual of Style (16th Edition):

Asiaeetaheri, Amir. “High Dimensional Learning with Structure Inducing Constraints and Regularizers.” 2017. Doctoral Dissertation, University of Minnesota. Accessed April 17, 2021. http://hdl.handle.net/11299/191407.

MLA Handbook (7th Edition):

Asiaeetaheri, Amir. “High Dimensional Learning with Structure Inducing Constraints and Regularizers.” 2017. Web. 17 Apr 2021.

Vancouver:

Asiaeetaheri A. High Dimensional Learning with Structure Inducing Constraints and Regularizers. [Internet] [Doctoral dissertation]. University of Minnesota; 2017. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/11299/191407.

Council of Science Editors:

Asiaeetaheri A. High Dimensional Learning with Structure Inducing Constraints and Regularizers. [Doctoral Dissertation]. University of Minnesota; 2017. Available from: http://hdl.handle.net/11299/191407

King Abdullah University of Science and Technology

6. Alabbasi, AbdulRahman. Towards Energy Efficient Cognitive Radio Systems.

Degree: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division, 2016, King Abdullah University of Science and Technology

URL: http://hdl.handle.net/10754/617094

Cognitive radio (CR) is a cutting-edge wireless communication technology that adopts several existing communication concepts in order to efficiently utilize the spectrum and meet the… (more)

Subjects/Keywords: Wireless communication; Cognitive Radio; Green communication; Outage Probability; non convex optimization

APA (6th Edition):

Alabbasi, A. (2016). Towards Energy Efficient Cognitive Radio Systems. (Thesis). King Abdullah University of Science and Technology. Retrieved from http://hdl.handle.net/10754/617094

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Alabbasi, AbdulRahman. “Towards Energy Efficient Cognitive Radio Systems.” 2016. Thesis, King Abdullah University of Science and Technology. Accessed April 17, 2021. http://hdl.handle.net/10754/617094.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Alabbasi, AbdulRahman. “Towards Energy Efficient Cognitive Radio Systems.” 2016. Web. 17 Apr 2021.

Vancouver:

Alabbasi A. Towards Energy Efficient Cognitive Radio Systems. [Internet] [Thesis]. King Abdullah University of Science and Technology; 2016. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/10754/617094.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Alabbasi A. Towards Energy Efficient Cognitive Radio Systems. [Thesis]. King Abdullah University of Science and Technology; 2016. Available from: http://hdl.handle.net/10754/617094

Not specified: Masters Thesis or Doctoral Dissertation

Cornell University

7. Qian, Wei. Local Minima in Mixture Problems and their Algorithmic Implications.

Degree: PhD, Operations Research and Information Engineering, 2020, Cornell University

URL: http://hdl.handle.net/1813/102985

We study the location estimation problem for a balanced mixture of k distributions. The geometry of this non-convex optimization problem differs according to the number… (more)

Subjects/Keywords: Expectation Maximization; Gaussian Mixture Model; K-means; Non Convex Optimization

APA (6th Edition):

Qian, W. (2020). Local Minima in Mixture Problems and their Algorithmic Implications. (Doctoral Dissertation). Cornell University. Retrieved from http://hdl.handle.net/1813/102985

Chicago Manual of Style (16th Edition):

Qian, Wei. “Local Minima in Mixture Problems and their Algorithmic Implications.” 2020. Doctoral Dissertation, Cornell University. Accessed April 17, 2021. http://hdl.handle.net/1813/102985.

MLA Handbook (7th Edition):

Qian, Wei. “Local Minima in Mixture Problems and their Algorithmic Implications.” 2020. Web. 17 Apr 2021.

Vancouver:

Qian W. Local Minima in Mixture Problems and their Algorithmic Implications. [Internet] [Doctoral dissertation]. Cornell University; 2020. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/1813/102985.

Council of Science Editors:

Qian W. Local Minima in Mixture Problems and their Algorithmic Implications. [Doctoral Dissertation]. Cornell University; 2020. Available from: http://hdl.handle.net/1813/102985

University of Ontario Institute of Technology

8. Rokhsatyazdi, Ehsan. Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems.

Degree: 2020, University of Ontario Institute of Technology

URL: http://hdl.handle.net/10155/1234

In engineering and science, optimization plays a vital role in many real-world applications. In this work, several novel optimization algorithms based on Coordinate Search (CS)… (more)

Subjects/Keywords: Coordinate-search; Gradient-free; Non-convex; Neural-network; Large-scale optimization

APA (6th Edition):

Rokhsatyazdi, E. (2020). Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/1234

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Rokhsatyazdi, Ehsan. “Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems.” 2020. Thesis, University of Ontario Institute of Technology. Accessed April 17, 2021. http://hdl.handle.net/10155/1234.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Rokhsatyazdi, Ehsan. “Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems.” 2020. Web. 17 Apr 2021.

Vancouver:

Rokhsatyazdi E. Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems. [Internet] [Thesis]. University of Ontario Institute of Technology; 2020. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/10155/1234.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Rokhsatyazdi E. Proposing effective coordinate search methods for solving large-scale expensive black-box optimization problems. [Thesis]. University of Ontario Institute of Technology; 2020. Available from: http://hdl.handle.net/10155/1234

Not specified: Masters Thesis or Doctoral Dissertation

Rice University

9. Leong, Oscar. Phase Retrieval Under a Generative Prior.

Degree: MA, Engineering, 2019, Rice University

URL: http://hdl.handle.net/1911/105890

The phase retrieval problem, arising from X-ray crystallography and medical imaging, asks to recover a signal given intensity-only measurements. When the number of measurements is… (more)

Subjects/Keywords: Phase Retrieval; Generative Models; Non-convex Optimization; Deep Learning

APA (6th Edition):

Leong, O. (2019). Phase Retrieval Under a Generative Prior. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/105890

Chicago Manual of Style (16th Edition):

Leong, Oscar. “Phase Retrieval Under a Generative Prior.” 2019. Masters Thesis, Rice University. Accessed April 17, 2021. http://hdl.handle.net/1911/105890.

MLA Handbook (7th Edition):

Leong, Oscar. “Phase Retrieval Under a Generative Prior.” 2019. Web. 17 Apr 2021.

Vancouver:

Leong O. Phase Retrieval Under a Generative Prior. [Internet] [Masters thesis]. Rice University; 2019. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/1911/105890.

Council of Science Editors:

Leong O. Phase Retrieval Under a Generative Prior. [Masters Thesis]. Rice University; 2019. Available from: http://hdl.handle.net/1911/105890

University of Minnesota

10. Kadkhodaie Elyaderani, Mojtaba. A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems.

Degree: PhD, Electrical/Computer Engineering, 2017, University of Minnesota

URL: http://hdl.handle.net/11299/191334

Modern machine learning problems that emerge from real-world applications typically involve estimating high dimensional model parameters, whose number may be of the same order as… (more)

Subjects/Keywords: Alternating Direction Method of Multipliers; Convex Optimization; Group Lasso; Local Convergence Analysis; Low-rank Matrix Factorization; Non-Convex Optimization

APA (6th Edition):

Kadkhodaie Elyaderani, M. (2017). A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/191334

Chicago Manual of Style (16th Edition):

Kadkhodaie Elyaderani, Mojtaba. “A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems.” 2017. Doctoral Dissertation, University of Minnesota. Accessed April 17, 2021. http://hdl.handle.net/11299/191334.

MLA Handbook (7th Edition):

Kadkhodaie Elyaderani, Mojtaba. “A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems.” 2017. Web. 17 Apr 2021.

Vancouver:

Kadkhodaie Elyaderani M. A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems. [Internet] [Doctoral dissertation]. University of Minnesota; 2017. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/11299/191334.

Council of Science Editors:

Kadkhodaie Elyaderani M. A Computational and Statistical Study of Convex and Nonconvex Optimization with Applications to Structured Source Demixing and Matrix Factorization Problems. [Doctoral Dissertation]. University of Minnesota; 2017. Available from: http://hdl.handle.net/11299/191334

University of Texas – Austin

11. Bhojanapalli, Venkata Sesha Pavana Srinadh. Large scale matrix factorization with guarantees: sampling and bi-linearity.

Degree: PhD, Electrical and Computer Engineering, 2015, University of Texas – Austin

URL: http://hdl.handle.net/2152/32832

Low rank matrix factorization is an important step in many high dimensional machine learning algorithms. Traditional algorithms for factorization do not scale well with the… (more)

Subjects/Keywords: Matrix completion; Non-convex optimization; Low rank approximation; Semi-definite optimization; Tensor factorization; Scalable algorithms

APA (6th Edition):

Bhojanapalli, V. S. P. S. (2015). Large scale matrix factorization with guarantees: sampling and bi-linearity. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/32832

Chicago Manual of Style (16th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Doctoral Dissertation, University of Texas – Austin. Accessed April 17, 2021. http://hdl.handle.net/2152/32832.

MLA Handbook (7th Edition):

Bhojanapalli, Venkata Sesha Pavana Srinadh. “Large scale matrix factorization with guarantees: sampling and bi-linearity.” 2015. Web. 17 Apr 2021.

Vancouver:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2015. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2152/32832.

Council of Science Editors:

Bhojanapalli VSPS. Large scale matrix factorization with guarantees: sampling and bi-linearity. [Doctoral Dissertation]. University of Texas – Austin; 2015. Available from: http://hdl.handle.net/2152/32832

University of Illinois – Urbana-Champaign

12. Tsai, Katherine. A non-convex framework for structured non-stationary covariance recovery theory and application.

Degree: MS, Electrical & Computer Engr, 2020, University of Illinois – Urbana-Champaign

URL: http://hdl.handle.net/2142/108461

Flexible, yet interpretable, models for the second-order temporal structure are needed in scientific analyses of high-dimensional data. The thesis develops a structured time-indexed covariance model… (more)

Subjects/Keywords: machine learning; structured learning; non-convex optimization; non-stationary covariance; dynamic functional connectivity

APA (6th Edition):

Tsai, K. (2020). A non-convex framework for structured non-stationary covariance recovery theory and application. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/108461

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Tsai, Katherine. “A non-convex framework for structured non-stationary covariance recovery theory and application.” 2020. Thesis, University of Illinois – Urbana-Champaign. Accessed April 17, 2021. http://hdl.handle.net/2142/108461.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Tsai, Katherine. “A non-convex framework for structured non-stationary covariance recovery theory and application.” 2020. Web. 17 Apr 2021.

Vancouver:

Tsai K. A non-convex framework for structured non-stationary covariance recovery theory and application. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2020. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2142/108461.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Tsai K. A non-convex framework for structured non-stationary covariance recovery theory and application. [Thesis]. University of Illinois – Urbana-Champaign; 2020. Available from: http://hdl.handle.net/2142/108461

Not specified: Masters Thesis or Doctoral Dissertation

Universitat de Valencia

13. Huang, Xiaoge. Non-convex distributed power allocation games in cognitive radio networks.

Degree: 2013, Universitat de Valencia

URL: http://hdl.handle.net/10550/29185

In this thesis, we explore interweave communication systems in cognitive radio networks where the overall objective is to maximize the sum-rate of each cognitive radio… (more)

Subjects/Keywords: Quasi-Nash Equilibrium; Non-cooperative Game; Non-convex Optimization; Cognitive Radio Networks

APA (6th Edition):

Huang, X. (2013). Non-convex distributed power allocation games in cognitive radio networks. (Doctoral Dissertation). Universitat de Valencia. Retrieved from http://hdl.handle.net/10550/29185

Chicago Manual of Style (16th Edition):

Huang, Xiaoge. “Non-convex distributed power allocation games in cognitive radio networks.” 2013. Doctoral Dissertation, Universitat de Valencia. Accessed April 17, 2021. http://hdl.handle.net/10550/29185.

MLA Handbook (7th Edition):

Huang, Xiaoge. “Non-convex distributed power allocation games in cognitive radio networks.” 2013. Web. 17 Apr 2021.

Vancouver:

Huang X. Non-convex distributed power allocation games in cognitive radio networks. [Internet] [Doctoral dissertation]. Universitat de Valencia; 2013. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/10550/29185.

Council of Science Editors:

Huang X. Non-convex distributed power allocation games in cognitive radio networks. [Doctoral Dissertation]. Universitat de Valencia; 2013. Available from: http://hdl.handle.net/10550/29185

14. Lazare, Arnaud. Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières.

Degree: Docteur es, Mathématiques appliquées, 2019, Université Paris-Saclay (ComUE)

URL: http://www.theses.fr/2019SACLY011

In this thesis, we study polynomial programs, that is, optimization problems whose objective function and/or constraints involve… (more)

Subjects/Keywords: Optimisation non convexe; Optimisation polynomiale; Reformulation quadratique convexe; Optimisation discrète; Résolution exacte; Non convex Optimization; Polynomial optimization; Quadratic convex reformulation; Discrete Optimization; Exact solution

APA (6th Edition):

Lazare, A. (2019). Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières. (Doctoral Dissertation). Université Paris-Saclay (ComUE). Retrieved from http://www.theses.fr/2019SACLY011

Chicago Manual of Style (16th Edition):

Lazare, Arnaud. “Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières.” 2019. Doctoral Dissertation, Université Paris-Saclay (ComUE). Accessed April 17, 2021. http://www.theses.fr/2019SACLY011.

MLA Handbook (7th Edition):

Lazare, Arnaud. “Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières.” 2019. Web. 17 Apr 2021.

Vancouver:

Lazare A. Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières. [Internet] [Doctoral dissertation]. Université Paris-Saclay (ComUE); 2019. [cited 2021 Apr 17]. Available from: http://www.theses.fr/2019SACLY011.

Council of Science Editors:

Lazare A. Global optimization of polynomial programs with mixed-integer variables : Optimisation globale de programmes polynomiaux en variables mixtes-entières. [Doctoral Dissertation]. Université Paris-Saclay (ComUE); 2019. Available from: http://www.theses.fr/2019SACLY011

Australian National University

15. Deng, Huizhong. Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion.

Degree: 2017, Australian National University

URL: http://hdl.handle.net/1885/113634

Non-rigid Structure-from-Motion (NRSfM) is an active research field in computer vision. The task of NRSfM is to simultaneously recover camera motion and 3D structure from… (more)

Subjects/Keywords: Non-rigid Structure-from-Motion; sparse; dense; reconstructability; shape clustering; spatial-temporal; convex optimization; convex optimisation; simple

APA (6th Edition):

Deng, H. (2017). Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion. (Thesis). Australian National University. Retrieved from http://hdl.handle.net/1885/113634

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Deng, Huizhong. “Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion.” 2017. Thesis, Australian National University. Accessed April 17, 2021. http://hdl.handle.net/1885/113634.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Deng, Huizhong. “Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion.” 2017. Web. 17 Apr 2021.

Vancouver:

Deng H. Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion. [Internet] [Thesis]. Australian National University; 2017. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/1885/113634.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Deng H. Shape Clustering and Spatial-temporal Constraint for Non-rigid Structure from Motion. [Thesis]. Australian National University; 2017. Available from: http://hdl.handle.net/1885/113634

Not specified: Masters Thesis or Doctoral Dissertation

University of Texas – Austin

16. -1859-7314. Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types.

Degree: PhD, Electrical and Computer Engineering, 2020, University of Texas – Austin

URL: http://dx.doi.org/10.26153/tsw/8154

Electricity markets are particularly complex because they must accommodate the underlying physics that govern the electric power system. These physics present non-convexities in the social… (more)

Subjects/Keywords: Electricity market; Optimization; Non-convex; AC transmission network; Primary frequency response; Marginal pricing; Convex hull pricing

APA (6th Edition):

-1859-7314. (2020). Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://dx.doi.org/10.26153/tsw/8154

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Chicago Manual of Style (16th Edition):

-1859-7314. “Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types.” 2020. Doctoral Dissertation, University of Texas – Austin. Accessed April 17, 2021. http://dx.doi.org/10.26153/tsw/8154.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

MLA Handbook (7th Edition):

-1859-7314. “Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types.” 2020. Web. 17 Apr 2021.

Note: this citation may be lacking information needed for this citation format:

Author name may be incomplete

Vancouver:

-1859-7314. Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2020. [cited 2021 Apr 17]. Available from: http://dx.doi.org/10.26153/tsw/8154.

Author name may be incomplete

Council of Science Editors:

-1859-7314. Non-convex myopic electricity markets : the AC transmission network and interdependent reserve types. [Doctoral Dissertation]. University of Texas – Austin; 2020. Available from: http://dx.doi.org/10.26153/tsw/8154

Author name may be incomplete

University of Minnesota

17. Das, Puja. Online convex optimization and its application to online portfolio selection.

Degree: PhD, Computer science, 2014, University of Minnesota

URL: http://hdl.handle.net/11299/163662

Today, whether we consider the data from the internet, consumers, financial markets, a common feature emerges: all of them involve huge amounts of dynamic data… (more)

Subjects/Keywords: Alternating direction method of multipliers; Constrained optimization; Meta optimization; Non-smooth composite objective; Online convex optimization; Online portfolio selection

APA (6th Edition):

Das, P. (2014). Online convex optimization and its application to online portfolio selection. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/163662

Chicago Manual of Style (16^{th} Edition):

Das, Puja. “Online convex optimization and its application to online portfolio selection.” 2014. Doctoral Dissertation, University of Minnesota. Accessed April 17, 2021. http://hdl.handle.net/11299/163662.

MLA Handbook (7^{th} Edition):

Das, Puja. “Online convex optimization and its application to online portfolio selection.” 2014. Web. 17 Apr 2021.

Vancouver:

Das P. Online convex optimization and its application to online portfolio selection. [Internet] [Doctoral dissertation]. University of Minnesota; 2014. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/11299/163662.

Council of Science Editors:

Das P. Online convex optimization and its application to online portfolio selection. [Doctoral Dissertation]. University of Minnesota; 2014. Available from: http://hdl.handle.net/11299/163662

18.
Hess, Roxana.
Some approximation schemes in polynomial *optimization* : Quelques schémas d'approximation en optimisation polynomiale.

Degree: Docteur es, Automatique, 2017, Université Toulouse III – Paul Sabatier

URL: http://www.theses.fr/2017TOU30129

► This thesis is devoted to the study of the moment-sums-of-squares hierarchy, a family of semidefinite programming problems in polynomial optimization, commonly known as the Lasserre hierarchy.… (more)

Subjects/Keywords: Optimisation non-convexe; Optimisation non-lisse; Approximations polynomiales; Optimisation semi-algébrique; Optimisation semi-définie positive; Non-convex optimization; Non-smooth optimization; Polynomial approximations; Semialgebraic optimization; Semidefinite programming

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Hess, R. (2017). Some approximation schemes in polynomial optimization : Quelques schémas d'approximation en optimisation polynomiale. (Doctoral Dissertation). Université Toulouse III – Paul Sabatier. Retrieved from http://www.theses.fr/2017TOU30129

Chicago Manual of Style (16^{th} Edition):

Hess, Roxana. “Some approximation schemes in polynomial optimization : Quelques schémas d'approximation en optimisation polynomiale.” 2017. Doctoral Dissertation, Université Toulouse III – Paul Sabatier. Accessed April 17, 2021. http://www.theses.fr/2017TOU30129.

MLA Handbook (7^{th} Edition):

Hess, Roxana. “Some approximation schemes in polynomial optimization : Quelques schémas d'approximation en optimisation polynomiale.” 2017. Web. 17 Apr 2021.

Vancouver:

Hess R. Some approximation schemes in polynomial optimization : Quelques schémas d'approximation en optimisation polynomiale. [Internet] [Doctoral dissertation]. Université Toulouse III – Paul Sabatier; 2017. [cited 2021 Apr 17]. Available from: http://www.theses.fr/2017TOU30129.

Council of Science Editors:

Hess R. Some approximation schemes in polynomial optimization : Quelques schémas d'approximation en optimisation polynomiale. [Doctoral Dissertation]. Université Toulouse III – Paul Sabatier; 2017. Available from: http://www.theses.fr/2017TOU30129

University of California – Irvine

19.
Janzamin, Majid.
*Non*-*convex* *Optimization* in Machine Learning: Provable Guarantees Using Tensor Methods.

Degree: Electrical and Computer Engineering, 2016, University of California – Irvine

URL: http://www.escholarship.org/uc/item/7p90p57n

► In the last decade, machine learning algorithms have been substantially developed and they have gained tremendous empirical success. But, there is limited theoretical understanding about…
(more)

Subjects/Keywords: Computer science; Latent Representations; Machine Learning; Neural Networks; Non-convex Optimization; Tensor Decomposition

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Janzamin, M. (2016). Non-convex Optimization in Machine Learning: Provable Guarantees Using Tensor Methods. (Thesis). University of California – Irvine. Retrieved from http://www.escholarship.org/uc/item/7p90p57n

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Janzamin, Majid. “Non-convex Optimization in Machine Learning: Provable Guarantees Using Tensor Methods.” 2016. Thesis, University of California – Irvine. Accessed April 17, 2021. http://www.escholarship.org/uc/item/7p90p57n.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Janzamin, Majid. “Non-convex Optimization in Machine Learning: Provable Guarantees Using Tensor Methods.” 2016. Web. 17 Apr 2021.

Vancouver:

Janzamin M. Non-convex Optimization in Machine Learning: Provable Guarantees Using Tensor Methods. [Internet] [Thesis]. University of California – Irvine; 2016. [cited 2021 Apr 17]. Available from: http://www.escholarship.org/uc/item/7p90p57n.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Janzamin M. Non-convex Optimization in Machine Learning: Provable Guarantees Using Tensor Methods. [Thesis]. University of California – Irvine; 2016. Available from: http://www.escholarship.org/uc/item/7p90p57n

Not specified: Masters Thesis or Doctoral Dissertation

University of Colorado

20.
Gronski, Jessica.
*Non*-*Convex* *Optimization* and Applications to Bilinear Programming and Super-Resolution Imaging.

Degree: PhD, 2019, University of Colorado

URL: https://scholar.colorado.edu/appm_gradetds/154

► Bilinear programs and Phase Retrieval are two instances of nonconvex problems that arise in engineering and physical applications, and both occur with their fundamental…
(more)

Subjects/Keywords: bilinear programming; non-convex optimization; quadratic programming; super-resolution imaging; Applied Mathematics; Computer Sciences; Optics

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Gronski, J. (2019). Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging. (Doctoral Dissertation). University of Colorado. Retrieved from https://scholar.colorado.edu/appm_gradetds/154

Chicago Manual of Style (16^{th} Edition):

Gronski, Jessica. “Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging.” 2019. Doctoral Dissertation, University of Colorado. Accessed April 17, 2021. https://scholar.colorado.edu/appm_gradetds/154.

MLA Handbook (7^{th} Edition):

Gronski, Jessica. “Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging.” 2019. Web. 17 Apr 2021.

Vancouver:

Gronski J. Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging. [Internet] [Doctoral dissertation]. University of Colorado; 2019. [cited 2021 Apr 17]. Available from: https://scholar.colorado.edu/appm_gradetds/154.

Council of Science Editors:

Gronski J. Non-Convex Optimization and Applications to Bilinear Programming and Super-Resolution Imaging. [Doctoral Dissertation]. University of Colorado; 2019. Available from: https://scholar.colorado.edu/appm_gradetds/154

Delft University of Technology

21. Cetin, H. (author). Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective.

Degree: 2012, Delft University of Technology

URL: http://resolver.tudelft.nl/uuid:a54bccdc-7ec4-44ba-aba9-088d63c716c1

► The demand for wireless services and the need for high data-rates are growing rapidly. Future generation networks are expected to provide high data-rates in the…
(more)

Subjects/Keywords: Spectrum Sharing; 3G and 4G networking; Interference Mitigation; Game Theory; Non-convex Optimization; Beamforming

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Cetin, H. (2012). Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:a54bccdc-7ec4-44ba-aba9-088d63c716c1

Chicago Manual of Style (16^{th} Edition):

Cetin, H. “Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective.” 2012. Masters Thesis, Delft University of Technology. Accessed April 17, 2021. http://resolver.tudelft.nl/uuid:a54bccdc-7ec4-44ba-aba9-088d63c716c1.

MLA Handbook (7^{th} Edition):

Cetin, H. “Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective.” 2012. Web. 17 Apr 2021.

Vancouver:

Cetin H. Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective. [Internet] [Masters thesis]. Delft University of Technology; 2012. [cited 2021 Apr 17]. Available from: http://resolver.tudelft.nl/uuid:a54bccdc-7ec4-44ba-aba9-088d63c716c1.

Council of Science Editors:

Cetin H. Spectrum Sharing among Cellular Operators from a Game Theoretical Cognitive and Cooperative Networking Perspective. [Masters Thesis]. Delft University of Technology; 2012. Available from: http://resolver.tudelft.nl/uuid:a54bccdc-7ec4-44ba-aba9-088d63c716c1

22.
Ma, Tengyu.
*Non*-*convex* *Optimization* for Machine Learning: Design, Analysis, and Understanding.

Degree: PhD, 2017, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01th83m199d

► *Non*-*convex* *optimization* is ubiquitous in modern machine learning: recent breakthroughs in deep learning require optimizing *non*-*convex* training objective functions; problems that admit accurate *convex* relaxation…
(more)

Subjects/Keywords: machine learning; non-convex optimization

…I. Analyzing Local Improvement Algorithms for *Non*-*convex* *Optimization*… …Chapter 1. Introduction: *Non*-*convex* *optimization* algorithms have been widely used in… …*convex* *optimization* algorithms in a principled way? The thesis aims to put *non*-*convex* *optimization* on a more solid theoretical footing. We design and analyze *non*-*convex* *optimization*… …Analyzing Local Improvement Algorithms for *Non*-*convex* *Optimization*: Finding a global minimizer of…

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Ma, T. (2017). Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01th83m199d

Chicago Manual of Style (16^{th} Edition):

Ma, Tengyu. “Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding.” 2017. Doctoral Dissertation, Princeton University. Accessed April 17, 2021. http://arks.princeton.edu/ark:/88435/dsp01th83m199d.

MLA Handbook (7^{th} Edition):

Ma, Tengyu. “Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding.” 2017. Web. 17 Apr 2021.

Vancouver:

Ma T. Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding. [Internet] [Doctoral dissertation]. Princeton University; 2017. [cited 2021 Apr 17]. Available from: http://arks.princeton.edu/ark:/88435/dsp01th83m199d.

Council of Science Editors:

Ma T. Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding. [Doctoral Dissertation]. Princeton University; 2017. Available from: http://arks.princeton.edu/ark:/88435/dsp01th83m199d

Princeton University

23.
Wang, Kaizheng.
Latent Variable Models: Spectral Methods and *Non*-*convex* *Optimization*.

Degree: PhD, 2020, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01rb68xf782

► Latent variable models lay the statistical foundation for data science problems with unstructured, incomplete and heterogeneous information. The significant challenges in computation and memory call…
(more)

Subjects/Keywords: clustering; dimension reduction; latent variable models; network analysis; non-convex optimization; spectral methods

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Wang, K. (2020). Latent Variable Models: Spectral Methods and Non-convex Optimization. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01rb68xf782

Chicago Manual of Style (16^{th} Edition):

Wang, Kaizheng. “Latent Variable Models: Spectral Methods and Non-convex Optimization.” 2020. Doctoral Dissertation, Princeton University. Accessed April 17, 2021. http://arks.princeton.edu/ark:/88435/dsp01rb68xf782.

MLA Handbook (7^{th} Edition):

Wang, Kaizheng. “Latent Variable Models: Spectral Methods and Non-convex Optimization.” 2020. Web. 17 Apr 2021.

Vancouver:

Wang K. Latent Variable Models: Spectral Methods and Non-convex Optimization. [Internet] [Doctoral dissertation]. Princeton University; 2020. [cited 2021 Apr 17]. Available from: http://arks.princeton.edu/ark:/88435/dsp01rb68xf782.

Council of Science Editors:

Wang K. Latent Variable Models: Spectral Methods and Non-convex Optimization. [Doctoral Dissertation]. Princeton University; 2020. Available from: http://arks.princeton.edu/ark:/88435/dsp01rb68xf782

24.
Yi, Xinyang.
Learning with latent structures, robustness and *non*-linearity : *non*-*convex* approaches.

Degree: PhD, Electrical and Computer engineering, 2016, University of Texas – Austin

URL: http://hdl.handle.net/2152/46474

► *Non*-*convex* *optimization* based algorithms are ubiquitous in machine learning and statistical estimation, especially in dealing with complex models that are noisy, *non*-linear or contain latent…
(more)

Subjects/Keywords: Statistical machine learning; High dimensional statistics; Non-convex optimization; Mixed linear regression

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Yi, X. (2016). Learning with latent structures, robustness and non-linearity : non-convex approaches. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/46474

Chicago Manual of Style (16^{th} Edition):

Yi, Xinyang. “Learning with latent structures, robustness and non-linearity : non-convex approaches.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed April 17, 2021. http://hdl.handle.net/2152/46474.

MLA Handbook (7^{th} Edition):

Yi, Xinyang. “Learning with latent structures, robustness and non-linearity : non-convex approaches.” 2016. Web. 17 Apr 2021.

Vancouver:

Yi X. Learning with latent structures, robustness and non-linearity : non-convex approaches. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2152/46474.

Council of Science Editors:

Yi X. Learning with latent structures, robustness and non-linearity : non-convex approaches. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/46474

25.
Mierswa, Ingo.
*Non*-*convex* and multi-objective *optimization* in data mining.

Degree: 2009, Technische Universität Dortmund

URL: http://hdl.handle.net/2003/26104

Subjects/Keywords: Data mining; Multi-objective optimization; Non-convex optimization; 004

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Mierswa, I. (2009). Non-convex and multi-objective optimization in data mining. (Thesis). Technische Universität Dortmund. Retrieved from http://hdl.handle.net/2003/26104

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Mierswa, Ingo. “Non-convex and multi-objective optimization in data mining.” 2009. Thesis, Technische Universität Dortmund. Accessed April 17, 2021. http://hdl.handle.net/2003/26104.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Mierswa, Ingo. “Non-convex and multi-objective optimization in data mining.” 2009. Web. 17 Apr 2021.

Vancouver:

Mierswa I. Non-convex and multi-objective optimization in data mining. [Internet] [Thesis]. Technische Universität Dortmund; 2009. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2003/26104.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Mierswa I. Non-convex and multi-objective optimization in data mining. [Thesis]. Technische Universität Dortmund; 2009. Available from: http://hdl.handle.net/2003/26104

Not specified: Masters Thesis or Doctoral Dissertation

26.
Mierswa, Ingo.
*Non*-*convex* and multi-objective *optimization* in data mining.

Degree: 2009, Technische Universität Dortmund

URL: http://dx.doi.org/10.17877/DE290R-12761

Subjects/Keywords: Data mining; Multi-objective optimization; Non-convex optimization; 004

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Mierswa, I. (2009). Non-convex and multi-objective optimization in data mining. (Doctoral Dissertation). Technische Universität Dortmund. Retrieved from http://dx.doi.org/10.17877/DE290R-12761

Chicago Manual of Style (16^{th} Edition):

Mierswa, Ingo. “Non-convex and multi-objective optimization in data mining.” 2009. Doctoral Dissertation, Technische Universität Dortmund. Accessed April 17, 2021. http://dx.doi.org/10.17877/DE290R-12761.

MLA Handbook (7^{th} Edition):

Mierswa, Ingo. “Non-convex and multi-objective optimization in data mining.” 2009. Web. 17 Apr 2021.

Vancouver:

Mierswa I. Non-convex and multi-objective optimization in data mining. [Internet] [Doctoral dissertation]. Technische Universität Dortmund; 2009. [cited 2021 Apr 17]. Available from: http://dx.doi.org/10.17877/DE290R-12761.

Council of Science Editors:

Mierswa I. Non-convex and multi-objective optimization in data mining. [Doctoral Dissertation]. Technische Universität Dortmund; 2009. Available from: http://dx.doi.org/10.17877/DE290R-12761

Université Catholique de Louvain

27.
Degraux, Kévin.
Methods for solving regularized inverse problems : from *non*-Euclidean fidelities to computational imaging applications.

Degree: 2017, Université Catholique de Louvain

URL: http://hdl.handle.net/2078.1/191756

► Many branches of science and engineering are concerned with the problem of recording signals from physical phenomena. However, an acquisition system does not always directly… (more)

Subjects/Keywords: Signal Processing; Sparsity; Non-smooth Optimization; Inverse Problems; Convex Optimization; Compressed Sensing; Computational Imaging; Dictionary Learning; Hyperspectral; Multispectral

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Degraux, K. (2017). Methods for solving regularized inverse problems : from non-Euclidean fidelities to computational imaging applications. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/191756

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Degraux, Kévin. “Methods for solving regularized inverse problems : from non-Euclidean fidelities to computational imaging applications.” 2017. Thesis, Université Catholique de Louvain. Accessed April 17, 2021. http://hdl.handle.net/2078.1/191756.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Degraux, Kévin. “Methods for solving regularized inverse problems : from non-Euclidean fidelities to computational imaging applications.” 2017. Web. 17 Apr 2021.

Vancouver:

Degraux K. Methods for solving regularized inverse problems : from non-Euclidean fidelities to computational imaging applications. [Internet] [Thesis]. Université Catholique de Louvain; 2017. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/2078.1/191756.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Degraux K. Methods for solving regularized inverse problems : from non-Euclidean fidelities to computational imaging applications. [Thesis]. Université Catholique de Louvain; 2017. Available from: http://hdl.handle.net/2078.1/191756

Not specified: Masters Thesis or Doctoral Dissertation

Iowa State University

28.
Ma, Xu.
Distributed approaches for solving *non*-*convex* optimizations under strong duality.

Degree: 2016, Iowa State University

URL: https://lib.dr.iastate.edu/etd/15769

► This dissertation studies *non*-*convex* optimizations under the strong duality condition. In general, *non*-*convex* problems are *non*-deterministic polynomial-time (NP) hard and hence are difficult to solve.…
(more)

Subjects/Keywords: distributed approaches; non-convex optimization; optimal power flow (OPF); optimization dynamics; primal-dual algorithm; QCQP; Electrical and Electronics

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Ma, X. (2016). Distributed approaches for solving non-convex optimizations under strong duality. (Thesis). Iowa State University. Retrieved from https://lib.dr.iastate.edu/etd/15769

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Ma, Xu. “Distributed approaches for solving non-convex optimizations under strong duality.” 2016. Thesis, Iowa State University. Accessed April 17, 2021. https://lib.dr.iastate.edu/etd/15769.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Ma, Xu. “Distributed approaches for solving non-convex optimizations under strong duality.” 2016. Web. 17 Apr 2021.

Vancouver:

Ma X. Distributed approaches for solving non-convex optimizations under strong duality. [Internet] [Thesis]. Iowa State University; 2016. [cited 2021 Apr 17]. Available from: https://lib.dr.iastate.edu/etd/15769.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ma X. Distributed approaches for solving non-convex optimizations under strong duality. [Thesis]. Iowa State University; 2016. Available from: https://lib.dr.iastate.edu/etd/15769

Not specified: Masters Thesis or Doctoral Dissertation

University of Minnesota

29.
Wang, Gang.
*Non*-*Convex* Phase Retrieval Algorithms and Performance Analysis.

Degree: PhD, Electrical Engineering, 2018, University of Minnesota

URL: http://hdl.handle.net/11299/198408

► High-dimensional signal estimation plays a fundamental role in various science and engineering applications, including optical and medical imaging, wireless communications, and power system monitoring. The…
(more)

Subjects/Keywords: Amplitude flow; Information-theoretic limit; Linear convergence to global optimum; Non-convex optimization; Sparsity; Stochastic optimization

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Wang, G. (2018). Non-Convex Phase Retrieval Algorithms and Performance Analysis. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/198408

Chicago Manual of Style (16^{th} Edition):

Wang, Gang. “Non-Convex Phase Retrieval Algorithms and Performance Analysis.” 2018. Doctoral Dissertation, University of Minnesota. Accessed April 17, 2021. http://hdl.handle.net/11299/198408.

MLA Handbook (7^{th} Edition):

Wang, Gang. “Non-Convex Phase Retrieval Algorithms and Performance Analysis.” 2018. Web. 17 Apr 2021.

Vancouver:

Wang G. Non-Convex Phase Retrieval Algorithms and Performance Analysis. [Internet] [Doctoral dissertation]. University of Minnesota; 2018. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/11299/198408.

Council of Science Editors:

Wang G. Non-Convex Phase Retrieval Algorithms and Performance Analysis. [Doctoral Dissertation]. University of Minnesota; 2018. Available from: http://hdl.handle.net/11299/198408

University of Washington

30. Raut, Prasanna Sanjay. Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints.

Degree: 2021, University of Washington

URL: http://hdl.handle.net/1773/46846

► In this thesis, we consider online continuous DR-submodular maximization with linear stochastic long-term constraints. Compared to the prior work on online submodular maximization, our…
(more)

Subjects/Keywords: regret analysis; non-convex optimization; online optimization; submodular maximization; Applied mathematics; Computer science; Operations research; Mechanical engineering

Record Details Similar Records

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Raut, P. S. (2021). Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints. (Thesis). University of Washington. Retrieved from http://hdl.handle.net/1773/46846

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Raut, Prasanna Sanjay. “Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints.” 2021. Thesis, University of Washington. Accessed April 17, 2021. http://hdl.handle.net/1773/46846.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Raut, Prasanna Sanjay. “Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints.” 2021. Web. 17 Apr 2021.

Vancouver:

Raut PS. Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints. [Internet] [Thesis]. University of Washington; 2021. [cited 2021 Apr 17]. Available from: http://hdl.handle.net/1773/46846.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Raut PS. Online Decision Making: DR-Submodular Objectives and Stochastic Linear Constraints. [Thesis]. University of Washington; 2021. Available from: http://hdl.handle.net/1773/46846

Not specified: Masters Thesis or Doctoral Dissertation