
You searched for `subject:(Convex Optimization)`. Showing records 1 – 30 of 520 total matches.


Search Limiters

Dates

- 2017 – 2021 (208)
- 2012 – 2016 (241)
- 2007 – 2011 (87)
- 2002 – 2006 (13)

Universities

- University of Waterloo (25)
- National University of Singapore (16)
- University of California – Berkeley (16)
- Georgia Tech (15)
- University of Minnesota (14)
- University of Texas – Austin (14)
- University of Illinois – Urbana-Champaign (13)
- Delft University of Technology (12)
- Penn State University (11)
- University of Michigan (10)
- University of Washington (10)

Degrees

- PhD (159)
- Docteur ès (72)
- MS (16)
- Master (10)


1. Uthayakumar, R. Study on convergence of *optimization* problems;.

Degree: 2014, INFLIBNET

URL: http://shodhganga.inflibnet.ac.in/handle/10603/17964

► In this thesis, various notions of convergence of sequence of sets and functions and their applications in the convergence of the optimal values under the…

Subjects/Keywords: Convergence; Convex; Functions; Non-convex; Optimization; Sets

APA (6th Edition):

Uthayakumar, R. (2014). Study on convergence of optimization problems;. (Thesis). INFLIBNET. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/17964

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Thesis, INFLIBNET. Accessed March 04, 2021. http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Web. 04 Mar 2021.

Vancouver:

Uthayakumar R. Study on convergence of optimization problems;. [Internet] [Thesis]. INFLIBNET; 2014. [cited 2021 Mar 04]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Uthayakumar R. Study on convergence of optimization problems;. [Thesis]. INFLIBNET; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964

Not specified: Masters Thesis or Doctoral Dissertation

NSYSU

2. Zhang, Shu-Bin. Study on Digital Filter Design and Coefficient Quantization.

Degree: Master, Communications Engineering, 2011, NSYSU

URL: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237

► In this thesis, the basic theory is *convex* *optimization* theory [1]. And we study the problem about how to transfer to *convex* *optimization* problem from the…

Subjects/Keywords: Filter; Optimization; Convex; Bits; Quantization

APA (6th Edition):

Zhang, S. (2011). Study on Digital Filter Design and Coefficient Quantization. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Shu-Bin. “Study on Digital Filter Design and Coefficient Quantization.” 2011. Thesis, NSYSU. Accessed March 04, 2021. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Shu-Bin. “Study on Digital Filter Design and Coefficient Quantization.” 2011. Web. 04 Mar 2021.

Vancouver:

Zhang S. Study on Digital Filter Design and Coefficient Quantization. [Internet] [Thesis]. NSYSU; 2011. [cited 2021 Mar 04]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang S. Study on Digital Filter Design and Coefficient Quantization. [Thesis]. NSYSU; 2011. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237

Not specified: Masters Thesis or Doctoral Dissertation

Victoria University of Wellington

3. Jellyman, Dayle Raymond. *Convex* *Optimization* for Distributed Acoustic Beamforming.

Degree: 2017, Victoria University of Wellington

URL: http://hdl.handle.net/10063/6650

► Beamforming filter *optimization* can be performed over a distributed wireless sensor network, but the output calculation remains either centralized or linked in time to the…

Subjects/Keywords: Distributed; Beamforming; Convex optimization

APA (6th Edition):

Jellyman, D. R. (2017). Convex Optimization for Distributed Acoustic Beamforming. (Masters Thesis). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/6650

Chicago Manual of Style (16th Edition):

Jellyman, Dayle Raymond. “Convex Optimization for Distributed Acoustic Beamforming.” 2017. Masters Thesis, Victoria University of Wellington. Accessed March 04, 2021. http://hdl.handle.net/10063/6650.

MLA Handbook (7th Edition):

Jellyman, Dayle Raymond. “Convex Optimization for Distributed Acoustic Beamforming.” 2017. Web. 04 Mar 2021.

Vancouver:

Jellyman DR. Convex Optimization for Distributed Acoustic Beamforming. [Internet] [Masters thesis]. Victoria University of Wellington; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10063/6650.

Council of Science Editors:

Jellyman DR. Convex Optimization for Distributed Acoustic Beamforming. [Masters Thesis]. Victoria University of Wellington; 2017. Available from: http://hdl.handle.net/10063/6650

Université Catholique de Louvain

4. Martin, Benoît. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through *convex* *optimization*.

Degree: 2018, Université Catholique de Louvain

URL: http://hdl.handle.net/2078.1/214246

► Autonomous microgrid planning requires to consider both investments in network and generation assets as there is no connection to another power system. In this problem,…

Subjects/Keywords: Planning; Microgrid; Convex optimization

APA (6th Edition):

Martin, B. (2018). Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/214246

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Martin, Benoît. “Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization.” 2018. Thesis, Université Catholique de Louvain. Accessed March 04, 2021. http://hdl.handle.net/2078.1/214246.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Martin, Benoît. “Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization.” 2018. Web. 04 Mar 2021.

Vancouver:

Martin B. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. [Internet] [Thesis]. Université Catholique de Louvain; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2078.1/214246.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Martin B. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. [Thesis]. Université Catholique de Louvain; 2018. Available from: http://hdl.handle.net/2078.1/214246

Not specified: Masters Thesis or Doctoral Dissertation

5. Cosentino, Alessandro. Quantum State Local Distinguishability via *Convex* *Optimization*.

Degree: 2015, University of Waterloo

URL: http://hdl.handle.net/10012/9572

► Entanglement and nonlocality play a fundamental role in quantum computing. To understand the interplay between these phenomena, researchers have considered the model of local operations…

Subjects/Keywords: quantum information; convex optimization

APA (6th Edition):

Cosentino, A. (2015). Quantum State Local Distinguishability via Convex Optimization. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/9572

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cosentino, Alessandro. “Quantum State Local Distinguishability via Convex Optimization.” 2015. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/9572.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cosentino, Alessandro. “Quantum State Local Distinguishability via Convex Optimization.” 2015. Web. 04 Mar 2021.

Vancouver:

Cosentino A. Quantum State Local Distinguishability via Convex Optimization. [Internet] [Thesis]. University of Waterloo; 2015. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/9572.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cosentino A. Quantum State Local Distinguishability via Convex Optimization. [Thesis]. University of Waterloo; 2015. Available from: http://hdl.handle.net/10012/9572

Not specified: Masters Thesis or Doctoral Dissertation

University of Texas – Austin

6. Berning, Andrew Walter, Jr. Verification of successive convexification algorithm.

Degree: MS in Engineering, Aerospace engineering, 2016, University of Texas – Austin

URL: http://hdl.handle.net/2152/41579

► In this report, I describe a technique which allows a non-*convex* optimal control problem to be expressed and solved in a *convex* manner. I then…

Subjects/Keywords: Convex; Convexification; Optimization; Verification

APA (6th Edition):

Berning, Andrew Walter, Jr. (2016). Verification of successive convexification algorithm. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/41579

Chicago Manual of Style (16th Edition):

Berning, Andrew Walter, Jr. “Verification of successive convexification algorithm.” 2016. Masters Thesis, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/41579.

MLA Handbook (7th Edition):

Berning, Andrew Walter, Jr. “Verification of successive convexification algorithm.” 2016. Web. 04 Mar 2021.

Vancouver:

Berning, Andrew Walter J. Verification of successive convexification algorithm. [Internet] [Masters thesis]. University of Texas – Austin; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/41579.

Council of Science Editors:

Berning, Andrew Walter J. Verification of successive convexification algorithm. [Masters Thesis]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/41579

University of Texas – Austin

7. Park, Dohyung. Efficient non-*convex* algorithms for large-scale learning problems.

Degree: PhD, Electrical and Computer engineering, 2016, University of Texas – Austin

URL: http://hdl.handle.net/2152/46581

► The emergence of modern large-scale datasets has led to a huge interest in the problem of learning hidden complex structures. Not only can models from…

Subjects/Keywords: Machine learning; Non-convex optimization

APA (6th Edition):

Park, D. (2016). Efficient non-convex algorithms for large-scale learning problems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/46581

Chicago Manual of Style (16th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/46581.

MLA Handbook (7th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Web. 04 Mar 2021.

Vancouver:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/46581.

Council of Science Editors:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/46581

University of Texas – Austin

8. Wang, Ye, Ph. D. Novel *convex* *optimization* techniques for circuit analysis and synthesis.

Degree: PhD, Electrical and Computer Engineering, 2018, University of Texas – Austin

URL: http://hdl.handle.net/2152/67661

► Technology scaling brings about the need for computationally efficient methods for circuit analysis, *optimization*, and synthesis. *Convex* *optimization* is a special class of mathematical *optimization*…

Subjects/Keywords: Convex optimization; EDA problems

APA (6th Edition):

Wang, Ye, Ph. D. (2018). Novel convex optimization techniques for circuit analysis and synthesis. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/67661

Chicago Manual of Style (16th Edition):

Wang, Ye, Ph D. “Novel convex optimization techniques for circuit analysis and synthesis.” 2018. Doctoral Dissertation, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/67661.

MLA Handbook (7th Edition):

Wang, Ye, Ph D. “Novel convex optimization techniques for circuit analysis and synthesis.” 2018. Web. 04 Mar 2021.

Vancouver:

Wang, Ye PD. Novel convex optimization techniques for circuit analysis and synthesis. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/67661.

Council of Science Editors:

Wang, Ye PD. Novel convex optimization techniques for circuit analysis and synthesis. [Doctoral Dissertation]. University of Texas – Austin; 2018. Available from: http://hdl.handle.net/2152/67661

Rutgers University

9. Yao, Wang, 1985-. Approximate versions of the alternating direction method of multipliers.

Degree: PhD, Operations Research, 2016, Rutgers University

URL: https://rucore.libraries.rutgers.edu/rutgers-lib/51517/

► *Convex* *optimization* is at the core of many of today's analysis tools for large datasets, and in particular machine learning methods. This thesis will develop…

Subjects/Keywords: Mathematical optimization; Convex functions

APA (6th Edition):

Yao, Wang, 1985-. (2016). Approximate versions of the alternating direction method of multipliers. (Doctoral Dissertation). Rutgers University. Retrieved from https://rucore.libraries.rutgers.edu/rutgers-lib/51517/

Chicago Manual of Style (16th Edition):

Yao, Wang, 1985-. “Approximate versions of the alternating direction method of multipliers.” 2016. Doctoral Dissertation, Rutgers University. Accessed March 04, 2021. https://rucore.libraries.rutgers.edu/rutgers-lib/51517/.

MLA Handbook (7th Edition):

Yao, Wang, 1985-. “Approximate versions of the alternating direction method of multipliers.” 2016. Web. 04 Mar 2021.

Vancouver:

Yao, Wang 1. Approximate versions of the alternating direction method of multipliers. [Internet] [Doctoral dissertation]. Rutgers University; 2016. [cited 2021 Mar 04]. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/51517/.

Council of Science Editors:

Yao, Wang 1. Approximate versions of the alternating direction method of multipliers. [Doctoral Dissertation]. Rutgers University; 2016. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/51517/

Princeton University

10. Bullins, Brian Anderson. Efficient Higher-Order *Optimization* for Machine Learning.

Degree: PhD, 2019, Princeton University

URL: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

► In recent years, stochastic gradient descent (SGD) has taken center stage for training large-scale models in machine learning. Although some higher-order methods have improved iteration…

Subjects/Keywords: convex optimization; higher-order; machine learning; non-convex optimization; second-order

APA (6th Edition):

Bullins, B. A. (2019). Efficient Higher-Order Optimization for Machine Learning. (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

Chicago Manual of Style (16th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning.” 2019. Doctoral Dissertation, Princeton University. Accessed March 04, 2021. http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

MLA Handbook (7th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning.” 2019. Web. 04 Mar 2021.

Vancouver:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning. [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2021 Mar 04]. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

Council of Science Editors:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning. [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

Penn State University

11. Wang, Zi. First-Order Methods for Large Scale *Convex* *Optimization*.

Degree: 2016, Penn State University

URL: https://submit-etda.libraries.psu.edu/catalog/13485zxw121

► The revolution of storage technology in the past few decades made it possible to gather tremendous amount of data anywhere from demand and sales records…

Subjects/Keywords: first-order methods; convex optimization; distributed optimization; convex regression; multi-agent consensus optimization

APA (6th Edition):

Wang, Z. (2016). First-Order Methods for Large Scale Convex Optimization. (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/13485zxw121

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Zi. “First-Order Methods for Large Scale Convex Optimization.” 2016. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/13485zxw121.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Zi. “First-Order Methods for Large Scale Convex Optimization.” 2016. Web. 04 Mar 2021.

Vancouver:

Wang Z. First-Order Methods for Large Scale Convex Optimization. [Internet] [Thesis]. Penn State University; 2016. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/13485zxw121.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang Z. First-Order Methods for Large Scale Convex Optimization. [Thesis]. Penn State University; 2016. Available from: https://submit-etda.libraries.psu.edu/catalog/13485zxw121

Not specified: Masters Thesis or Doctoral Dissertation

University of Iowa

12. Xu, Yi. Accelerating *convex* *optimization* in machine learning by leveraging functional growth conditions.

Degree: PhD, Computer Science, 2019, University of Iowa

URL: https://ir.uiowa.edu/etd/7048

► In recent years, unprecedented growths in scale and dimensionality of data raise big computational challenges for traditional *optimization* algorithms; thus it becomes very important…

Subjects/Keywords: Convex Optimization; Local Error Bound; Computer Sciences

APA (6th Edition):

Xu, Y. (2019). Accelerating convex optimization in machine learning by leveraging functional growth conditions. (Doctoral Dissertation). University of Iowa. Retrieved from https://ir.uiowa.edu/etd/7048

Chicago Manual of Style (16th Edition):

Xu, Yi. “Accelerating convex optimization in machine learning by leveraging functional growth conditions.” 2019. Doctoral Dissertation, University of Iowa. Accessed March 04, 2021. https://ir.uiowa.edu/etd/7048.

MLA Handbook (7th Edition):

Xu, Yi. “Accelerating convex optimization in machine learning by leveraging functional growth conditions.” 2019. Web. 04 Mar 2021.

Vancouver:

Xu Y. Accelerating convex optimization in machine learning by leveraging functional growth conditions. [Internet] [Doctoral dissertation]. University of Iowa; 2019. [cited 2021 Mar 04]. Available from: https://ir.uiowa.edu/etd/7048.

Council of Science Editors:

Xu Y. Accelerating convex optimization in machine learning by leveraging functional growth conditions. [Doctoral Dissertation]. University of Iowa; 2019. Available from: https://ir.uiowa.edu/etd/7048

Universidade Nova

13. Soares, Diogo Lopes. Design of multidimensional compact constellations with high power efficiency.

Degree: 2013, Universidade Nova

URL: http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111

Dissertation submitted to obtain the degree of Mestre (Master) in Electrical and Computer Engineering at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
*Advisors/Committee Members: Dinis, Rui, Beko, Marko.*

Subjects/Keywords: Multidimensional constellations; Power efficiency; Convex optimization

APA (6th Edition):

Soares, D. L. (2013). Design of multidimensional compact constellations with high power efficiency. (Thesis). Universidade Nova. Retrieved from http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Soares, Diogo Lopes. “Design of multidimensional compact constellations with high power efficiency.” 2013. Thesis, Universidade Nova. Accessed March 04, 2021. http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Soares, Diogo Lopes. “Design of multidimensional compact constellations with high power efficiency.” 2013. Web. 04 Mar 2021.

Vancouver:

Soares DL. Design of multidimensional compact constellations with high power efficiency. [Internet] [Thesis]. Universidade Nova; 2013. [cited 2021 Mar 04]. Available from: http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Soares DL. Design of multidimensional compact constellations with high power efficiency. [Thesis]. Universidade Nova; 2013. Available from: http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111

Not specified: Masters Thesis or Doctoral Dissertation

Université Catholique de Louvain

14. Orban de Xivry, François-Xavier. Nearest stable system.

Degree: 2013, Université Catholique de Louvain

URL: http://hdl.handle.net/2078.1/132586

► Stability is a universal concept which we experience in our everyday lives. It plays a central role in the study of dynamical systems and is…

Subjects/Keywords: Stability; Dynamical system; Convex optimization; Nonconvex; Nonsmooth

APA (6th Edition):

Orban de Xivry, F. (2013). Nearest stable system. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/132586

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Orban de Xivry, François-Xavier. “Nearest stable system.” 2013. Thesis, Université Catholique de Louvain. Accessed March 04, 2021. http://hdl.handle.net/2078.1/132586.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Orban de Xivry, François-Xavier. “Nearest stable system.” 2013. Web. 04 Mar 2021.

Vancouver:

Orban de Xivry F. Nearest stable system. [Internet] [Thesis]. Université Catholique de Louvain; 2013. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2078.1/132586.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Orban de Xivry F. Nearest stable system. [Thesis]. Université Catholique de Louvain; 2013. Available from: http://hdl.handle.net/2078.1/132586

Not specified: Masters Thesis or Doctoral Dissertation

University of Ghana

15. Katsekpor, T. Iterative Methods for Large Scale *Convex* *Optimization*.

Degree: 2017, University of Ghana

URL: http://ugspace.ug.edu.gh/handle/123456789/23393

► This thesis presents a detailed description and analysis of Bregman’s iterative method for *convex* programming with linear constraints. Row and block action methods for large…

Subjects/Keywords: Iterative Methods; Large Scale Convex; Optimization

APA (6th Edition):

Katsekpor, T. (2017). Iterative Methods for Large Scale Convex Optimization. (Doctoral Dissertation). University of Ghana. Retrieved from http://ugspace.ug.edu.gh/handle/123456789/23393

Chicago Manual of Style (16th Edition):

Katsekpor, T. “Iterative Methods for Large Scale Convex Optimization.” 2017. Doctoral Dissertation, University of Ghana. Accessed March 04, 2021. http://ugspace.ug.edu.gh/handle/123456789/23393.

MLA Handbook (7th Edition):

Katsekpor, T. “Iterative Methods for Large Scale Convex Optimization.” 2017. Web. 04 Mar 2021.

Vancouver:

Katsekpor T. Iterative Methods for Large Scale Convex Optimization. [Internet] [Doctoral dissertation]. University of Ghana; 2017. [cited 2021 Mar 04]. Available from: http://ugspace.ug.edu.gh/handle/123456789/23393.

Council of Science Editors:

Katsekpor T. Iterative Methods for Large Scale Convex Optimization. [Doctoral Dissertation]. University of Ghana; 2017. Available from: http://ugspace.ug.edu.gh/handle/123456789/23393

Iowa State University

16. Li, Chong. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.

Degree: 2013, Iowa State University

URL: https://lib.dr.iastate.edu/etd/13421

► Since the success of obtaining the capacity (i.e. the maximal achievable transmission rate under which the message can be recovered with arbitrarily small probability of…

Subjects/Keywords: Capacity; Convex Optimization; Feedback; Information Theory; Engineering

APA (6th Edition):

Li, C. (2013). Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. (Thesis). Iowa State University. Retrieved from https://lib.dr.iastate.edu/etd/13421

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Chong. “Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.” 2013. Thesis, Iowa State University. Accessed March 04, 2021. https://lib.dr.iastate.edu/etd/13421.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Chong. “Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.” 2013. Web. 04 Mar 2021.

Vancouver:

Li C. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. [Internet] [Thesis]. Iowa State University; 2013. [cited 2021 Mar 04]. Available from: https://lib.dr.iastate.edu/etd/13421.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li C. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. [Thesis]. Iowa State University; 2013. Available from: https://lib.dr.iastate.edu/etd/13421

Not specified: Masters Thesis or Doctoral Dissertation

University of Technology, Sydney

17. Ho, Huu Minh Tam. Interference management in 5G cellular networks.

Degree: 2016, University of Technology, Sydney

URL: http://hdl.handle.net/10453/102703

► This dissertation is concerned with the nonconvex *optimization* problems of interference management under the consideration of new disruptive technologies in the fifth-generation cellular networks. These…

Subjects/Keywords: Convex programming; Mathematical optimization; Algorithms; Nonconvex programming

APA (6th Edition):

Ho, H. M. T. (2016). Interference management in 5G cellular networks. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/102703

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ho, Huu Minh Tam. “Interference management in 5G cellular networks.” 2016. Thesis, University of Technology, Sydney. Accessed March 04, 2021. http://hdl.handle.net/10453/102703.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ho, Huu Minh Tam. “Interference management in 5G cellular networks.” 2016. Web. 04 Mar 2021.

Vancouver:

Ho HMT. Interference management in 5G cellular networks. [Internet] [Thesis]. University of Technology, Sydney; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10453/102703.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ho HMT. Interference management in 5G cellular networks. [Thesis]. University of Technology, Sydney; 2016. Available from: http://hdl.handle.net/10453/102703

Not specified: Masters Thesis or Doctoral Dissertation

Delft University of Technology

18. Zhang, H.M. (author). Distributed *Convex* *Optimization*: A Study on the Primal-Dual Method of Multipliers.

Degree: 2015, Delft University of Technology

URL: http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b

► The Primal-Dual Method of Multipliers (PDMM) is a new algorithm that solves *convex* *optimization* problems in a distributed manner. This study focuses on the convergence…

Subjects/Keywords: convex optimization; distributed signal processing; ADMM; PDMM

APA (6th Edition):

Zhang, H. M. (2015). Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b

Chicago Manual of Style (16^{th} Edition):

Zhang, H. M. “Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers.” 2015. Masters Thesis, Delft University of Technology. Accessed March 04, 2021. http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b.

MLA Handbook (7^{th} Edition):

Zhang, H. M. “Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers.” 2015. Web. 04 Mar 2021.

Vancouver:

Zhang HM. Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. [Internet] [Masters thesis]. Delft University of Technology; 2015. [cited 2021 Mar 04]. Available from: http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b.

Council of Science Editors:

Zhang HM. Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. [Masters Thesis]. Delft University of Technology; 2015. Available from: http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b

University of Minnesota

19.
Choi, Hyungjin.
Quantification of the Impact of Uncertainty in Power Systems using *Convex* *Optimization*.

Degree: PhD, Electrical Engineering, 2017, University of Minnesota

URL: http://hdl.handle.net/11299/190457

► Rampant integration of renewable resources (e.g., photovoltaic and wind-energy conversion systems) and uncontrollable and elastic loads (e.g., plug-in hybrid electric vehicles) are rapidly transforming power…
(more)

Subjects/Keywords: Convex Optimization; Power Systems; Sensitivity; Stability; Uncertainty

APA (6^{th} Edition):

Choi, H. (2017). Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/190457

Chicago Manual of Style (16^{th} Edition):

Choi, Hyungjin. “Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization.” 2017. Doctoral Dissertation, University of Minnesota. Accessed March 04, 2021. http://hdl.handle.net/11299/190457.

MLA Handbook (7^{th} Edition):

Choi, Hyungjin. “Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization.” 2017. Web. 04 Mar 2021.

Vancouver:

Choi H. Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. [Internet] [Doctoral dissertation]. University of Minnesota; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/11299/190457.

Council of Science Editors:

Choi H. Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. [Doctoral Dissertation]. University of Minnesota; 2017. Available from: http://hdl.handle.net/11299/190457

University of Waterloo

20.
Karimi, Mehdi.
*Convex* *Optimization* via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.

Degree: 2017, University of Waterloo

URL: http://hdl.handle.net/10012/12209

► This thesis studies the theory and implementation of infeasible-start primal-dual interior-point methods for *convex* *optimization* problems. *Convex* *optimization* has applications in many fields of engineering…
(more)

Subjects/Keywords: convex optimization; primal-dual interior-point methods

APA (6^{th} Edition):

Karimi, M. (2017). Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12209

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Karimi, Mehdi. “Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.” 2017. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/12209.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Karimi, Mehdi. “Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.” 2017. Web. 04 Mar 2021.

Vancouver:

Karimi M. Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/12209.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Karimi M. Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12209

Not specified: Masters Thesis or Doctoral Dissertation

21.
Umenberger, Jack.
*Convex* Identification of Stable Dynamical Systems.

Degree: 2017, University of Sydney

URL: http://hdl.handle.net/2123/17321

► This thesis concerns the scalable application of *convex* *optimization* to data-driven modeling of dynamical systems, termed system identification in the control community. Two problems…
(more)

Subjects/Keywords: system identification; convex optimization; positive systems

APA (6^{th} Edition):

Umenberger, J. (2017). Convex Identification of Stable Dynamical Systems. (Thesis). University of Sydney. Retrieved from http://hdl.handle.net/2123/17321

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Umenberger, Jack. “Convex Identification of Stable Dynamical Systems.” 2017. Thesis, University of Sydney. Accessed March 04, 2021. http://hdl.handle.net/2123/17321.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Umenberger, Jack. “Convex Identification of Stable Dynamical Systems.” 2017. Web. 04 Mar 2021.

Vancouver:

Umenberger J. Convex Identification of Stable Dynamical Systems. [Internet] [Thesis]. University of Sydney; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2123/17321.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Umenberger J. Convex Identification of Stable Dynamical Systems. [Thesis]. University of Sydney; 2017. Available from: http://hdl.handle.net/2123/17321

Not specified: Masters Thesis or Doctoral Dissertation

Delft University of Technology

22.
Zattoni Scroccaro, Pedro.
Online *Convex* *Optimization* with Predictions: Static and Dynamic Environments.

Degree: 2020, Delft University of Technology

URL: http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b

►

In this thesis, we study Online *Convex* *Optimization* algorithms that exploit predictive and/or dynamical information about a problem instance. These features are inspired by recent…
(more)

Subjects/Keywords: Online Convex Optimization; Prediction; Online Learning

APA (6^{th} Edition):

Zattoni Scroccaro, P. (2020). Online Convex Optimization with Predictions: Static and Dynamic Environments. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b

Chicago Manual of Style (16^{th} Edition):

Zattoni Scroccaro, Pedro. “Online Convex Optimization with Predictions: Static and Dynamic Environments.” 2020. Masters Thesis, Delft University of Technology. Accessed March 04, 2021. http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b.

MLA Handbook (7^{th} Edition):

Zattoni Scroccaro, Pedro. “Online Convex Optimization with Predictions: Static and Dynamic Environments.” 2020. Web. 04 Mar 2021.

Vancouver:

Zattoni Scroccaro P. Online Convex Optimization with Predictions: Static and Dynamic Environments. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2021 Mar 04]. Available from: http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b.

Council of Science Editors:

Zattoni Scroccaro P. Online Convex Optimization with Predictions: Static and Dynamic Environments. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b

University of Waterloo

23. Wang, Houze. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.

Degree: 2020, University of Waterloo

URL: http://hdl.handle.net/10012/16361

► *Optimization* is at the heart of many engineering problems. Many *optimization* problems, however, are computationally intractable. One approach to tackle such intractability is to find…
(more)

Subjects/Keywords: Convex Optimization; Signomial Programming; Algebraic Geometry

APA (6^{th} Edition):

Wang, H. (2020). A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/16361

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Wang, Houze. “A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.” 2020. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/16361.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Wang, Houze. “A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.” 2020. Web. 04 Mar 2021.

Vancouver:

Wang H. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/16361.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang H. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/16361

Not specified: Masters Thesis or Doctoral Dissertation

Portland State University

24.
Tran, Tuyen Dang Thanh.
*Convex* and Nonconvex *Optimization* Techniques for Multifacility Location and Clustering.

Degree: PhD, Mathematics and Statistics, 2020, Portland State University

URL: https://pdxscholar.library.pdx.edu/open_access_etds/5482

► This thesis contains contributions in two main areas: calculus rules for generalized differentiation and *optimization* methods for solving nonsmooth nonconvex problems with applications to…
(more)

Subjects/Keywords: Convex domains; Mathematical optimization; Calculus; Mathematics

APA (6^{th} Edition):

Tran, T. D. T. (2020). Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. (Doctoral Dissertation). Portland State University. Retrieved from https://pdxscholar.library.pdx.edu/open_access_etds/5482

Chicago Manual of Style (16^{th} Edition):

Tran, Tuyen Dang Thanh. “Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering.” 2020. Doctoral Dissertation, Portland State University. Accessed March 04, 2021. https://pdxscholar.library.pdx.edu/open_access_etds/5482.

MLA Handbook (7^{th} Edition):

Tran, Tuyen Dang Thanh. “Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering.” 2020. Web. 04 Mar 2021.

Vancouver:

Tran TDT. Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. [Internet] [Doctoral dissertation]. Portland State University; 2020. [cited 2021 Mar 04]. Available from: https://pdxscholar.library.pdx.edu/open_access_etds/5482.

Council of Science Editors:

Tran TDT. Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. [Doctoral Dissertation]. Portland State University; 2020. Available from: https://pdxscholar.library.pdx.edu/open_access_etds/5482

25.
Linhares Rodrigues, Andre.
Approximation Algorithms for Distributionally Robust Stochastic *Optimization*.

Degree: 2019, University of Waterloo

URL: http://hdl.handle.net/10012/14639

► Two-stage stochastic *optimization* is a widely used framework for modeling uncertainty, where we have a probability distribution over possible realizations of the data, called scenarios,…
(more)

Subjects/Keywords: approximation algorithms; stochastic optimization; discrete optimization; convex optimization

APA (6^{th} Edition):

Linhares Rodrigues, A. (2019). Approximation Algorithms for Distributionally Robust Stochastic Optimization. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/14639

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Linhares Rodrigues, Andre. “Approximation Algorithms for Distributionally Robust Stochastic Optimization.” 2019. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/14639.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Linhares Rodrigues, Andre. “Approximation Algorithms for Distributionally Robust Stochastic Optimization.” 2019. Web. 04 Mar 2021.

Vancouver:

Linhares Rodrigues A. Approximation Algorithms for Distributionally Robust Stochastic Optimization. [Internet] [Thesis]. University of Waterloo; 2019. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/14639.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Linhares Rodrigues A. Approximation Algorithms for Distributionally Robust Stochastic Optimization. [Thesis]. University of Waterloo; 2019. Available from: http://hdl.handle.net/10012/14639

Not specified: Masters Thesis or Doctoral Dissertation

Loughborough University

26.
Rossetti, Gaia.
Mathematical *optimization* techniques for cognitive radar networks.

Degree: PhD, 2018, Loughborough University

URL: http://hdl.handle.net/2134/33419

► This thesis discusses mathematical *optimization* techniques for waveform design in cognitive radars. These techniques have been designed with an increasing level of sophistication, starting from…
(more)

Subjects/Keywords: 621.3848; Waveform optimization; Convex optimization; Robust optimization; Cognitive radars

APA (6^{th} Edition):

Rossetti, G. (2018). Mathematical optimization techniques for cognitive radar networks. (Doctoral Dissertation). Loughborough University. Retrieved from http://hdl.handle.net/2134/33419

Chicago Manual of Style (16^{th} Edition):

Rossetti, Gaia. “Mathematical optimization techniques for cognitive radar networks.” 2018. Doctoral Dissertation, Loughborough University. Accessed March 04, 2021. http://hdl.handle.net/2134/33419.

MLA Handbook (7^{th} Edition):

Rossetti, Gaia. “Mathematical optimization techniques for cognitive radar networks.” 2018. Web. 04 Mar 2021.

Vancouver:

Rossetti G. Mathematical optimization techniques for cognitive radar networks. [Internet] [Doctoral dissertation]. Loughborough University; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2134/33419.

Council of Science Editors:

Rossetti G. Mathematical optimization techniques for cognitive radar networks. [Doctoral Dissertation]. Loughborough University; 2018. Available from: http://hdl.handle.net/2134/33419

Penn State University

27.
Jalilzadeh, Afrooz.
Variance-reduced First-Order Methods for *Convex* Stochastic *Optimization* and Monotone Stochastic Variational Inequality Problems.

Degree: 2020, Penn State University

URL: https://submit-etda.libraries.psu.edu/catalog/17968azj5286

► *Optimization* problems with expectation-valued objectives are afflicted by a difficulty in that the expectation of neither the objective nor the gradient can be evaluated in…
(more)

Subjects/Keywords: Stochastic Optimization; Convex Optimization; Stochastic Approximation; Nonsmooth Optimization; Stochastic Variational Inequality

APA (6^{th} Edition):

Jalilzadeh, A. (2020). Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/17968azj5286

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Jalilzadeh, Afrooz. “Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems.” 2020. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/17968azj5286.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Jalilzadeh, Afrooz. “Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems.” 2020. Web. 04 Mar 2021.

Vancouver:

Jalilzadeh A. Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. [Internet] [Thesis]. Penn State University; 2020. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/17968azj5286.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Jalilzadeh A. Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. [Thesis]. Penn State University; 2020. Available from: https://submit-etda.libraries.psu.edu/catalog/17968azj5286

Not specified: Masters Thesis or Doctoral Dissertation

Texas A&M University

28.
Fan, Siqi.
Learning Gaussian Latent Graphical Models Via Partial *Convex* *Optimization*.

Degree: MS, Electrical Engineering, 2019, Texas A&M University

URL: http://hdl.handle.net/1969.1/188729

► Latent Gaussian graphical models are very useful in probabilistic modeling to measure the statistical relationships between different variables and present them in the form of…
(more)

Subjects/Keywords: Gaussian latent graphical models; Convex optimization; Chow-Liu algorithm; CL Recursive Grouping; Partial convex optimization

APA (6^{th} Edition):

Fan, S. (2019). Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/188729

Chicago Manual of Style (16^{th} Edition):

Fan, Siqi. “Learning Gaussian Latent Graphical Models Via Partial Convex Optimization.” 2019. Masters Thesis, Texas A&M University. Accessed March 04, 2021. http://hdl.handle.net/1969.1/188729.

MLA Handbook (7^{th} Edition):

Fan, Siqi. “Learning Gaussian Latent Graphical Models Via Partial Convex Optimization.” 2019. Web. 04 Mar 2021.

Vancouver:

Fan S. Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. [Internet] [Masters thesis]. Texas A&M University; 2019. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/1969.1/188729.

Council of Science Editors:

Fan S. Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. [Masters Thesis]. Texas A&M University; 2019. Available from: http://hdl.handle.net/1969.1/188729

University of Waterloo

29. Sremac, Stefan. Error Bounds and Singularity Degree in Semidefinite Programming.

Degree: 2020, University of Waterloo

URL: http://hdl.handle.net/10012/15583

► An important process in *optimization* is to determine the quality of a proposed solution. This usually entails calculation of the distance of a proposed solution…
(more)

Subjects/Keywords: semidefinite programming; optimization; error bounds; singularity degree; mathematical programming; convex optimization; conic optimization; Semidefinite programming; Combinatorial optimization; Programming (Mathematics); Convex programming

APA (6^{th} Edition):

Sremac, S. (2020). Error Bounds and Singularity Degree in Semidefinite Programming. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/15583

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Sremac, Stefan. “Error Bounds and Singularity Degree in Semidefinite Programming.” 2020. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/15583.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Sremac, Stefan. “Error Bounds and Singularity Degree in Semidefinite Programming.” 2020. Web. 04 Mar 2021.

Vancouver:

Sremac S. Error Bounds and Singularity Degree in Semidefinite Programming. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/15583.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sremac S. Error Bounds and Singularity Degree in Semidefinite Programming. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/15583

Not specified: Masters Thesis or Doctoral Dissertation

Penn State University

30. Kang, Bosung. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).

Degree: 2015, Penn State University

URL: https://submit-etda.libraries.psu.edu/catalog/26539

► Estimating the disturbance or clutter covariance is a centrally important problem in radar space time adaptive processing (STAP) since estimation of the disturbance or interference…
(more)

Subjects/Keywords: convex optimization; STAP; radar signal processing; constrained optimization; detection and estimation

APA (6^{th} Edition):

Kang, B. (2015). Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/26539

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16^{th} Edition):

Kang, Bosung. “Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).” 2015. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/26539.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7^{th} Edition):

Kang, Bosung. “Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).” 2015. Web. 04 Mar 2021.

Vancouver:

Kang B. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). [Internet] [Thesis]. Penn State University; 2015. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/26539.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Kang B. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). [Thesis]. Penn State University; 2015. Available from: https://submit-etda.libraries.psu.edu/catalog/26539

Not specified: Masters Thesis or Doctoral Dissertation