You searched for subject:(Convex Optimization). Showing records 1 – 30 of 520 total matches.

1. Uthayakumar, R. Study on convergence of optimization problems;.

Degree: 2014, INFLIBNET

In this thesis, various notions of convergence of sequences of sets and functions, and their applications to the convergence of the optimal values under the… (more)

Subjects/Keywords: Convergence; Convex; Functions; Non-convex; Optimization; Sets

APA (6th Edition):

Uthayakumar, R. (2014). Study on convergence of optimization problems;. (Thesis). INFLIBNET. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/17964

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Thesis, INFLIBNET. Accessed March 04, 2021. http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Uthayakumar, R. “Study on convergence of optimization problems;.” 2014. Web. 04 Mar 2021.

Vancouver:

Uthayakumar R. Study on convergence of optimization problems;. [Internet] [Thesis]. INFLIBNET; 2014. [cited 2021 Mar 04]. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Uthayakumar R. Study on convergence of optimization problems;. [Thesis]. INFLIBNET; 2014. Available from: http://shodhganga.inflibnet.ac.in/handle/10603/17964

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


NSYSU

2. Zhang, Shu-Bin. Study on Digital Filter Design and Coefficient Quantization.

Degree: Master, Communications Engineering, 2011, NSYSU

 In this thesis, the basic theory is convex optimization theory [1], and we study how to transform the problem into a convex optimization problem from the… (more)

Subjects/Keywords: Filter; Optimization; Convex; Bits; Quantization

APA (6th Edition):

Zhang, S. (2011). Study on Digital Filter Design and Coefficient Quantization. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhang, Shu-Bin. “Study on Digital Filter Design and Coefficient Quantization.” 2011. Thesis, NSYSU. Accessed March 04, 2021. http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhang, Shu-Bin. “Study on Digital Filter Design and Coefficient Quantization.” 2011. Web. 04 Mar 2021.

Vancouver:

Zhang S. Study on Digital Filter Design and Coefficient Quantization. [Internet] [Thesis]. NSYSU; 2011. [cited 2021 Mar 04]. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhang S. Study on Digital Filter Design and Coefficient Quantization. [Thesis]. NSYSU; 2011. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0727111-135237

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Victoria University of Wellington

3. Jellyman, Dayle Raymond. Convex Optimization for Distributed Acoustic Beamforming.

Degree: 2017, Victoria University of Wellington

 Beamforming filter optimization can be performed over a distributed wireless sensor network, but the output calculation remains either centralized or linked in time to the… (more)

Subjects/Keywords: Distributed; Beamforming; Convex optimization

APA (6th Edition):

Jellyman, D. R. (2017). Convex Optimization for Distributed Acoustic Beamforming. (Masters Thesis). Victoria University of Wellington. Retrieved from http://hdl.handle.net/10063/6650

Chicago Manual of Style (16th Edition):

Jellyman, Dayle Raymond. “Convex Optimization for Distributed Acoustic Beamforming.” 2017. Masters Thesis, Victoria University of Wellington. Accessed March 04, 2021. http://hdl.handle.net/10063/6650.

MLA Handbook (7th Edition):

Jellyman, Dayle Raymond. “Convex Optimization for Distributed Acoustic Beamforming.” 2017. Web. 04 Mar 2021.

Vancouver:

Jellyman DR. Convex Optimization for Distributed Acoustic Beamforming. [Internet] [Masters thesis]. Victoria University of Wellington; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10063/6650.

Council of Science Editors:

Jellyman DR. Convex Optimization for Distributed Acoustic Beamforming. [Masters Thesis]. Victoria University of Wellington; 2017. Available from: http://hdl.handle.net/10063/6650


Université Catholique de Louvain

4. Martin, Benoît. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization.

Degree: 2018, Université Catholique de Louvain

Autonomous microgrid planning requires considering both investments in network and generation assets, as there is no connection to another power system. In this problem,… (more)

Subjects/Keywords: Planning; Microgrid; Convex optimization

APA (6th Edition):

Martin, B. (2018). Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/214246

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Martin, Benoît. “Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization.” 2018. Thesis, Université Catholique de Louvain. Accessed March 04, 2021. http://hdl.handle.net/2078.1/214246.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Martin, Benoît. “Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization.” 2018. Web. 04 Mar 2021.

Vancouver:

Martin B. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. [Internet] [Thesis]. Université Catholique de Louvain; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2078.1/214246.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Martin B. Autonomous microgrids for rural electrification : joint investment planning of power generation and distribution through convex optimization. [Thesis]. Université Catholique de Louvain; 2018. Available from: http://hdl.handle.net/2078.1/214246

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

5. Cosentino, Alessandro. Quantum State Local Distinguishability via Convex Optimization.

Degree: 2015, University of Waterloo

 Entanglement and nonlocality play a fundamental role in quantum computing. To understand the interplay between these phenomena, researchers have considered the model of local operations… (more)

Subjects/Keywords: quantum information; convex optimization

APA (6th Edition):

Cosentino, A. (2015). Quantum State Local Distinguishability via Convex Optimization. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/9572

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Cosentino, Alessandro. “Quantum State Local Distinguishability via Convex Optimization.” 2015. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/9572.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Cosentino, Alessandro. “Quantum State Local Distinguishability via Convex Optimization.” 2015. Web. 04 Mar 2021.

Vancouver:

Cosentino A. Quantum State Local Distinguishability via Convex Optimization. [Internet] [Thesis]. University of Waterloo; 2015. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/9572.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Cosentino A. Quantum State Local Distinguishability via Convex Optimization. [Thesis]. University of Waterloo; 2015. Available from: http://hdl.handle.net/10012/9572

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Texas – Austin

6. Berning, Andrew Walter, Jr. Verification of successive convexification algorithm.

Degree: MS in Engineering, Aerospace engineering, 2016, University of Texas – Austin

 In this report, I describe a technique which allows a non-convex optimal control problem to be expressed and solved in a convex manner. I then… (more)
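
For orientation, the loop below is a minimal sketch of the general successive-convexification idea named in the title: linearize the non-convex part of the problem around the current iterate, solve the resulting convex subproblem inside a trust region, and repeat. The toy problem, starting point, and trust-region radius are illustrative assumptions and this is not the algorithm verified in the report; the sketch uses the cvxpy modeling library.

import numpy as np
import cvxpy as cp

# Toy non-convex problem (assumed for illustration):
#   minimize ||x||^2  subject to  x[0] * x[1] >= 1.
x_k = np.array([2.0, 2.0])                     # feasible starting guess
for _ in range(25):
    x = cp.Variable(2)
    a, b = x_k
    # Affine (first-order) model of the non-convex constraint function x[0]*x[1] at x_k.
    lin = a * b + b * (x[0] - a) + a * (x[1] - b)
    prob = cp.Problem(cp.Minimize(cp.sum_squares(x)),
                      [lin >= 1,
                       cp.norm(x - x_k, "inf") <= 0.5])   # trust region keeps the model valid
    prob.solve()
    if np.linalg.norm(x.value - x_k) < 1e-6:   # stop once the iterates settle
        break
    x_k = x.value
print(x_k)                                     # approaches (1, 1) on this toy instance

In the optimal-control setting the abstract describes, the linearized objects are typically the system dynamics and non-convex constraints rather than a single scalar inequality, but the solve, linearize, repeat structure is the same.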

Subjects/Keywords: Convex; Convexification; Optimization; Verification

APA (6th Edition):

Berning, Andrew Walter, J. (2016). Verification of successive convexification algorithm. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/41579

Chicago Manual of Style (16th Edition):

Berning, Andrew Walter, Jr. “Verification of successive convexification algorithm.” 2016. Masters Thesis, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/41579.

MLA Handbook (7th Edition):

Berning, Andrew Walter, Jr. “Verification of successive convexification algorithm.” 2016. Web. 04 Mar 2021.

Vancouver:

Berning, Andrew Walter J. Verification of successive convexification algorithm. [Internet] [Masters thesis]. University of Texas – Austin; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/41579.

Council of Science Editors:

Berning, Andrew Walter J. Verification of successive convexification algorithm. [Masters Thesis]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/41579


University of Texas – Austin

7. Park, Dohyung. Efficient non-convex algorithms for large-scale learning problems.

Degree: PhD, Electrical and Computer engineering, 2016, University of Texas – Austin

 The emergence of modern large-scale datasets has led to a huge interest in the problem of learning hidden complex structures. Not only can models from… (more)

Subjects/Keywords: Machine learning; Non-convex optimization

APA (6th Edition):

Park, D. (2016). Efficient non-convex algorithms for large-scale learning problems. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/46581

Chicago Manual of Style (16th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Doctoral Dissertation, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/46581.

MLA Handbook (7th Edition):

Park, Dohyung. “Efficient non-convex algorithms for large-scale learning problems.” 2016. Web. 04 Mar 2021.

Vancouver:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/46581.

Council of Science Editors:

Park D. Efficient non-convex algorithms for large-scale learning problems. [Doctoral Dissertation]. University of Texas – Austin; 2016. Available from: http://hdl.handle.net/2152/46581


University of Texas – Austin

8. Wang, Ye, Ph. D. Novel convex optimization techniques for circuit analysis and synthesis.

Degree: PhD, Electrical and Computer Engineering, 2018, University of Texas – Austin

 Technology scaling brings about the need for computationally efficient methods for circuit analysis, optimization, and synthesis. Convex optimization is a special class of mathematical optimization(more)

Subjects/Keywords: Convex optimization; EDA problems

APA (6th Edition):

Wang, Ye, P. D. (2018). Novel convex optimization techniques for circuit analysis and synthesis. (Doctoral Dissertation). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/67661

Chicago Manual of Style (16th Edition):

Wang, Ye, Ph D. “Novel convex optimization techniques for circuit analysis and synthesis.” 2018. Doctoral Dissertation, University of Texas – Austin. Accessed March 04, 2021. http://hdl.handle.net/2152/67661.

MLA Handbook (7th Edition):

Wang, Ye, Ph D. “Novel convex optimization techniques for circuit analysis and synthesis.” 2018. Web. 04 Mar 2021.

Vancouver:

Wang, Ye PD. Novel convex optimization techniques for circuit analysis and synthesis. [Internet] [Doctoral dissertation]. University of Texas – Austin; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2152/67661.

Council of Science Editors:

Wang, Ye PD. Novel convex optimization techniques for circuit analysis and synthesis. [Doctoral Dissertation]. University of Texas – Austin; 2018. Available from: http://hdl.handle.net/2152/67661


Rutgers University

9. Yao, Wang, 1985-. Approximate versions of the alternating direction method of multipliers.

Degree: PhD, Operations Research, 2016, Rutgers University

Convex optimization is at the core of many of today's analysis tools for large datasets, and in particular machine learning methods. This thesis will develop… (more)
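
For background on the algorithm named in the title, the snippet below is a minimal numpy sketch of the standard scaled-form ADMM iteration applied to the lasso problem minimize (1/2)||Ax - b||^2 + lam*||z||_1 subject to x - z = 0, following the usual textbook presentation. The data, penalty parameter rho, and iteration count are illustrative assumptions; this is the exact iteration, not the approximate variants the thesis develops.

import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of kappa * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u is the scaled dual variable
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))   # factor once for the x-update
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u).
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)            # z-update: prox of the l1 term
        u = u + x - z                                   # scaled dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(lasso_admm(A, b, lam=0.5), 2))           # returns a sparse estimate of x_true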

Subjects/Keywords: Mathematical optimization; Convex functions

APA (6th Edition):

Yao, Wang, 1. (2016). Approximate versions of the alternating direction method of multipliers. (Doctoral Dissertation). Rutgers University. Retrieved from https://rucore.libraries.rutgers.edu/rutgers-lib/51517/

Chicago Manual of Style (16th Edition):

Yao, Wang, 1985-. “Approximate versions of the alternating direction method of multipliers.” 2016. Doctoral Dissertation, Rutgers University. Accessed March 04, 2021. https://rucore.libraries.rutgers.edu/rutgers-lib/51517/.

MLA Handbook (7th Edition):

Yao, Wang, 1985-. “Approximate versions of the alternating direction method of multipliers.” 2016. Web. 04 Mar 2021.

Vancouver:

Yao, Wang 1. Approximate versions of the alternating direction method of multipliers. [Internet] [Doctoral dissertation]. Rutgers University; 2016. [cited 2021 Mar 04]. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/51517/.

Council of Science Editors:

Yao, Wang 1. Approximate versions of the alternating direction method of multipliers. [Doctoral Dissertation]. Rutgers University; 2016. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/51517/


Princeton University

10. Bullins, Brian Anderson. Efficient Higher-Order Optimization for Machine Learning .

Degree: PhD, 2019, Princeton University

 In recent years, stochastic gradient descent (SGD) has taken center stage for training large-scale models in machine learning. Although some higher-order methods have improved iteration… (more)

Subjects/Keywords: convex optimization; higher-order; machine learning; non-convex optimization; second-order

APA (6th Edition):

Bullins, B. A. (2019). Efficient Higher-Order Optimization for Machine Learning . (Doctoral Dissertation). Princeton University. Retrieved from http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c

Chicago Manual of Style (16th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning .” 2019. Doctoral Dissertation, Princeton University. Accessed March 04, 2021. http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

MLA Handbook (7th Edition):

Bullins, Brian Anderson. “Efficient Higher-Order Optimization for Machine Learning .” 2019. Web. 04 Mar 2021.

Vancouver:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning . [Internet] [Doctoral dissertation]. Princeton University; 2019. [cited 2021 Mar 04]. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c.

Council of Science Editors:

Bullins BA. Efficient Higher-Order Optimization for Machine Learning . [Doctoral Dissertation]. Princeton University; 2019. Available from: http://arks.princeton.edu/ark:/88435/dsp01zg64tp85c


Penn State University

11. Wang, Zi. First-Order Methods for Large Scale Convex Optimization.

Degree: 2016, Penn State University

 The revolution in storage technology over the past few decades has made it possible to gather tremendous amounts of data, anywhere from demand and sales records… (more)

Subjects/Keywords: first-order methods; convex optimization; distributed optimization; convex regression; multi-agent consensus optimization

APA (6th Edition):

Wang, Z. (2016). First-Order Methods for Large Scale Convex Optimization. (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/13485zxw121

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Zi. “First-Order Methods for Large Scale Convex Optimization.” 2016. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/13485zxw121.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Zi. “First-Order Methods for Large Scale Convex Optimization.” 2016. Web. 04 Mar 2021.

Vancouver:

Wang Z. First-Order Methods for Large Scale Convex Optimization. [Internet] [Thesis]. Penn State University; 2016. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/13485zxw121.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang Z. First-Order Methods for Large Scale Convex Optimization. [Thesis]. Penn State University; 2016. Available from: https://submit-etda.libraries.psu.edu/catalog/13485zxw121

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Iowa

12. Xu, Yi. Accelerating convex optimization in machine learning by leveraging functional growth conditions.

Degree: PhD, Computer Science, 2019, University of Iowa

  In recent years, unprecedented growth in the scale and dimensionality of data has raised major computational challenges for traditional optimization algorithms; thus it becomes very important… (more)

Subjects/Keywords: Convex Optimization; Local Error Bound; Computer Sciences

APA (6th Edition):

Xu, Y. (2019). Accelerating convex optimization in machine learning by leveraging functional growth conditions. (Doctoral Dissertation). University of Iowa. Retrieved from https://ir.uiowa.edu/etd/7048

Chicago Manual of Style (16th Edition):

Xu, Yi. “Accelerating convex optimization in machine learning by leveraging functional growth conditions.” 2019. Doctoral Dissertation, University of Iowa. Accessed March 04, 2021. https://ir.uiowa.edu/etd/7048.

MLA Handbook (7th Edition):

Xu, Yi. “Accelerating convex optimization in machine learning by leveraging functional growth conditions.” 2019. Web. 04 Mar 2021.

Vancouver:

Xu Y. Accelerating convex optimization in machine learning by leveraging functional growth conditions. [Internet] [Doctoral dissertation]. University of Iowa; 2019. [cited 2021 Mar 04]. Available from: https://ir.uiowa.edu/etd/7048.

Council of Science Editors:

Xu Y. Accelerating convex optimization in machine learning by leveraging functional growth conditions. [Doctoral Dissertation]. University of Iowa; 2019. Available from: https://ir.uiowa.edu/etd/7048


Universidade Nova

13. Soares, Diogo Lopes. Design of multidimensional compact constellations with high power efficiency.

Degree: 2013, Universidade Nova

Dissertation presented to obtain the degree of Master in Electrical and Computer Engineering at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia. Advisors/Committee Members: Dinis, Rui; Beko, Marko.

Subjects/Keywords: Multidimensional constellations; Power efficiency; Convex optimization

APA (6th Edition):

Soares, D. L. (2013). Design of multidimensional compact constellations with high power efficiency. (Thesis). Universidade Nova. Retrieved from http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Soares, Diogo Lopes. “Design of multidimensional compact constellations with high power efficiency.” 2013. Thesis, Universidade Nova. Accessed March 04, 2021. http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Soares, Diogo Lopes. “Design of multidimensional compact constellations with high power efficiency.” 2013. Web. 04 Mar 2021.

Vancouver:

Soares DL. Design of multidimensional compact constellations with high power efficiency. [Internet] [Thesis]. Universidade Nova; 2013. [cited 2021 Mar 04]. Available from: http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Soares DL. Design of multidimensional compact constellations with high power efficiency. [Thesis]. Universidade Nova; 2013. Available from: http://www.rcaap.pt/detail.jsp?id=oai:run.unl.pt:10362/11111

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Université Catholique de Louvain

14. Orban de Xivry, François-Xavier. Nearest stable system.

Degree: 2013, Université Catholique de Louvain

Stability is a universal concept which we experience in our everyday lives. It plays a central role in the study of dynamical systems and is… (more)

Subjects/Keywords: Stability; Dynamical system; Convex optimization; Nonconvex; Nonsmooth

APA (6th Edition):

Orban de Xivry, F. (2013). Nearest stable system. (Thesis). Université Catholique de Louvain. Retrieved from http://hdl.handle.net/2078.1/132586

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Orban de Xivry, François-Xavier. “Nearest stable system.” 2013. Thesis, Université Catholique de Louvain. Accessed March 04, 2021. http://hdl.handle.net/2078.1/132586.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Orban de Xivry, François-Xavier. “Nearest stable system.” 2013. Web. 04 Mar 2021.

Vancouver:

Orban de Xivry F. Nearest stable system. [Internet] [Thesis]. Université Catholique de Louvain; 2013. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2078.1/132586.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Orban de Xivry F. Nearest stable system. [Thesis]. Université Catholique de Louvain; 2013. Available from: http://hdl.handle.net/2078.1/132586

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Ghana

15. Katsekpor, T. Iterative Methods for Large Scale Convex Optimization .

Degree: 2017, University of Ghana

 This thesis presents a detailed description and analysis of Bregman’s iterative method for convex programming with linear constraints. Row and block action methods for large… (more)

Subjects/Keywords: Iterative Methods; Large Scale Convex; Optimization

APA (6th Edition):

Katsekpor, T. (2017). Iterative Methods for Large Scale Convex Optimization . (Doctoral Dissertation). University of Ghana. Retrieved from http://ugspace.ug.edu.gh/handle/123456789/23393

Chicago Manual of Style (16th Edition):

Katsekpor, T. “Iterative Methods for Large Scale Convex Optimization .” 2017. Doctoral Dissertation, University of Ghana. Accessed March 04, 2021. http://ugspace.ug.edu.gh/handle/123456789/23393.

MLA Handbook (7th Edition):

Katsekpor, T. “Iterative Methods for Large Scale Convex Optimization .” 2017. Web. 04 Mar 2021.

Vancouver:

Katsekpor T. Iterative Methods for Large Scale Convex Optimization . [Internet] [Doctoral dissertation]. University of Ghana; 2017. [cited 2021 Mar 04]. Available from: http://ugspace.ug.edu.gh/handle/123456789/23393.

Council of Science Editors:

Katsekpor T. Iterative Methods for Large Scale Convex Optimization . [Doctoral Dissertation]. University of Ghana; 2017. Available from: http://ugspace.ug.edu.gh/handle/123456789/23393


Iowa State University

16. Li, Chong. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.

Degree: 2013, Iowa State University

 Since the success in obtaining the capacity (i.e., the maximal achievable transmission rate at which the message can be recovered with arbitrarily small probability of… (more)

Subjects/Keywords: Capacity; Convex Optimization; Feedback; Information Theory; Engineering

APA (6th Edition):

Li, C. (2013). Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. (Thesis). Iowa State University. Retrieved from https://lib.dr.iastate.edu/etd/13421

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Li, Chong. “Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.” 2013. Thesis, Iowa State University. Accessed March 04, 2021. https://lib.dr.iastate.edu/etd/13421.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Li, Chong. “Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds.” 2013. Web. 04 Mar 2021.

Vancouver:

Li C. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. [Internet] [Thesis]. Iowa State University; 2013. [cited 2021 Mar 04]. Available from: https://lib.dr.iastate.edu/etd/13421.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Li C. Fundamental limitations on communication channels with noisy feedback: information flow, capacity and bounds. [Thesis]. Iowa State University; 2013. Available from: https://lib.dr.iastate.edu/etd/13421

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Technology, Sydney

17. Ho, Huu Minh Tam. Interference management in 5G cellular networks.

Degree: 2016, University of Technology, Sydney

 This dissertation is concerned with the nonconvex optimization problems of interference management in light of new disruptive technologies in fifth-generation cellular networks. These… (more)

Subjects/Keywords: Convex programming.; Mathematical optimization.; Algorithms.; Nonconvex programming.

APA (6th Edition):

Ho, H. M. T. (2016). Interference management in 5G cellular networks. (Thesis). University of Technology, Sydney. Retrieved from http://hdl.handle.net/10453/102703

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Ho, Huu Minh Tam. “Interference management in 5G cellular networks.” 2016. Thesis, University of Technology, Sydney. Accessed March 04, 2021. http://hdl.handle.net/10453/102703.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Ho, Huu Minh Tam. “Interference management in 5G cellular networks.” 2016. Web. 04 Mar 2021.

Vancouver:

Ho HMT. Interference management in 5G cellular networks. [Internet] [Thesis]. University of Technology, Sydney; 2016. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10453/102703.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Ho HMT. Interference management in 5G cellular networks. [Thesis]. University of Technology, Sydney; 2016. Available from: http://hdl.handle.net/10453/102703

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Delft University of Technology

18. Zhang, H.M. (author). Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers.

Degree: 2015, Delft University of Technology

The Primal-Dual Method of Multipliers (PDMM) is a new algorithm that solves convex optimization problems in a distributed manner. This study focuses on the convergence… (more)

Subjects/Keywords: convex optimization; distributed signal processing; ADMM; PDMM

APA (6th Edition):

Zhang, H. M. (. (2015). Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b

Chicago Manual of Style (16th Edition):

Zhang, H M (author). “Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers.” 2015. Masters Thesis, Delft University of Technology. Accessed March 04, 2021. http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b.

MLA Handbook (7th Edition):

Zhang, H M (author). “Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers.” 2015. Web. 04 Mar 2021.

Vancouver:

Zhang HM(. Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. [Internet] [Masters thesis]. Delft University of Technology; 2015. [cited 2021 Mar 04]. Available from: http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b.

Council of Science Editors:

Zhang HM(. Distributed Convex Optimization: A Study on the Primal-Dual Method of Multipliers. [Masters Thesis]. Delft University of Technology; 2015. Available from: http://resolver.tudelft.nl/uuid:932db0bb-da4c-4ffe-892a-036d01a8071b


University of Minnesota

19. Choi, Hyungjin. Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization.

Degree: PhD, Electrical Engineering, 2017, University of Minnesota

 Rampant integration of renewable resources (e.g., photovoltaic and wind-energy conversion systems) and uncontrollable and elastic loads (e.g., plug-in hybrid electric vehicles) are rapidly transforming power… (more)

Subjects/Keywords: Convex Optimization; Power Systems; Sensitivity; Stability; Uncertainty

APA (6th Edition):

Choi, H. (2017). Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. (Doctoral Dissertation). University of Minnesota. Retrieved from http://hdl.handle.net/11299/190457

Chicago Manual of Style (16th Edition):

Choi, Hyungjin. “Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization.” 2017. Doctoral Dissertation, University of Minnesota. Accessed March 04, 2021. http://hdl.handle.net/11299/190457.

MLA Handbook (7th Edition):

Choi, Hyungjin. “Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization.” 2017. Web. 04 Mar 2021.

Vancouver:

Choi H. Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. [Internet] [Doctoral dissertation]. University of Minnesota; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/11299/190457.

Council of Science Editors:

Choi H. Quantification of the Impact of Uncertainty in Power Systems using Convex Optimization. [Doctoral Dissertation]. University of Minnesota; 2017. Available from: http://hdl.handle.net/11299/190457


University of Waterloo

20. Karimi, Mehdi. Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.

Degree: 2017, University of Waterloo

 This thesis studies the theory and implementation of infeasible-start primal-dual interior-point methods for convex optimization problems. Convex optimization has applications in many fields of engineering… (more)

Subjects/Keywords: convex optimization; primal-dual interior-point methods

APA (6th Edition):

Karimi, M. (2017). Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/12209

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Karimi, Mehdi. “Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.” 2017. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/12209.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Karimi, Mehdi. “Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods.” 2017. Web. 04 Mar 2021.

Vancouver:

Karimi M. Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. [Internet] [Thesis]. University of Waterloo; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/12209.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Karimi M. Convex Optimization via Domain-Driven Barriers and Primal-Dual Interior-Point Methods. [Thesis]. University of Waterloo; 2017. Available from: http://hdl.handle.net/10012/12209

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

21. Umenberger, Jack. Convex Identification of Stable Dynamical Systems .

Degree: 2017, University of Sydney

 This thesis concerns the scalable application of convex optimization to data-driven modeling of dynamical systems, termed system identification in the control community. Two problems… (more)

Subjects/Keywords: system identification; convex optimization; positive systems

APA (6th Edition):

Umenberger, J. (2017). Convex Identification of Stable Dynamical Systems . (Thesis). University of Sydney. Retrieved from http://hdl.handle.net/2123/17321

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Umenberger, Jack. “Convex Identification of Stable Dynamical Systems .” 2017. Thesis, University of Sydney. Accessed March 04, 2021. http://hdl.handle.net/2123/17321.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Umenberger, Jack. “Convex Identification of Stable Dynamical Systems .” 2017. Web. 04 Mar 2021.

Vancouver:

Umenberger J. Convex Identification of Stable Dynamical Systems . [Internet] [Thesis]. University of Sydney; 2017. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2123/17321.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Umenberger J. Convex Identification of Stable Dynamical Systems . [Thesis]. University of Sydney; 2017. Available from: http://hdl.handle.net/2123/17321

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Delft University of Technology

22. Zattoni Scroccaro, Pedro (author). Online Convex Optimization with Predictions: Static and Dynamic Environments.

Degree: 2020, Delft University of Technology

In this thesis, we study Online Convex Optimization algorithms that exploit predictive and/or dynamical information about a problem instance. These features are inspired by recent… (more)
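
For context on the Online Convex Optimization setting the abstract refers to, the snippet below sketches plain online projected gradient descent against a stream of convex losses and reports static regret. The loss sequence, feasible set, and step sizes are illustrative assumptions, and the sketch deliberately uses none of the predictive or dynamical information the thesis studies.

import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto {x : ||x|| <= radius}, the assumed feasible set.
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

rng = np.random.default_rng(0)
d, T = 5, 200
centers = 0.3 * rng.standard_normal((T, d)) + 1.0 / np.sqrt(d)  # drifting loss parameters
x = np.zeros(d)
plays = []
for t in range(1, T + 1):
    plays.append(x.copy())
    grad = 2.0 * (x - centers[t - 1])           # gradient of f_t(x) = ||x - c_t||^2
    x = project_ball(x - grad / np.sqrt(t))     # step size eta_t = 1/sqrt(t), then project
loss = lambda w, c: float(np.sum((w - c) ** 2))
best_fixed = project_ball(centers.mean(axis=0)) # best single point in hindsight for this loss
regret = (sum(loss(w, c) for w, c in zip(plays, centers))
          - sum(loss(best_fixed, c) for c in centers))
print(f"static regret over T={T} rounds: {regret:.2f}")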

Subjects/Keywords: Online Convex Optimization; Prediction; Online Learning

APA (6th Edition):

Zattoni Scroccaro, P. (. (2020). Online Convex Optimization with Predictions: Static and Dynamic Environments. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b

Chicago Manual of Style (16th Edition):

Zattoni Scroccaro, Pedro (author). “Online Convex Optimization with Predictions: Static and Dynamic Environments.” 2020. Masters Thesis, Delft University of Technology. Accessed March 04, 2021. http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b.

MLA Handbook (7th Edition):

Zattoni Scroccaro, Pedro (author). “Online Convex Optimization with Predictions: Static and Dynamic Environments.” 2020. Web. 04 Mar 2021.

Vancouver:

Zattoni Scroccaro P(. Online Convex Optimization with Predictions: Static and Dynamic Environments. [Internet] [Masters thesis]. Delft University of Technology; 2020. [cited 2021 Mar 04]. Available from: http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b.

Council of Science Editors:

Zattoni Scroccaro P(. Online Convex Optimization with Predictions: Static and Dynamic Environments. [Masters Thesis]. Delft University of Technology; 2020. Available from: http://resolver.tudelft.nl/uuid:ce13b0da-fb0a-4e9f-b5a4-ef9b0dadf29b


University of Waterloo

23. Wang, Houze. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.

Degree: 2020, University of Waterloo

Optimization is at the heart of many engineering problems. Many optimization problems, however, are computationally intractable. One approach to tackle such intractability is to find… (more)

Subjects/Keywords: Convex Optimization; Signomial Programming; Algebraic Geometry

APA (6th Edition):

Wang, H. (2020). A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/16361

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Wang, Houze. “A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.” 2020. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/16361.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Wang, Houze. “A Convergent Hierarchy of Certificates for Constrained Signomial Positivity.” 2020. Web. 04 Mar 2021.

Vancouver:

Wang H. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/16361.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Wang H. A Convergent Hierarchy of Certificates for Constrained Signomial Positivity. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/16361

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Portland State University

24. Tran, Tuyen Dang Thanh. Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering.

Degree: PhD, Mathematics and Statistics, 2020, Portland State University

  This thesis contains contributions in two main areas: calculus rules for generalized differentiation and optimization methods for solving nonsmooth nonconvex problems with applications to… (more)

Subjects/Keywords: Convex domains; Mathematical optimization; Calculus; Mathematics

APA (6th Edition):

Tran, T. D. T. (2020). Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. (Doctoral Dissertation). Portland State University. Retrieved from https://pdxscholar.library.pdx.edu/open_access_etds/5482

Chicago Manual of Style (16th Edition):

Tran, Tuyen Dang Thanh. “Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering.” 2020. Doctoral Dissertation, Portland State University. Accessed March 04, 2021. https://pdxscholar.library.pdx.edu/open_access_etds/5482.

MLA Handbook (7th Edition):

Tran, Tuyen Dang Thanh. “Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering.” 2020. Web. 04 Mar 2021.

Vancouver:

Tran TDT. Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. [Internet] [Doctoral dissertation]. Portland State University; 2020. [cited 2021 Mar 04]. Available from: https://pdxscholar.library.pdx.edu/open_access_etds/5482.

Council of Science Editors:

Tran TDT. Convex and Nonconvex Optimization Techniques for Multifacility Location and Clustering. [Doctoral Dissertation]. Portland State University; 2020. Available from: https://pdxscholar.library.pdx.edu/open_access_etds/5482

25. Linhares Rodrigues, Andre. Approximation Algorithms for Distributionally Robust Stochastic Optimization.

Degree: 2019, University of Waterloo

 Two-stage stochastic optimization is a widely used framework for modeling uncertainty, where we have a probability distribution over possible realizations of the data, called scenarios,… (more)

Subjects/Keywords: approximation algorithms; stochastic optimization; discrete optimization; convex optimization

APA (6th Edition):

Linhares Rodrigues, A. (2019). Approximation Algorithms for Distributionally Robust Stochastic Optimization. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/14639

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Linhares Rodrigues, Andre. “Approximation Algorithms for Distributionally Robust Stochastic Optimization.” 2019. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/14639.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Linhares Rodrigues, Andre. “Approximation Algorithms for Distributionally Robust Stochastic Optimization.” 2019. Web. 04 Mar 2021.

Vancouver:

Linhares Rodrigues A. Approximation Algorithms for Distributionally Robust Stochastic Optimization. [Internet] [Thesis]. University of Waterloo; 2019. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/14639.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Linhares Rodrigues A. Approximation Algorithms for Distributionally Robust Stochastic Optimization. [Thesis]. University of Waterloo; 2019. Available from: http://hdl.handle.net/10012/14639

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Loughborough University

26. Rossetti, Gaia. Mathematical optimization techniques for cognitive radar networks.

Degree: PhD, 2018, Loughborough University

 This thesis discusses mathematical optimization techniques for waveform design in cognitive radars. These techniques have been designed with an increasing level of sophistication, starting from… (more)

Subjects/Keywords: 621.3848; Waveform optimization; Convex optimization; Robust optimization; Cognitive radars

APA (6th Edition):

Rossetti, G. (2018). Mathematical optimization techniques for cognitive radar networks. (Doctoral Dissertation). Loughborough University. Retrieved from http://hdl.handle.net/2134/33419

Chicago Manual of Style (16th Edition):

Rossetti, Gaia. “Mathematical optimization techniques for cognitive radar networks.” 2018. Doctoral Dissertation, Loughborough University. Accessed March 04, 2021. http://hdl.handle.net/2134/33419.

MLA Handbook (7th Edition):

Rossetti, Gaia. “Mathematical optimization techniques for cognitive radar networks.” 2018. Web. 04 Mar 2021.

Vancouver:

Rossetti G. Mathematical optimization techniques for cognitive radar networks. [Internet] [Doctoral dissertation]. Loughborough University; 2018. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/2134/33419.

Council of Science Editors:

Rossetti G. Mathematical optimization techniques for cognitive radar networks. [Doctoral Dissertation]. Loughborough University; 2018. Available from: http://hdl.handle.net/2134/33419


Penn State University

27. Jalilzadeh, Afrooz. Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems.

Degree: 2020, Penn State University

Optimization problems with expectation-valued objectives are complicated by the fact that neither the expected objective nor its gradient can be evaluated in… (more)
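
To make the stated difficulty concrete, the snippet below sketches the variance-reduction idea (an SVRG-style loop) on a finite-sum least-squares objective: the full gradient is computed only at periodic snapshots and is used to correct cheap single-sample gradients in between. The data, step size, and epoch length are illustrative assumptions, and this is standard SVRG rather than the methods developed in the thesis.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

def grad_i(w, i):
    # Gradient of the i-th component f_i(w) = (1/2) * (A[i] @ w - b[i])**2.
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    # Gradient of F(w) = (1/n) * sum_i f_i(w).
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
eta, epochs, m = 0.005, 30, n           # step size and inner-loop length (assumed)
for _ in range(epochs):
    w_snap = w.copy()
    mu = full_grad(w_snap)              # one full-gradient evaluation per epoch (the snapshot)
    for _ in range(m):
        i = rng.integers(n)
        # Unbiased, variance-reduced estimate of the full gradient at w.
        g = grad_i(w, i) - grad_i(w_snap, i) + mu
        w = w - eta * g
print(np.linalg.norm(full_grad(w)))     # should be close to zero after training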

Subjects/Keywords: Stochastic Optimization; Convex Optimization; Stochastic Approximation; Nonsmooth Optimization; Stochastic Variational Inequality

APA (6th Edition):

Jalilzadeh, A. (2020). Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/17968azj5286

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Jalilzadeh, Afrooz. “Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems.” 2020. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/17968azj5286.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Jalilzadeh, Afrooz. “Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems.” 2020. Web. 04 Mar 2021.

Vancouver:

Jalilzadeh A. Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. [Internet] [Thesis]. Penn State University; 2020. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/17968azj5286.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Jalilzadeh A. Variance-reduced First-Order Methods for Convex Stochastic Optimization and Monotone Stochastic Variational Inequality Problems. [Thesis]. Penn State University; 2020. Available from: https://submit-etda.libraries.psu.edu/catalog/17968azj5286

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Texas A&M University

28. Fan, Siqi. Learning Gaussian Latent Graphical Models Via Partial Convex Optimization.

Degree: MS, Electrical Engineering, 2019, Texas A&M University

 Latent Gaussian graphical models are very useful in probabilistic modeling to measure the statistical relationships between different variables and present them in the form of… (more)

Subjects/Keywords: Gaussian latent graphical models; Convex optimization; Chow-Liu algorithm; CL Recursive Grouping; Partial convex optimization.

APA (6th Edition):

Fan, S. (2019). Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. (Masters Thesis). Texas A&M University. Retrieved from http://hdl.handle.net/1969.1/188729

Chicago Manual of Style (16th Edition):

Fan, Siqi. “Learning Gaussian Latent Graphical Models Via Partial Convex Optimization.” 2019. Masters Thesis, Texas A&M University. Accessed March 04, 2021. http://hdl.handle.net/1969.1/188729.

MLA Handbook (7th Edition):

Fan, Siqi. “Learning Gaussian Latent Graphical Models Via Partial Convex Optimization.” 2019. Web. 04 Mar 2021.

Vancouver:

Fan S. Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. [Internet] [Masters thesis]. Texas A&M University; 2019. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/1969.1/188729.

Council of Science Editors:

Fan S. Learning Gaussian Latent Graphical Models Via Partial Convex Optimization. [Masters Thesis]. Texas A&M University; 2019. Available from: http://hdl.handle.net/1969.1/188729


University of Waterloo

29. Sremac, Stefan. Error Bounds and Singularity Degree in Semidefinite Programming.

Degree: 2020, University of Waterloo

 An important process in optimization is to determine the quality of a proposed solution. This usually entails calculation of the distance of a proposed solution… (more)

Subjects/Keywords: semidefinite programming; optimization; error bounds; singularity degree; mathematical programming; convex optimization; conic optimization; Semidefinite programming; Combinatorial optimization; Programming (Mathematics); Convex programming

APA (6th Edition):

Sremac, S. (2020). Error Bounds and Singularity Degree in Semidefinite Programming. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/15583

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sremac, Stefan. “Error Bounds and Singularity Degree in Semidefinite Programming.” 2020. Thesis, University of Waterloo. Accessed March 04, 2021. http://hdl.handle.net/10012/15583.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sremac, Stefan. “Error Bounds and Singularity Degree in Semidefinite Programming.” 2020. Web. 04 Mar 2021.

Vancouver:

Sremac S. Error Bounds and Singularity Degree in Semidefinite Programming. [Internet] [Thesis]. University of Waterloo; 2020. [cited 2021 Mar 04]. Available from: http://hdl.handle.net/10012/15583.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sremac S. Error Bounds and Singularity Degree in Semidefinite Programming. [Thesis]. University of Waterloo; 2020. Available from: http://hdl.handle.net/10012/15583

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Penn State University

30. Kang, Bosung. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).

Degree: 2015, Penn State University

 Estimating the disturbance or clutter covariance is a centrally important problem in radar space time adaptive processing (STAP) since estimation of the disturbance or interference… (more)

Subjects/Keywords: convex optimization; STAP; radar signal processing; constrained optimization; detection and estimation

APA (6th Edition):

Kang, B. (2015). Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). (Thesis). Penn State University. Retrieved from https://submit-etda.libraries.psu.edu/catalog/26539

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Kang, Bosung. “Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).” 2015. Thesis, Penn State University. Accessed March 04, 2021. https://submit-etda.libraries.psu.edu/catalog/26539.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Kang, Bosung. “Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP).” 2015. Web. 04 Mar 2021.

Vancouver:

Kang B. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). [Internet] [Thesis]. Penn State University; 2015. [cited 2021 Mar 04]. Available from: https://submit-etda.libraries.psu.edu/catalog/26539.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Kang B. Robust Covariance Matrix Estimation for Radar Space-Time Adaptive Processing (STAP). [Thesis]. Penn State University; 2015. Available from: https://submit-etda.libraries.psu.edu/catalog/26539

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
