You searched for +publisher:"Georgia Tech" +contributor:("Anatoli Jouditski"). One record found.


1. Lan, Guanghui. Convex optimization under inexact first-order information.

Degree: PhD, Industrial and Systems Engineering, 2009, Georgia Tech

In this thesis we investigate the design and complexity analysis of algorithms for solving convex programming problems under inexact first-order information. In the first part of the thesis we focus on general non-smooth convex minimization under a stochastic oracle. We begin with an important algorithmic advance in this area, namely the mirror descent stochastic approximation algorithm; our main contribution here is a validation procedure for this algorithm as applied to stochastic programming. In the second part of the thesis we consider Stochastic Composite Optimization (SCO), which covers smooth, non-smooth, and stochastic convex optimization as special cases. Notably, no optimization algorithm achieving the lower complexity bound for this class had previously been developed. Our contributions on this topic are as follows. First, through a novel analysis, we demonstrate that the simple RM-SA algorithm applied to these problems exhibits the best rate of convergence known so far. Moreover, by adapting Nesterov's optimal method, we propose an accelerated SA that achieves, uniformly in dimension, the theoretically optimal rate of convergence for this class of problems. Finally, we illustrate the significant advantages of the accelerated SA over existing algorithms in the context of solving a class of stochastic programming problems. In the last part of this work, we turn to certain deterministic optimization techniques that operate on approximate first-order information for the dual problem. In particular, we establish, for the first time in the literature, the iteration complexity of inexact augmented Lagrangian (I-AL) methods applied to a special class of convex programming problems. Advisors/Committee Members: Arkadi Nemirovski (Committee Chair), Alexander Shapiro (Committee Co-Chair), Renato D. C. Monteiro (Committee Co-Chair), Anatoli Jouditski (Committee Member), Shabbir Ahmed (Committee Member).
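The mirror descent stochastic approximation scheme mentioned in the abstract can be sketched in a few lines. The following is an illustrative toy example, not the thesis's own code: it assumes minimization of an expected linear loss over the probability simplex with the entropy prox-function (under which the prox-mapping reduces to a multiplicative-weights update), and the oracle name and step size are hypothetical choices.

```python
import numpy as np

def mirror_descent_sa(grad_oracle, n, steps, gamma):
    """Mirror descent stochastic approximation over the probability simplex.

    With the entropy prox-function, each prox step is a multiplicative
    update followed by renormalization. Returns the averaged iterate,
    as is standard in robust SA analyses.
    """
    x = np.full(n, 1.0 / n)          # start at the simplex center
    avg = np.zeros(n)
    for _ in range(steps):
        g = grad_oracle(x)           # unbiased estimate of a subgradient
        x = x * np.exp(-gamma * g)   # entropic prox step
        x /= x.sum()                 # renormalize onto the simplex
        avg += x
    return avg / steps               # averaging yields the SA guarantee

# Toy stochastic oracle: minimize E[<c + noise, x>] over the simplex;
# the minimizer concentrates on the smallest coordinate of c.
rng = np.random.default_rng(0)
c = np.array([0.3, 0.1, 0.6])
oracle = lambda x: c + 0.1 * rng.standard_normal(3)
x_bar = mirror_descent_sa(oracle, n=3, steps=2000, gamma=0.1)
```

Here the averaged iterate `x_bar` places most of its mass on coordinate 1, the minimizer of the expected loss; the accelerated SA of the thesis improves on this basic scheme by incorporating Nesterov-style momentum.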

Subjects/Keywords: Convex optimization; Stochastic programming; First-order methods; Uncertainty; Mathematical optimization; Convex functions; First-order logic


APA (6th Edition):

Lan, G. (2009). Convex optimization under inexact first-order information. (Doctoral Dissertation). Georgia Tech. Retrieved from

Chicago Manual of Style (16th Edition):

Lan, Guanghui. “Convex optimization under inexact first-order information.” 2009. Doctoral Dissertation, Georgia Tech. Accessed November 16, 2019.

MLA Handbook (7th Edition):

Lan, Guanghui. “Convex optimization under inexact first-order information.” 2009. Web. 16 Nov 2019.


Vancouver:

Lan G. Convex optimization under inexact first-order information. [Internet] [Doctoral dissertation]. Georgia Tech; 2009. [cited 2019 Nov 16]. Available from:

Council of Science Editors:

Lan G. Convex optimization under inexact first-order information. [Doctoral Dissertation]. Georgia Tech; 2009. Available from: