Koç University Mathematics Department Seminars

Superlinear convergence in modern convex optimization
Levent Tunçel
University of Waterloo, Canada
Abstract : We propose new algorithms for solving convex optimization problems. The design and analysis of the algorithms hinge on key properties of a special class of very smooth, strictly convex barrier functions. Although our analysis has both primal and dual components, the algorithms work with dual iterates only, in the dual space. Our algorithms converge globally at the same worst-case rate as the current best polynomial-time interior-point methods. In addition, they converge locally superlinearly under mild assumptions, which hold in particular for a very general class of convex optimization problems defined via hyperbolic polynomials. Moreover, the algorithms are driven by an easily computable gradient proximity measure, which, under mild assumptions, automatically turns the global linear rate of convergence into a locally superlinear one. This talk is based on joint work with Yu. Nesterov.
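
For context, here is a minimal textbook illustration of the two ingredients named in the abstract; it is not necessarily the specific barrier class or proximity measure studied in the talk. The logarithmic barrier
\[
F(x) \;=\; -\sum_{i=1}^{n} \ln x_i , \qquad x \in \mathbb{R}^{n}_{++},
\]
is the canonical self-concordant barrier for the nonnegative orthant (barrier parameter \(\nu = n\)). For a smooth, strictly convex function \(f\), the Newton decrement
\[
\lambda_f(x) \;=\; \Bigl( \nabla f(x)^{\top} \bigl[ \nabla^{2} f(x) \bigr]^{-1} \nabla f(x) \Bigr)^{1/2}
\]
is a classical, easily computable gradient-based proximity measure: interior-point analyses typically test such a quantity against a threshold to decide when the globally (linearly) convergent phase may switch to the fast local phase.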
  Date : 19.10.2015
  Time : 17:00
  Place : ENG B05
  Language : English