First-order Methods for Constrained Continuous Optimization, Haihao Lu, The University of Chicago Booth School of Business

27 March

Continuous constrained optimization (CCO), including linear programming (LP), quadratic programming (QP), second-order cone programming (SOCP), semi-definite programming (SDP), and nonlinear programming (NLP), is a fundamental tool in operations research with wide applications in practice.
The state-of-the-art solvers for CCO are mature and reliable at delivering accurate solutions. However, these methods do not scale with modern computational resources and are thus unsuitable for big-data applications. Their computational bottleneck is matrix factorization, which usually requires significant memory and does not map well onto modern computing hardware. In contrast, first-order methods (FOMs) require only matrix-vector multiplications, which run efficiently on these modern computing infrastructures and have massively accelerated machine learning training over the last 15
years. This ongoing line of research aims to scale up CCO by a factor of 1000 and speed it up by 10-100 times using FOMs and modern computational resources, i.e., distributed computing and/or GPUs. Using LP as an example, I'll discuss how we can achieve this by explaining: (i) the intuition behind designing FOMs for LP; (ii) theoretical results, including complexity theory, condition number theory, infeasibility detection, and how theory can lead to better computation and a better understanding of the algorithm's performance; (iii) numerical results of the proposed algorithm on large instances and modern computing architectures; (iv) large-scale applications. If time permits, I'll also talk about our recent results on QP. I'll conclude the talk with open questions and new directions in this line of research.
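As background for the kind of FOM the abstract alludes to: primal-dual hybrid gradient (PDHG), the method underlying modern FOM-based LP solvers, uses only matrix-vector products with A and its transpose. Below is a minimal illustrative sketch of PDHG applied to an LP in standard form (min c'x s.t. Ax = b, x >= 0); the function name, step-size rule, and the tiny test instance are assumptions for illustration, not the speaker's actual implementation.

```python
import numpy as np

def pdhg_lp(c, A, b, iters=5000):
    """Toy PDHG sketch for min c@x s.t. A@x = b, x >= 0.

    Each iteration costs only matrix-vector products with A and A.T,
    plus a cheap projection onto the nonnegative orthant; no
    factorization of A is ever formed.
    """
    m, n = A.shape
    # Step sizes satisfying tau * sigma * ||A||_2^2 < 1, a standard
    # sufficient condition for PDHG convergence.
    step = 0.9 / np.linalg.norm(A, 2)
    tau, sigma = step, step
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        # Primal step: gradient step on c - A.T @ y, then project onto x >= 0.
        x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
        # Dual step uses the extrapolated primal iterate 2*x_new - x.
        y = y + sigma * (b - A @ (2 * x_new - x))
        x = x_new
    return x, y

# Tiny hypothetical instance: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0.
# The optimum is x = (1, 0) with objective value 1.
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, y = pdhg_lp(c, A, b)
```

Since the per-iteration work is dominated by the two matrix-vector products, each iteration parallelizes naturally on GPUs and in distributed memory, which is exactly the property the abstract contrasts with factorization-based solvers.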

Register here