## COMPUTATIONAL SCIENCES LECTURE SERIES (CSLS)

Third meeting in the CSLS series:

Workshop on
Optimization of Eigenvalues

Abstracts

Stephen Boyd
Stanford University

"Semidefinite Programming"

In semidefinite programming (SDP) a linear function is minimized subject to the constraint that the eigenvalues of a symmetric matrix are nonnegative. While such problems were studied in a few papers in the 1970s, the relatively recent development of efficient interior-point algorithms for SDP has spurred research in a wide variety of application fields, including control system analysis and synthesis, combinatorial optimization, circuit design, structural optimization, finance, and statistics. In this overview talk I will cover the basic properties of SDP, survey some applications, and give a brief description of interior-point methods for their solution.
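The constraint described above, that all eigenvalues of a symmetric matrix be nonnegative, is exactly positive semidefiniteness. A minimal sketch of checking it numerically (assuming NumPy; the matrices are arbitrary illustrations, not from the talk):

```python
import numpy as np

def is_psd(X, tol=1e-9):
    # The SDP constraint "X is positive semidefinite" means every
    # eigenvalue of the symmetric matrix X is nonnegative.
    # eigvalsh is the eigenvalue routine for symmetric/Hermitian matrices.
    return bool(np.all(np.linalg.eigvalsh(X) >= -tol))

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3 -> feasible
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])   # eigenvalues -1 and 3 -> infeasible
```

In an SDP, a linear objective is minimized over matrices satisfying this constraint (together with linear equality constraints); interior-point solvers exploit the smooth barrier $-\log\det X$ for the same feasible set.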

Adrian Lewis
Cornell University

"Eigenvalue Optimization: Symmetric versus Nonsymmetric Matrices"

Including joint work with James Burke and Michael Overton

The eigenvalues of a symmetric matrix are Lipschitz functions with elegant convexity properties, amenable to efficient interior-point optimization algorithms. By contrast, for example, the spectral radius of a nonsymmetric matrix is neither convex nor Lipschitz. It may indicate practical behaviour much less reliably than in the symmetric case, and is more challenging for numerical optimization (see Overton's talk). Nonetheless, this function does share several significant variational-analytic properties with its symmetric counterpart. I will outline these analogies, discuss the fundamental idea of Clarke regularity, highlight its usefulness in nonsmooth chain rules, and discuss robust regularizations of functions like the spectral radius.
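A standard two-by-two example makes the failure of convexity concrete (a sketch assuming NumPy; the matrices are a textbook illustration, not taken from the talk):

```python
import numpy as np

def spectral_radius(M):
    # maximum modulus of the eigenvalues
    return max(abs(np.linalg.eigvals(M)))

# Two nilpotent (nonsymmetric) matrices: each has spectral radius 0.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

mid = 0.5 * (A + B)   # eigenvalues +/- 0.5, so spectral radius 0.5
# Convexity would require spectral_radius(mid) <= 0.5 * (0 + 0) = 0,
# but spectral_radius(mid) is 0.5: the spectral radius is not convex.
```

For symmetric matrices this cannot happen: there the spectral radius is the operator 2-norm, which is convex.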

Michael Overton
Courant Institute of Mathematical Sciences
New York University

"Local Optimization of Stability Functions in Theory and Practice"

Joint work with James V. Burke and Adrian S. Lewis

Stability measures arising in systems and control are typically nonsmooth, nonconvex functions. The simplest examples are the abscissa and radius maps for polynomials (maximum real part, or modulus, of the roots) and the analogous matrix measures, the spectral abscissa and radius (maximum real part, or modulus, of the eigenvalues). More robust measures include the distance to instability (smallest perturbation that makes a polynomial or matrix unstable) and the $\epsilon$-pseudospectral abscissa or radius of a matrix (maximum real part or modulus of the $\epsilon$-pseudospectrum). When polynomials or matrices depend on parameters it is natural to consider optimization of such functions. We discuss an algorithm for locally optimizing such nonsmooth, nonconvex functions over parameter space and illustrate its effectiveness, computing, for example, locally optimal low-order controllers for challenging problems from the literature.
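The basic matrix measures above can be sketched directly, and random perturbations give a crude lower bound on the $\epsilon$-pseudospectral abscissa (assuming NumPy; the matrix, $\epsilon$, and sample count are illustrative choices, and this naive sampling is not the algorithm discussed in the talk):

```python
import numpy as np

def spectral_abscissa(M):
    # maximum real part of the eigenvalues
    return max(np.linalg.eigvals(M).real)

# Triangular example: eigenvalues are -1 and -2, so the spectral
# abscissa is negative and x' = Mx is asymptotically stable.
M = np.array([[-1.0, 10.0],
              [ 0.0, -2.0]])

# Crude lower bound on the eps-pseudospectral abscissa: the largest
# spectral abscissa seen over random perturbations of spectral norm eps.
rng = np.random.default_rng(0)
eps = 0.5
samples = []
for _ in range(200):
    E = rng.standard_normal((2, 2))
    E *= eps / np.linalg.norm(E, 2)   # scale to spectral norm exactly eps
    samples.append(spectral_abscissa(M + E))
pseudo_abscissa_lower_bound = max(samples)
```

The large off-diagonal entry makes the eigenvalues sensitive, so the sampled bound exceeds the nominal spectral abscissa by far more than $\epsilon$: a stable matrix can sit close to instability, which is what the robust measures in the abstract quantify.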

We also give an overview of variational analysis of stability functions in polynomial and matrix space, expanding on some of the issues discussed in Lewis's talk.