DEPARTMENT OF MATHEMATICS (UW)
UNIVERSITY OF WASHINGTON
Abstracts of Invited Talks
- John Dennis (Rice University and University of Washington)
Optimization using Surrogates for Engineering Design
This talk will outline the surrogate management framework
for general nonlinear programming without derivatives. The point of this
talk is to show young optimizers how useful their work can be.
This line of research was motivated by
industrial applications, indeed, by a question I was asked by Paul
Frank of Boeing Phantom Works. His group was often asked for help
in dealing with very expensive, low-dimensional design problems
from all around the company. Everyone there was dissatisfied with
the common practice of substituting
inexpensive surrogates for the expensive "true" objective and
constraint functions in the optimal design formulation. We had been asked
this question before, but this time the ideas behind
the surrogate management framework occurred to us, and we hope to
demonstrate in this
talk just how simple that answer is.
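The substitution the abstract criticizes, and then builds on, can be illustrated with a minimal sketch. This is not the surrogate management framework itself (whose search-and-poll machinery is more involved); the function names, the quadratic model, and the one-dimensional toy objective are all illustrative assumptions. The loop fits a cheap model to a handful of expensive evaluations, minimizes the model, evaluates the true function at the model's minimizer, and refits:

```python
import numpy as np

def expensive_f(x):
    # Stand-in for a costly simulation; any smooth 1-D function works here.
    return (x - 0.7) ** 2 + 0.1 * np.sin(5 * x)

def surrogate_minimize(f, lo=0.0, hi=2.0, rounds=5, n_init=4):
    """Hypothetical sketch: a quadratic-surrogate loop over [lo, hi]."""
    xs = list(np.linspace(lo, hi, n_init))
    ys = [f(x) for x in xs]
    for _ in range(rounds):
        # Fit a cheap quadratic surrogate to all samples so far.
        a, b, c = np.polyfit(xs, ys, 2)
        # Minimize the surrogate (vertex of the parabola), clipped to bounds.
        x_new = np.clip(-b / (2 * a) if a > 0 else (lo + hi) / 2, lo, hi)
        xs.append(float(x_new))
        ys.append(f(x_new))          # one expensive evaluation per round
    return xs[int(np.argmin(ys))]

x_best = surrogate_minimize(expensive_f)
```

Each round spends exactly one expensive evaluation, which is the point of the exercise: the optimization effort is shifted onto the inexpensive surrogate.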
- Michael P. Friedlander (University of British Columbia)
MINOS and Knossos with Second Derivatives
For problems with nonlinear constraints, Knossos currently uses MINOS
or SNOPT to solve a sequence of linearly constrained subproblems. The
convergence theory is largely independent of how the subproblems are
solved. We note that MINOS can be developed to use second derivatives
of the objective function (using a straightforward modified-Newton
method on each reduced Hessian). By implementing such a version of
MINOS, we obtain a second-derivative version of Knossos for general
NLPs. (Joint work with Michael Saunders.)
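The reduced-Hessian idea mentioned above can be sketched on a toy linearly constrained problem. This is not MINOS; the SVD null-space basis, the eigenvalue-shift modification, and the equality-constrained quadratic below are simplifying assumptions used only to show the shape of the subproblem:

```python
import numpy as np

def modified_newton_step(grad, hess, Z, tau=1e-8):
    """One modified-Newton step in the null space of the linear
    constraints (Z spans that null space, as in a reduced-gradient
    method); the reduced Hessian is shifted if it is not positive
    definite, so the step is always a descent direction."""
    g_r = Z.T @ grad          # reduced gradient
    H_r = Z.T @ hess @ Z      # reduced Hessian
    lam_min = np.linalg.eigvalsh(H_r).min()
    if lam_min <= tau:
        H_r = H_r + (tau - lam_min) * np.eye(H_r.shape[0])
    p_r = np.linalg.solve(H_r, -g_r)   # reduced Newton direction
    return Z @ p_r                     # full-space step, stays on Ax = b

# Toy problem: minimize 0.5 x'Qx - c'x  subject to  x1 + x2 + x3 = 1.
Q = np.diag([1.0, 2.0, 3.0])
c = np.array([1.0, 1.0, 1.0])
A = np.array([[1.0, 1.0, 1.0]])
Z = np.linalg.svd(A)[2][1:].T          # null-space basis of A

x = np.array([1.0, 0.0, 0.0])          # feasible starting point
for _ in range(10):
    grad = Q @ x - c
    x = x + modified_newton_step(grad, Q, Z)
```

Because the toy objective is a convex quadratic, the very first reduced-Newton step lands on the constrained minimizer and later steps are zero; on a general NLP the modification of the reduced Hessian is what keeps the method well defined away from the solution.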
- Terry Rockafellar (University of Washington)
Convex Analysis in Finance
Although convex analysis has come to be widely appreciated
for its importance in understanding and solving problems of
optimization, its applications in finance are relatively
new. In truth, many of the researchers who work on problems
about "portfolios", "insurance", and other such matters have
little background even in optimization, not to speak of
convexity. Nonetheless, significant inroads are being made
in dealing with notions of risk and other aspects of
uncertainty in the outcome of investments. This talk will
explain the basic framework and the reasons why convex
analysis is needed for good progress in this field.
- Yanfang Shen (University of Washington)
Annealing Adaptive Search with Hit-and-Run Sampling Methods for Global Optimization
Stochastic algorithms, such as simulated annealing and genetic algorithms,
have been widely applied to solve global optimization problems.
To understand the behavior of simulated annealing, the theoretical
performance of annealing adaptive search (AAS) is analyzed.
We show that for a large class of continuous/discrete global optimization problems,
the expected number of improving points generated by AAS grows linearly in dimension,
and the expected number of function evaluations can also be linear when our
adaptive cooling schedule is employed. This eliminates the need to heuristically
choose a cooling schedule for simulated annealing. AAS assumes points can be
exactly sampled according to a sequence of Boltzmann distributions.
A Markov chain Monte Carlo (MCMC) sampler is used to implement AAS and
performance bounds are derived in terms of the choice of cooling schedule
and the rate of convergence of the MCMC sampler to a Boltzmann distribution.
We develop and analyze the performance of several MCMC samplers, based on
Hit-and-Run, that can be applied to discrete and mixed continuous/discrete domains.
We conclude by embedding the family of Hit-and-Run samplers into the AAS
framework to provide robust global optimization algorithms.
Numerical results are presented. (Joint work with Zelda Zabinsky.)
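A minimal Hit-and-Run step on a continuous box domain might look like the sketch below. This is an illustrative assumption, not one of the samplers analyzed in the talk; targeting a Boltzmann distribution, as AAS requires, would add a Metropolis accept/reject step on top of this uniform proposal. The move picks a random direction, finds the chord where the line through the current point meets the box, and samples uniformly along that chord:

```python
import numpy as np

rng = np.random.default_rng(0)

def hit_and_run_step(x, lo, hi):
    """One Hit-and-Run move on the box [lo, hi]^d."""
    d = rng.normal(size=x.size)        # uniformly random direction
    d /= np.linalg.norm(d)
    # Chord: the largest interval [t_min, t_max] with x + t*d in the box.
    with np.errstate(divide="ignore"):
        t_lo = (lo - x) / d
        t_hi = (hi - x) / d
    t_min = np.max(np.minimum(t_lo, t_hi))
    t_max = np.min(np.maximum(t_lo, t_hi))
    # Sample the next point uniformly on the chord.
    return x + rng.uniform(t_min, t_max) * d

x = np.full(3, 0.5)                    # start at the centre of [0, 1]^3
samples = []
for _ in range(2000):
    x = hit_and_run_step(x, 0.0, 1.0)
    samples.append(x)
```

Every iterate stays inside the box by construction, and for a uniform target no accept/reject step is needed; this is the property that makes Hit-and-Run attractive as the inner MCMC sampler.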
- Jane Ye (University of Victoria)
Quasiconvex Programming with Locally Starshaped Constraint Region
and Applications to Quasiconvex MPEC
A quasiconvex programming problem is a mathematical programming problem where the objective function is quasiconvex. We derive some
necessary and sufficient conditions for a quasiconvex programming
problem with a locally starshaped constraint region. Our optimality
conditions are different from the usual optimality conditions in
that the limiting subdifferential of the objective function is replaced by a
normal cone operator. Such an optimality condition has an advantage over the
usual one in that it remains sufficient even when the objective function is
only quasiconvex but not pseudoconvex. As a special case we derive the
corresponding results for the class of Quasiconvex-Quasiaffine
MPECs, a class of mathematical programs with complementarity
constraints where the objective function is quasiconvex, the
inequality constraints are quasiconvex, and the remaining constraints
are quasiaffine.