Optimization algorithms work to minimize (or maximize) an objective function subject
to constraints on design variables and responses.
The optimization to be solved is expressed as follows:
Minimize:
$$f(x) \qquad \text{(objective function)}$$
Subject to:
$$g(x) \le 0 \qquad \text{(inequality constraints)}$$
$$h(x) = 0 \qquad \text{(equality constraints)}$$
$$x_L \le x \le x_U \qquad \text{(design limits)}$$
The functions $f(x)$, $g(x)$, and $h(x)$ are assumed to have the form:
$$\psi(x) = \psi_0 + \int_{t_0}^{t_f} F(x, y, t)\, dt$$
In this formulation:
- $x$ is an n-dimensional vector of real-valued design variables
- $x_L$ and $x_U$ are the lower and upper bounds, respectively, on the design variables
- $y$ is the set of states that the solver uses to represent the system
- $\psi_0$ is the value of the function from a previous simulation. For the first simulation, it is always zero.
The goal of the optimization effort is to minimize the objective function $f(x)$ while satisfying the constraints $g(x)$ and $h(x)$. Constraints are assumed to be inherently nonlinear. They can be either inequality or equality constraints. $x_L$ and $x_U$ define the lower and upper bounds for the elements of $x$. The set of all allowable values of $x$ is known as the design space for the problem. A design point or a sample point is a particular set of values within the design space.
A design point is said to be feasible if and only if it satisfies all the
constraints. Correspondingly, a design point is said to be infeasible if it violates
one or more of the constraints. Our aim, of course, is to find a feasible design.
Sometimes, due to the presence of constraints, this may not be possible.
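As an illustration only, the following sketch sets up a small two-variable instance of this formulation with SciPy's SLSQP solver; the objective, constraints, bounds, and the `is_feasible` helper are hypothetical and exist purely for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-variable problem, chosen only to illustrate the formulation:
#   minimize    f(x) = (x1 - 1)^2 + (x2 - 2)^2     (objective function)
#   subject to  g(x) = x1 + x2 - 2 <= 0            (inequality constraint)
#               h(x) = x1 - x2      = 0            (equality constraint)
#               0 <= x1, x2 <= 5                   (design limits)

def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def g(x):
    return x[0] + x[1] - 2.0   # feasible when g(x) <= 0

def h(x):
    return x[0] - x[1]         # feasible when h(x) = 0

def is_feasible(x, tol=1e-6):
    """A design point is feasible only if it satisfies all constraints and bounds."""
    in_bounds = np.all(x >= 0.0) and np.all(x <= 5.0)
    return bool(in_bounds) and g(x) <= tol and abs(h(x)) <= tol

x0 = np.array([2.0, 0.5])  # initial design point (a sample point in the design space)

result = minimize(
    f, x0, method="SLSQP",
    bounds=[(0.0, 5.0), (0.0, 5.0)],                        # x_L <= x <= x_U
    constraints=[{"type": "ineq", "fun": lambda x: -g(x)},  # SciPy's "ineq" means fun(x) >= 0
                 {"type": "eq",   "fun": h}],
)

print("optimum:", result.x, "feasible:", is_feasible(result.x))
```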
Many different methods are available for solving the above-mentioned optimization problem. All of these iterate on the design $x$ in some manner to find a better solution. A generic algorithm that describes this process is:
- An initial value for the design variables, $x_0$, is provided to the optimizer.
- The response quantities $f(x)$, $g(x)$, and $h(x)$ are computed by running a simulation.
- Some algorithm, sometimes a sensitivity-based method, is applied to generate a new $x$ that will either reduce the objective function, reduce the amount of infeasibility, or both.
- When a sensitivity-based method is used, the optimizer also needs to compute the sensitivity of the functions $f$, $g$, and $h$ with respect to the design $x$. This means the optimizer requires the matrix of partial derivatives, $\partial f/\partial x$, $\partial g/\partial x$, and $\partial h/\partial x$.
- In an iterative fashion, new designs ($x_k$) are generated by the optimizer until a determination is made that the optimizer has found a minimum or the iteration limits have been exceeded. A minimal sketch of such a loop follows this list.
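The sketch below illustrates this generic loop under simplifying assumptions: the design is unconstrained, `run_simulation` is a stand-in analytic response rather than a real solver run, and the sensitivities are approximated by forward finite differences rather than computed by the solver.

```python
import numpy as np

def run_simulation(x):
    """Placeholder for the solver run that returns the response f(x);
    a simple analytic function stands in for the simulation here."""
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def sensitivities(fun, x, eps=1e-6):
    """Approximate the partial derivatives df/dx by forward finite differences."""
    f0 = fun(x)
    grad = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        grad[i] = (fun(xp) - f0) / eps
    return grad

def optimize(fun, x0, step=0.1, tol=1e-6, max_iter=100):
    x = np.asarray(x0, dtype=float)      # 1. initial design x_0 supplied to the optimizer
    for k in range(max_iter):
        f_val = fun(x)                   # 2. responses computed by running a "simulation"
        grad = sensitivities(fun, x)     # 3. sensitivities df/dx for the update
        x_new = x - step * grad          # 4. new design intended to reduce the objective
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k + 1          # 5. stop when a minimum is detected ...
        x = x_new
    return x, max_iter                   # ... or when the iteration limit is exceeded

x_opt, iters = optimize(run_simulation, [5.0, 5.0])
print("design:", x_opt, "iterations:", iters)
```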
In some instances you may want to maximize the value of a certain objective. Without any loss of generality, you can convert it to a minimization problem by simply negating the objective you calculate. Thus, if you want to maximize a function $f(x)$, you can convert it to a minimization problem by defining the cost function as $-f(x)$.
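As a small illustration (with hypothetical function names), negating the objective converts a maximization into the minimization form used above:

```python
def maximize_as_minimize(objective):
    """Wrap a to-be-maximized objective as a cost to be minimized by negating it."""
    return lambda x: -objective(x)

# Example: maximizing f(x) = -(x - 3)^2 (peak at x = 3) is equivalent to
# minimizing cost(x) = (x - 3)^2, whose minimum is also at x = 3.
f = lambda x: -(x - 3.0) ** 2
cost = maximize_as_minimize(f)
```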