# Objectives

Objectives are metrics to be minimized or maximized in an optimization exploration. Minimizing mass to find a lightweight design is a common example.

## Single and Multi-Objective Optimization Problems

Optimization problems are classified as single- or multi-objective depending on how many objectives are considered. A multi-objective optimization (MOO) problem is formulated as:
$\mathrm{min}f\left(x\right)=\left\{{f}_{1}\left(x\right),{f}_{2}\left(x\right),...,{f}_{n}\left(x\right)\right\}$

such that $g_j\left(x\right)\le 0$, $j=1,...,m$

When dealing with multiple objectives $\left({f}_{1},{f}_{2},...\right)$, it is unlikely that one design minimizes all objectives simultaneously. As a result, MOO applications search for an optimal Pareto front instead of a single optimal design. The optimal Pareto front is the collection of non-dominated designs: a design is non-dominated if no other design is at least as good in every objective and strictly better in at least one.
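The non-dominance test can be sketched in a few lines of Python (illustrative code only, not HyperStudy functionality; the candidate designs below are made up):

```python
# Identify the non-dominated designs among candidate points for a
# two-objective minimization problem.
def is_dominated(p, others):
    """p is dominated if some other design is no worse in every
    objective and strictly better in at least one."""
    return any(
        all(q_i <= p_i for q_i, p_i in zip(q, p)) and
        any(q_i < p_i for q_i, p_i in zip(q, p))
        for q in others if q != p
    )

designs = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]  # (f1, f2) values
pareto_front = [p for p in designs if not is_dominated(p, designs)]
print(pareto_front)  # (3.0, 3.0) is dominated by (2.0, 2.0)
```

Here `(3.0, 3.0)` is excluded because `(2.0, 2.0)` is better in both objectives; the remaining three designs form the Pareto front.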

The computational effort required to solve MOO problems is usually significantly greater than for single-objective optimization problems. In cases where solving the MOO problem directly is prohibitive, it can be converted to a single-objective problem by combining the objectives into a weighted sum (the Weighted Sum Method).

When dealing with probabilistic variables, the objective function also has an associated distribution. In robust optimization, the objective is therefore not the deterministic value but the value of the objective's distribution at a specified level of its cumulative distribution function (CDF). A minimization problem typically uses the 95% CDF value (the default), and a maximization problem the 5% CDF value.
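A minimal sketch of evaluating such a percentile-based robust objective, assuming a normally distributed input variable and a simple quadratic response (all names and values here are illustrative, not HyperStudy internals):

```python
import random

random.seed(0)  # reproducible sampling

def f(x):
    return (x - 2.0) ** 2  # deterministic objective value

def robust_objective(x_nominal, sigma=0.1, n=2000, cdf_level=0.95):
    # Propagate the input scatter through f and read off the value of
    # the output distribution at the requested CDF level.
    samples = sorted(f(random.gauss(x_nominal, sigma)) for _ in range(n))
    return samples[int(cdf_level * (n - 1))]

# For minimization the 95% CDF value is used by default, so a design
# whose response scatters widely is penalized even if its mean is low.
print(robust_objective(2.0), robust_objective(3.0))
```

Sampling is only one way to estimate the CDF value; the point is that the robust objective is a statistic of the output distribution rather than a single deterministic response.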

## Objective Types

The following types of objectives can be defined when setting up an optimization in HyperStudy.
### Minimize and Maximize
When creating an objective of type Minimize or Maximize, you can edit the Weighted sum field to create a weighted sum of all objectives. If the Weighted sum field is edited, only the Adaptive Response Surface Method, Method of Feasible Directions, Genetic Algorithm, and Sequential Quadratic Programming methods are available in the Specifications step. If multiple Minimize and/or Maximize objectives are created but no weights are defined, only the Multi-Objective Genetic Algorithm and Global Response Search Method are available.
The weighted objective function is always minimized, so each weight $w_i$ is positive if objective $i$ is to be minimized and negative if it is to be maximized:
$f\left(x\right)=\sum_i w_i f_i\left(x\right)$
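The sign convention can be sketched as follows (illustrative only; `weighted_objective` is not a HyperStudy API):

```python
# Weighted-sum scalarization: the combined function is always minimized,
# so a maximized objective enters with a negative weight.
def weighted_objective(f_values, weights):
    return sum(w * f for w, f in zip(weights, f_values))

# minimize f1, maximize f2  ->  w1 > 0, w2 < 0
f_values = [10.0, 5.0]
weights = [1.0, -2.0]
print(weighted_objective(f_values, weights))  # 1*10 + (-2)*5 = 0.0
```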
### System Identification
Attempts to minimize the difference between the output response values and the target values of the selected objectives. Typical applications include experimental curve fitting and parameter identification. The objective function is formulated as a least-squares sum, where $\tilde{f}_i$ is the target value of the $i$th output response:
$\min \sum_i {\left(\frac{f_i-\tilde{f}_i}{\tilde{f}_i}\right)}^{2}$
If a target value is 0, the denominator is set to 1.0 to avoid division by zero; in that case the contribution of that term to the sum is simply the squared, unnormalized response value.
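The least-squares formulation, including the zero-target safeguard, can be sketched as (illustrative Python, not HyperStudy code):

```python
# System-identification objective: normalized least-squares sum over
# response/target pairs, falling back to an unnormalized term when the
# target is zero.
def sysid_objective(responses, targets):
    total = 0.0
    for f, t in zip(responses, targets):
        denom = t if t != 0 else 1.0  # avoid division by zero
        total += ((f - t) / denom) ** 2
    return total

# (0.1 / 1.0)^2 + (0.2 / 1.0)^2 = 0.01 + 0.04 = 0.05
print(sysid_objective([1.1, 0.2], [1.0, 0.0]))
```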
When creating an objective of type System Identification, the Target value field must be edited. System Identification cannot be used with Sequential Optimization and Reliability Assessment or ARSM-Based Sequential Optimization and Reliability Assessment methods.
System Identification objectives cannot be combined with other types of objectives. If one objective is assigned to this type, then all objectives have to be assigned to the same type.
### MinMax and MaxMin
Used to solve problems where the maximum (or minimum) of an output response is minimized (or maximized).
The MinMax problem is formulated as:
$\min \left[\max \left(\frac{f_1\left(x\right)}{\tilde{f}_1},\frac{f_2\left(x\right)}{\tilde{f}_2},...,\frac{f_k\left(x\right)}{\tilde{f}_k}\right)\right]$

Subject to:

$\begin{array}{ll} g_j\left(x\right)\le 0 & j=1,...,m \\ x_i^L \le x_i \le x_i^U & i=1,...,n \end{array}$
These problems are solved using the Beta-method, which transforms the problem into a regular optimization problem by introducing an additional input variable $\beta$ such that:
$\mathrm{min}\beta$

Subject to:

$\begin{array}{ll} \frac{f_i\left(x\right)}{\tilde{f}_i}\le \beta & i=1,...,k \\ g_j\left(x\right)\le 0 & j=1,...,m \end{array}$
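The Beta-method transformation can be illustrated with a brute-force search standing in for the optimizer (all names and functions below are illustrative, not HyperStudy's implementation):

```python
# MinMax via the Beta-method: for a given x, the smallest beta that
# satisfies f_i(x)/f~_i <= beta for all i is just the max of the
# normalized objectives; minimizing beta over x solves the MinMax problem.
def beta_of(x, objectives, refs):
    return max(f(x) / r for f, r in zip(objectives, refs))

objectives = [lambda x: (x - 1.0) ** 2, lambda x: (x + 1.0) ** 2]
refs = [1.0, 1.0]  # reference values f~_i used for normalization

# A coarse grid search stands in for a real optimizer.
grid = [i / 100.0 for i in range(-300, 301)]
x_best = min(grid, key=lambda x: beta_of(x, objectives, refs))
print(x_best, beta_of(x_best, objectives, refs))  # symmetric problem -> x = 0
```

Because the two objectives are mirror images, the worst-case normalized value is smallest at the midpoint `x = 0`, where both equal 1.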
When creating an objective of type MinMax or MaxMin, the Reference Value field can be edited to normalize the respective function values. MinMax and MaxMin cannot be used with reliability based optimization methods.
MinMax and MaxMin objectives cannot be combined with other types of objectives. If one objective is assigned to this type, then all objectives have to be assigned to the same type.