# Glossary

- Conjugate Gradient
- An iterative algorithm for the numerical solution of symmetric, positive-definite systems of linear equations. A key feature of this algorithm is that the search direction at step k is conjugate (A-orthogonal) to the directions at all previous steps.
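As a sketch (not part of the original glossary), the method can be written in a few lines of NumPy; the matrix `A` and vector `b` below are arbitrary example data:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A.

    Each new search direction p is A-orthogonal (conjugate) to all
    previous directions, so in exact arithmetic the method converges
    in at most n steps for an n x n system.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x            # residual
    p = r.copy()             # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # next conjugate direction
        rs_old = rs_new
    return x

# Example data (any symmetric positive-definite system works).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```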
- Constrained Optimization
- Constrained optimization is the process of optimizing (usually minimizing) an objective function with respect to some design variables in the presence of constraints on those design variables. These constraints can be bounds on the design space, equality constraints, and inequality constraints. The sought-after solution is required to satisfy all the constraints.
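The entry above can be illustrated with SciPy's `scipy.optimize.minimize` (assuming SciPy is available; the objective, constraint, and bounds below are made-up example data, not from the source):

```python
from scipy.optimize import minimize

# Minimize f(b) = (b1 - 2)^2 + (b2 - 1)^2
# subject to the inequality constraint b1 + b2 <= 2
# (expressed in SciPy's convention as 2 - b1 - b2 >= 0)
# and simple bounds 0 <= b1, b2 <= 3.
objective = lambda b: (b[0] - 2)**2 + (b[1] - 1)**2
con = {"type": "ineq", "fun": lambda b: 2.0 - b[0] - b[1]}

res = minimize(objective, x0=[0.0, 0.0], method="SLSQP",
               bounds=[(0, 3), (0, 3)], constraints=[con])
# The unconstrained optimum (2, 1) violates b1 + b2 <= 2, so the
# constrained solution lies on the constraint boundary at (1.5, 0.5).
```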
- Constraints
- Constraints are algebraic relationships amongst the design variables that a solution for the optimization must satisfy. The algebraic relationships come in three flavors:
  - Simple bound constraints
  - Equality or bilateral constraints
  - Inequality or unilateral constraints
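A minimal sketch of the three constraint types as a plain feasibility check (the function name `is_feasible` and the specific constraints are hypothetical, chosen for illustration):

```python
def is_feasible(b, tol=1e-9):
    """Check a candidate design b = [b1, b2] against all three constraint types."""
    b1, b2 = b
    bound_ok = 0.0 <= b1 <= 10.0 and 0.0 <= b2 <= 10.0   # simple bound constraints
    eq_ok = abs(b1 + b2 - 1.0) <= tol                     # equality: b1 + b2 = 1
    ineq_ok = b1 - 0.2 >= -tol                            # inequality: b1 >= 0.2
    return bound_ok and eq_ok and ineq_ok

is_feasible([0.5, 0.5])   # True: satisfies bounds, equality, and inequality
is_feasible([0.1, 0.9])   # False: violates the inequality b1 >= 0.2
```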

- Cost Function
- Often called the objective function, this is the quantity that is to be minimized in an optimization process.
- Design Bound Constraints
- These specify lower bounds (b_{L}) and upper bounds (b_{U}) on the design parameters that the solution is required to satisfy. Thus, at any feasible design point b^{*}, b_{L} ≤ b^{*} ≤ b_{U}.
- Design Variables
- These are the quantities that are to be determined in the optimization problem. In the CAE context, design variables are used to define parametric models. By changing the values for design variables in a model, you can change the model. An optimizer will change the values of the design variables for a model in its quest to find an optimum solution.
- Equality Constraints
- A set of nonlinear or linear algebraic equations that the solution must satisfy. The equality relationships are usually expressed in the form
$${\psi}_{k}(b)+{\int}_{{t}_{0}}^{{t}_{f}}{L}_{k}\left(x,b,t\right)dt=0;\quad k=p+1,\dots ,m$$
- Global Optimization
- An optimization process wherein the global minimum of a function is sought. Let f() be the function being minimized and b the design variables. A solution b^{*} is said to be a global minimum if f(b^{*}) ≤ f(b) for all b ≠ b^{*}.
- Gradient
- If f(x_{1}, ..., x_{n}) is a differentiable, real-valued function of several variables, its gradient is the vector whose components are the n partial derivatives of f.
$$\text{grad}\left(f\right)={\left[\begin{array}{cccc}\frac{\partial f}{\partial {x}_{1}}& \frac{\partial f}{\partial {x}_{2}}& \dots & \frac{\partial f}{\partial {x}_{n}}\end{array}\right]}^{T}$$
- Hessian Matrix
- The Hessian matrix, or Hessian, is a square matrix of second-order partial derivatives of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables. The Hessian matrix was developed in the 19th century by the German mathematician Ludwig Otto Hesse and later named after him. The Hessian matrix H of f(x_{1}, ..., x_{n}) is a square n × n matrix, computed as:
$$H=\left[\begin{array}{cccc}\frac{{\partial}^{2}f}{\partial {x}_{1}^{2}}& \frac{{\partial}^{2}f}{\partial {x}_{1}\partial {x}_{2}}& \dots & \frac{{\partial}^{2}f}{\partial {x}_{1}\partial {x}_{n}}\\ \frac{{\partial}^{2}f}{\partial {x}_{2}\partial {x}_{1}}& \frac{{\partial}^{2}f}{\partial {x}_{2}^{2}}& \cdots & \frac{{\partial}^{2}f}{\partial {x}_{2}\partial {x}_{n}}\\ \vdots & \vdots & \ddots & \vdots \\ \frac{{\partial}^{2}f}{\partial {x}_{n}\partial {x}_{1}}& \frac{{\partial}^{2}f}{\partial {x}_{n}\partial {x}_{2}}& \dots & \frac{{\partial}^{2}f}{\partial {x}_{n}^{2}}\end{array}\right]$$
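When an analytic Hessian is unavailable, it can be approximated by finite differences. This sketch (the helper name `numerical_hessian` is an illustrative assumption) uses central differences on a smooth scalar function:

```python
import numpy as np

def numerical_hessian(f, x, h=1e-4):
    """Approximate the Hessian of a scalar function f at x by central differences."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            # Central-difference formula for the mixed second partial.
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

# f(x, y) = x^2 + 3xy has the constant Hessian [[2, 3], [3, 0]].
H = numerical_hessian(lambda v: v[0]**2 + 3*v[0]*v[1], [1.0, 2.0])
```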
- Jacobian Matrix
- The Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function f. It is named after the mathematician Carl Gustav Jacob Jacobi (1804–1851). Let
$$f\left({x}_{1},\dots ,{x}_{n}\right)={\left[\begin{array}{cccc}{f}_{1}\left({x}_{1},\dots ,{x}_{n}\right)& {f}_{2}\left({x}_{1},\dots ,{x}_{n}\right)& \dots & {f}_{m}\left({x}_{1},\dots ,{x}_{n}\right)\end{array}\right]}^{T}$$
Then the Jacobian is
$$J=\left[\begin{array}{cccc}\frac{\partial {f}_{1}}{\partial {x}_{1}}& \frac{\partial {f}_{1}}{\partial {x}_{2}}& \dots & \frac{\partial {f}_{1}}{\partial {x}_{n}}\\ \frac{\partial {f}_{2}}{\partial {x}_{1}}& \frac{\partial {f}_{2}}{\partial {x}_{2}}& \cdots & \frac{\partial {f}_{2}}{\partial {x}_{n}}\\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial {f}_{m}}{\partial {x}_{1}}& \frac{\partial {f}_{m}}{\partial {x}_{2}}& \dots & \frac{\partial {f}_{m}}{\partial {x}_{n}}\end{array}\right]$$
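Like the Hessian, the Jacobian can be approximated column by column with finite differences. A minimal sketch (the helper name `numerical_jacobian` is an illustrative assumption):

```python
import numpy as np

def numerical_jacobian(f, x, h=1e-7):
    """Approximate the m x n Jacobian of vector-valued f at x by forward differences."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x))
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h                       # perturb one variable at a time
        J[:, j] = (np.asarray(f(xp)) - f0) / h
    return J

# f(x, y) = [x*y, x + y] has Jacobian [[y, x], [1, 1]].
J = numerical_jacobian(lambda v: np.array([v[0]*v[1], v[0] + v[1]]), [2.0, 3.0])
```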
- Line Search
- In optimization, the line search strategy is one of the two basic iterative approaches (the other being the trust-region strategy) for finding a local minimum x^{*} of an objective function f. Each iteration first chooses a descent direction and then decides how far to move along it.
- Local Optimization
- An optimization process wherein a local minimum of a function is sought. Let f() be the function being minimized, b the design variables, and b_{0} an initial guess for b. A solution b^{*} is said to be a local minimum if f(b^{*}) ≤ f(b) for all b in some neighborhood of b^{*}; for a smooth scalar function, a sufficient condition is f′(b^{*}) = 0 and f″(b^{*}) > 0.
- Multi-Objective Optimization
- An optimization problem in which the objective is not a single scalar function f but a set of scalar functions [f_{1}, f_{2}, ..., f_{N}] that is to be optimized. Often, the objectives are conflicting.
- Objective Function
- Often called the cost function, this is the quantity that is to be minimized in an optimization process.
- Design Optimization
- The selection of the values of the best design variables given a design space, a cost function and, optionally, some constraints on the design variables.
- Pareto Optimal Solution
- Often in multi-objective optimization there are conflicting objectives, such as cost and performance. The answer to a multi-objective problem is usually not a single point. Rather, it is a set of points called the Pareto front. Each point on the Pareto front satisfies the Pareto optimality criterion, which is stated as follows: a feasible vector X^{*} is Pareto optimal if there exists no other feasible solution X that would improve some objective without causing a simultaneous worsening in at least one other objective. Conversely, a feasible point X that can be improved in one or more objectives without worsening any of the others is not Pareto optimal.

- Response Variable
- Model behavior metrics are captured in terms of Response Variables. The optimizer in MotionSolve works with Response Variables.
- Search Method
- A search method is a mathematical algorithm used by the optimizer in MotionSolve to reduce the cost function. The optimizer in MotionSolve supports many different search methods. These are: L-BFGS-B, TNC, COBYLA, SLSQP, dogleg, trust-ncg, Nelder-Mead, Powell, CG, BFGS and Newton-CG.
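The method names listed above match those accepted by SciPy's `scipy.optimize.minimize` interface. Assuming SciPy is available, selecting a search method looks like this (a SciPy illustration, not MotionSolve's own API; the Rosenbrock function is a standard test problem, not from the source):

```python
from scipy.optimize import minimize

# Rosenbrock function, a classic optimization benchmark with minimum at (1, 1).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# The `method` argument picks the search method, e.g. L-BFGS-B, SLSQP, CG, ...
res = minimize(rosen, x0=[-1.0, 1.0], method="L-BFGS-B")
```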
- State Variable
- A state variable is one of the set of variables that are used to describe the mathematical "state" of a dynamical system.
- Steepest Descent
- Steepest descent is a first-order iterative optimization algorithm. To find a local minimum of a function using steepest descent, take steps proportional to the negative of the gradient (or of the approximate gradient) of the function at the current point.
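A minimal sketch of steepest descent with a fixed step size (real implementations typically choose the step length via a line search; the function, step size, and helper name below are illustrative assumptions):

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a function by repeatedly stepping opposite its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient (nearly) vanishes
            break
        x = x - step * g              # step in the direction of steepest descent
    return x

# Minimize f(x, y) = (x - 3)^2 + (y + 1)^2; its gradient is [2(x-3), 2(y+1)],
# so the minimum is at (3, -1).
xmin = steepest_descent(lambda x: np.array([2*(x[0] - 3), 2*(x[1] + 1)]),
                        [0.0, 0.0])
```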
- Unconstrained Optimization
- An unconstrained problem has no constraints: there are no equality or inequality constraints that the solution b has to satisfy. Furthermore, there are no bounds on the design variables.