
Tuesday, September 22, 2015

Notes on optimization problems

Optimality criterion for differentiable \(f_0\)
For a convex optimization problem with differentiable objective \(f_0\), the point \(x\) is optimal iff it is feasible and
\[ \nabla f_0(x)^T (y-x) \ge 0 \quad \text{for all feasible } y \]

If \(\nabla f_0(x) \neq 0\), then \(-\nabla f_0(x)\) defines a supporting hyperplane to the feasible set at \(x\).
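
For example, minimize \(f_0(x)=x^2\) over the interval \([1,2]\). The optimum is \(x=1\) with \(\nabla f_0(1)=2\), and indeed
\[ \nabla f_0(1)(y-1) = 2(y-1) \ge 0 \quad \text{for all } y\in[1,2], \]
while the feasible set lies entirely in the halfspace \(\{y : -\nabla f_0(1)(y-1) \le 0\} = \{y : y\ge 1\}\), which is the supporting-hyperplane statement above.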

Unconstrained problem: \(x\) is optimal iff
\[ x\in \text{dom }f_0, \quad \nabla f_0(x) = 0 \]
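To see this from the general criterion: near \(x\) every point is feasible, so taking \(y = x - t\nabla f_0(x)\) for small \(t>0\) gives
\[ \nabla f_0(x)^T (y-x) = -t\,\|\nabla f_0(x)\|_2^2 \ge 0, \]
which is only possible if \(\nabla f_0(x) = 0\).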
Equality constrained problem:
\[ \text{minimize } f_0(x) \quad \text{ subject to } Ax=b  \]
\(x\) is optimal iff there exists a \(\nu\) such that
\[ x\in \text{dom }f_0, \quad Ax=b, \quad \nabla f_0(x)+A^T\nu = 0\]
Note: Lagrange multiplier optimality condition.
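This also follows from the general criterion: every feasible \(y\) is \(x+v\) with \(v\in\mathcal{N}(A)\), so the criterion becomes \(\nabla f_0(x)^T v \ge 0\) for all \(v\) in the subspace \(\mathcal{N}(A)\). A linear function that is nonnegative on a subspace must vanish on it, so \(\nabla f_0(x)\perp\mathcal{N}(A)\), i.e. \(\nabla f_0(x)\in\mathcal{R}(A^T)\), which is exactly \(\nabla f_0(x) = -A^T\nu\) for some \(\nu\).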

Minimization over nonnegative orthant
\[ \text{minimize } f_0(x) \quad \text{ subject to } x\succeq 0  \]
\(x\) is optimal iff
\[ x\in\text{dom }f_0, \quad x\succeq 0, \quad \left\{ \begin{array}{lr}
\nabla f_0(x)_i \ge 0, \quad & x_i = 0 \\
\nabla f_0(x)_i = 0, \quad & x_i > 0
\end{array} \right. \]
Note: The last condition is a complementarity condition.
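Derivation from the general criterion: \(\nabla f_0(x)^T(y-x)\ge 0\) must hold for all \(y\succeq 0\). The linear term \(\nabla f_0(x)^T y\) is unbounded below over \(y\succeq 0\) unless \(\nabla f_0(x)\succeq 0\); given that, its infimum over \(y\succeq 0\) is attained at \(y=0\), leaving \(-\nabla f_0(x)^T x \ge 0\). Combined with \(x\succeq 0\) and \(\nabla f_0(x)\succeq 0\), which give \(\nabla f_0(x)^T x \ge 0\), this forces
\[ \nabla f_0(x)^T x = 0, \quad \text{i.e. } \nabla f_0(x)_i\, x_i = 0 \text{ for every } i, \]
which is the componentwise complementarity stated above.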

