
Problem Classification

Using the symbol of the Hessian, we can classify problems according to the asymptotic behavior of $\hat{\cal H} ({\bf k} )$ for large $\vert {\bf k}\vert$. We consider the equation

$\displaystyle {\cal H} \alpha = f.$     (49)

We want to classify problems as well-posed (good), ill-posed (bad), easy, or difficult.



Well-posed problems are characterized by having a unique solution that is stable under perturbations in the data of the problem. At high frequencies, which is the range where our analysis of the Hessian is accurate, the following property implies well-posedness:

$\displaystyle \hat{\cal H} ({\bf k}) = O ( \vert {\bf k}\vert ^{\gamma} ) \qquad \gamma \geq 0.$     (50)

Note that a small change in the design variable in the high-frequency range causes a large (or at least comparable) change in the right-hand side, i.e., the gradient. Conversely, small changes in the data result in small changes in the solution, and the solution has the desired stability property. This is summarized by
$\displaystyle \vert \hat \alpha ({\bf k})\vert \approx \vert {\bf k}\vert ^{-\gamma} \vert \hat {g} ({\bf k}) \vert \qquad \qquad \mbox{\rm for large } \vert {\bf k} \vert$     (51)

which follows from (35) and (50).
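
For instance (an illustrative example, not taken from the text), if ${\cal H} = -\Delta + I$, then $\hat{\cal H} ({\bf k}) = \vert {\bf k}\vert ^2 + 1 = O(\vert {\bf k}\vert ^2)$, so (50) holds with $\gamma = 2$ and the problem is well-posed; by (51), a high-frequency perturbation of the gradient is damped by the factor $\vert {\bf k}\vert ^{-2}$ in the solution.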



Ill-posedness refers to the case where the solution is not unique or is sensitive to perturbations in the data of the problem. Ill-posedness resulting from the behavior of high frequencies can be characterized by

$\displaystyle \hat{\cal H} ({\bf k} ) = O ( \frac{1} {\vert{\bf k}\vert ^{\gamma}} ) \qquad \gamma >0.$     (52)

To see that such behavior causes ill-posedness, consider a solution $\alpha$ and a perturbation of it of the form $\alpha + c \exp(i {\bf k}\cdot {\bf x} )$. The gradients evaluated at these two choices of the design variable are ${\cal H} \alpha$ and ${\cal H} \alpha + c \hat{\cal H} ({\bf k}) \exp(i {\bf k}\cdot {\bf x})$. The latter can be approximated by
$\displaystyle {\cal H} ( \alpha + c \exp(i {\bf k}\cdot {\bf x} ) ) = {\cal H} \alpha + c \hat{\cal H} ({\bf k}) \exp(i {\bf k}\cdot {\bf x} ) \approx {\cal H} \alpha \qquad \mbox{\rm for large } \vert{\bf k}\vert$     (53)

Since this holds for arbitrary $c$ and sufficiently large $\vert {\bf k}\vert$, we conclude that if $\alpha$ is a solution, then $\alpha + c \exp(i {\bf k}\cdot {\bf x} )$ is an approximate solution as well. This implies that small changes in the data of the problem can cause large changes in the solution. It is summarized in the relation
$\displaystyle \vert \hat\alpha ({\bf k})\vert \approx \vert {\bf k}\vert ^{\gamma} \vert \hat {g} ({\bf k}) \vert \qquad \qquad \mbox{\rm for large } \vert {\bf k} \vert$     (54)

which follows from (35) and (52). It shows that, at high frequencies, small changes in the gradient are amplified significantly in the design variables. Thus, high frequencies in the design variables are unstable.
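
To make this concrete, here is a minimal numerical sketch (my illustration; the smoothing Hessian with symbol $\hat{\cal H}(k) = 1/(1+k^2)$, so that (52) holds with $\gamma = 2$, is an assumed model, not one from these notes). An $O(1)$ high-frequency perturbation of the design variable is seen to change the gradient only negligibly, as in (53).

import numpy as np

# 1-D model: Hessian applied via its symbol H_hat(k) = 1/(1 + k^2),
# an assumed smoothing operator (gamma = 2 in (52)).
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
k = np.fft.fftfreq(n, d=2.0 * np.pi / n) * 2.0 * np.pi  # integer wavenumbers
H_hat = 1.0 / (1.0 + k**2)

def apply_H(alpha):
    # Apply H by multiplying each Fourier coefficient of alpha by H_hat(k).
    return np.real(np.fft.ifft(H_hat * np.fft.fft(alpha)))

alpha = np.sin(x)                  # a smooth design variable
pert = alpha + np.cos(100.0 * x)   # O(1) perturbation at frequency |k| = 100

# The two gradients are nearly identical: the perturbation is damped by
# a factor 1/(1 + 100^2), about 1e-4, cf. (53).
print(np.max(np.abs(apply_H(pert) - apply_H(alpha))))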



The Discrete Problem. On a finite grid with mesh size $h$, one considers ${\bf k}$ in the range $\vert{\bf k}\vert \leq \pi/h$. Thus, for well-posed problems satisfying (50), the eigenvalues of the Hessian corresponding to the highest frequencies behave as


$\displaystyle \max _{\vert {\bf k} \vert \leq \pi/h} \vert \hat{\cal H} ({\bf k})\vert = O(\frac{1}{h^\gamma}).$     (55)

Since the smallest eigenvalue of the discrete Hessian is given approximately by the corresponding eigenvalue of the differential Hessian, the condition number of the Hessian behaves as $O(\frac{1}{h^\gamma})$. This quantity is important in evaluating the performance of gradient-based algorithms. As was mentioned in Lecture 1, the convergence of gradient-based methods is determined by $I - \delta {\cal H}$, and the high-frequency components in the representation of the solution converge at the rate
$\displaystyle \max _{\pi/(2h) \leq \vert {\bf k} \vert \leq \pi/h} \vert 1 - \delta \hat{\cal H}({\bf k})\vert.$     (56)

The smallest eigenvalues of the Hessian are $O(1)$, being given approximately by their values from the continuous problem. From these observations we conclude that the expected rate of convergence for the full design problem is
$\displaystyle 1 - O(h^\gamma),$     (57)

and that the complexity of a given problem can be determined by the exponent $\gamma$.
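
As a numerical check (a sketch under assumptions of my own choosing, not taken from the notes), the 1-D discrete Laplacian with Dirichlet boundary conditions serves as a model Hessian; its symbol grows like $\vert {\bf k}\vert ^2$, so $\gamma = 2$. Its exact eigenvalues exhibit both the condition-number growth (55) and the convergence factor (56)-(57):

import numpy as np

# Model Hessian: 1-D discrete Laplacian on (0,1) with Dirichlet conditions
# (an assumed example with gamma = 2, not taken from the notes).
for n in (32, 64, 128):
    h = 1.0 / n
    j = np.arange(1, n)
    lam = (4.0 / h**2) * np.sin(j * np.pi * h / 2.0) ** 2  # exact eigenvalues
    cond = lam.max() / lam.min()             # grows like O(1/h^2), cf. (55)
    delta = 1.0 / lam.max()                  # a stable gradient-descent step
    rate = np.abs(1.0 - delta * lam).max()   # = 1 - O(h^2), cf. (56)-(57)
    print(f"h = 1/{n}:  cond = {cond:.1f}   rate = {rate:.6f}")

For $h = 1/128$ the printed convergence factor is roughly $1 - \pi^2 h^2/4 \approx 0.99985$: consistent with (57), halving $h$ quadruples the number of iterations.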

In summary, let the symbol of the Hessian satisfy

$\displaystyle \hat {\cal H} ( {\bf k}) = O ( \vert {\bf k}\vert ^\gamma )$     (58)

then we have the following classification:


$\displaystyle \begin{array}{lr}
\mbox{\tt Well-posed (``good'') optimization problems:} & \gamma \geq 0 \\
\mbox{\tt Ill-posed (``bad'') optimization problems:} & \gamma < 0 \\
\mbox{\tt Easy optimization problems:} & \gamma \ll 1 \\
\mbox{\tt Difficult optimization problems:} & \gamma \geq 1
\end{array}$     (59)
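
For instance, the illustrative model ${\cal H} = -\Delta + I$ used above has $\gamma = 2$: by this classification the problem is well-posed yet difficult, since by (57) gradient methods converge only at the rate $1 - O(h^2)$.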


Shlomo Ta'asan 2001-08-22