11) Stability#

Last time#

  • Newton’s method

  • Convergence of fixed-point method

  • Different formulation of Newton’s method

Today#

  1. Forward and backward stability

  2. Lax equivalence theorem

  3. Recap and Q/A

using Plots
default(linewidth=4, legendfontsize=12)  # plotting defaults used throughout these notes

1. Stability#

Activity

Source: FNC: backward error

(Forward) Stability#

“nearly the right answer to nearly the right question”

\[ \frac{\lvert \tilde f(x) - f(\tilde x) \rvert}{| f(\tilde x) |} \in O(\epsilon_{\text{machine}}) \]
for some \(\tilde x\) that is close to \(x\)

Backward Stability#

“exactly the right answer to nearly the right question”

\[ \tilde f(x) = f(\tilde x) \]
for some \(\tilde x\) that is close to \(x\)
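
As a minimal worked example (a standard fact, not stated in the notes above): correctly rounded floating-point addition is backward stable, since

\[ \tilde f(x, y) = \operatorname{fl}(x + y) = (x + y)(1 + \delta) = \underbrace{x(1+\delta)}_{\tilde x} + \underbrace{y(1+\delta)}_{\tilde y}, \qquad \lvert \delta \rvert \le \epsilon_{\text{machine}} , \]
so the computed sum is the exact sum of inputs that are each perturbed relatively by at most \(\epsilon_{\text{machine}}\).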

Note:

  • Every backward stable algorithm is stable (backward stability \(\implies\) stability).

  • Not every stable algorithm is backward stable.

  • The difference is one of focus: forward analysis asks what answer the method reaches, while backward analysis asks which problem the method exactly solved (which is why we can speak of both ill-conditioned methods and ill-conditioned problems).

  • In a backward stable algorithm the errors introduced during the algorithm have the same effect as a small perturbation in the data.

  • If the backward error is the same size as the uncertainty in the data, then the algorithm produces as good a result as we can expect.
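
A small experiment in the spirit of the FNC backward-error example linked above (a sketch; it assumes the Polynomials.jl package is available):

using Polynomials
r = [-2.0, -1, 1, 1, 2]            # intended roots (note the double root at 1)
p = fromroots(r)                   # polynomial with exactly these roots
r̃ = roots(p)                       # computed roots; may carry tiny imaginary parts
p̃ = fromroots(real.(r̃))            # the nearby polynomial the computed roots solve exactly
maximum(abs.(coeffs(p) .- coeffs(p̃)))   # backward error as a coefficient perturbation

Even though the computed roots show visible forward error (the double root is sensitive), the coefficients of \(\tilde p\) should match those of \(p\) to near \(\epsilon_{\text{machine}}\): the computed roots are the exact roots of a nearby polynomial.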

Accuracy of backward stable algorithms (Theorem)#

A backward stable algorithm for computing \(f(x)\) has relative accuracy

\[ \left\lvert \frac{\tilde f(x) - f(x)}{f(x)} \right\rvert \lesssim \kappa(f) \epsilon_{\text{machine}} . \]
Backward stability is generally the best we can hope for.

In practice, it is rarely possible for a function to be backward stable when the output space is higher dimensional than the input space.
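
A quick illustration of the bound (a sketch, not from the notes): each floating-point operation in \((1+x) - 1\) is backward stable, but subtracting nearly equal numbers is ill-conditioned, so the computed result is only accurate to about \(\kappa \, \epsilon_{\text{machine}}\):

x = 1e-12
exact = x                       # (1 + x) - 1 == x in exact arithmetic
computed = (1 + x) - 1          # rounding 1 + x loses information; the subtraction itself is exact
relative_error = abs(computed - exact) / abs(exact)
κ = (1 + x) / x                 # condition number of y ↦ y - 1 at y = 1 + x
relative_error, κ * eps()       # observed error sits below the κ ⋅ ϵ_machine bound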

2. Lax equivalence theorem#

  • Consistency: describes how well the numerical scheme approximates the PDE. If the finite difference (FD) discretization is at least of order 1, it is consistent: the residual reduces under grid refinement (see the sketch after this list). We will see FD in more detail when we cover numerical differentiation.

  • Stability: Numerical stability concerns how errors introduced during the execution of an algorithm affect the result. It is a property of an algorithm rather than the problem being solved (check Higham’s blog again). This gets subtle for problems like incompressible materials or contact mechanics.

  • Convergence: the solution of the discretized equation approaches the true solution of the continuous equation as the discretization is refined.
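
To see consistency empirically (a sketch reusing the Plots setup above): the residual of the first-order forward difference applied to a smooth function shrinks like \(O(h)\) under grid refinement.

u(x) = sin(x)
du(x) = cos(x)
hs = [0.1 / 2^k for k in 0:8]
err = [abs((u(1 + h) - u(1)) / h - du(1)) for h in hs]
plot(hs, err, xscale=:log10, yscale=:log10, marker=:circle,
     xlabel="h", ylabel="residual", label="forward difference")
plot!(hs, hs, linestyle=:dash, label="O(h) reference")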

Consistency + Convergence \(\implies\) Stability
Consistency + Stability \(\implies\) Convergence
Hence,
Lax equivalence theorem: for a consistent discretization of a well-posed linear problem, Stability \(\iff\) Convergence

3. Recap and Q/A#

All good, but in practice?#

These are foundational theoretical tools, and they can often be tested in practice, but

  • theory alone doesn’t establish that the code works

  • it is not possible to prove convergence for many real-world problems

  • there are open research questions about whether numerous important problems are even well-posed

Empirical measurement of convergence#

Convergence on a problem with an analytical solution:

  • These can be great, but analytical solutions of nonlinear equations are extremely hard to find.

  • Such solutions usually have many symmetries and rarely activate all terms.

Self-convergence:

  • Set up a problem and solve it on a sequence of meshes, use a refined solution as the reference (perhaps with Richardson extrapolation), then plot the error (a minimal sketch follows this list).

  • This checks that the code converges to some solution, not that it converges to the correct solution. (You could have a factor-of-2 typo, or more serious mistakes.)
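
A minimal self-convergence sketch (hypothetical: qoi(h) stands in for a solver returning a scalar quantity of interest): for a method of order \(p\), differences between successive refinements shrink by a factor \(2^p\) per halving of \(h\), so the observed order can be estimated without knowing the true solution.

qoi(h) = (sin(1 + h) - sin(1)) / h        # stand-in solver: first-order accurate
u1, u2, u3 = qoi(0.1), qoi(0.05), qoi(0.025)
p = log2(abs(u1 - u2) / abs(u2 - u3))     # observed order; ≈ 1 for this stand-in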

Method of Manufactured Solutions (MMS)#

  • Choose a solution \(u^*\), substitute it into the equation to derive the forcing term it would require, then solve the forced problem and compare against \(u^*\) (see the sketch below).

  • Errors that affect solution accuracy can easily be detected.

  • There are some technical issues with singular or under-resolved problems (shocks, material discontinuities).
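
A minimal MMS sketch (illustrative assumptions: the model problem \(-u'' = f\) on \((0,1)\) with homogeneous Dirichlet boundary conditions and a centered second-order difference): manufacture \(u^*(x) = \sin(\pi x)\), derive the forcing \(f = \pi^2 \sin(\pi x)\), solve the forced problem, and check that the error drops at the scheme’s order.

using LinearAlgebra
function mms_error(n)
    h = 1 / (n + 1)
    x = h * (1:n)                  # interior grid points
    A = SymTridiagonal(fill(2 / h^2, n), fill(-1 / h^2, n - 1))   # centered -u''
    f = π^2 .* sin.(π .* x)        # forcing manufactured from u*(x) = sin(πx)
    u = A \ f
    maximum(abs.(u .- sin.(π .* x)))
end
[mms_error(n) for n in (10, 20, 40, 80)]   # expect ≈ 4× error reduction per halving of h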