Math 342 - Week 8 Notes

Mon, Mar 14

Before the break, we talked about polynomial interpolation. This is the process of finding a polynomial of degree at most n that passes through n+1 points in the plane (with distinct x-coordinates). We described three different methods for solving this problem:

  1. Use the standard polynomial basis \(\{x^k\}\) and find the coefficients by solving the Vandermonde matrix equation \(V c = y\).

  2. Use the Lagrange polynomial basis, and then the coefficients are exactly the \(y\)-values.

  3. Use the Newton polynomial basis, and then the coefficients are the divided differences \(f[x_0,\ldots, x_k]\).

All three of these methods result in different formulas for the same polynomial (which is unique). Often the interpolating polynomial \(p_n\) is constructed for a function \(f\) so that \(p_n(x_k) = f(x_k)\) at each node \(x_0, \ldots, x_n\). Then we call \(p_n\) an interpolating polynomial for the function \(f\). Using interpolating polynomials is one way to approximate a function.

  1. Find the 2nd degree interpolating polynomial for \(f(x)=10^x\) with nodes \(x_0=0, x_1=1, x_2 = 2\). Desmos graph
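One way to check an answer like this is to solve the Vandermonde system \(Vc = y\) numerically (method 1 above); here is a minimal NumPy sketch.

import numpy as np

xs = np.array([0.0, 1.0, 2.0])
ys = 10.0**xs                        # [1, 10, 100]
V = np.vander(xs, increasing=True)   # columns are 1, x, x^2
c = np.linalg.solve(V, ys)           # coefficients of 1, x, x^2
print(c)                             # expect [1, -31.5, 40.5], i.e. p(x) = 40.5x^2 - 31.5x + 1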

Here are some important results about these approximations.

Mean Value Theorem for Divided Differences. Let \(x_0, \ldots, x_n\) be distinct nodes in \([a,b]\) and let \(f \in C^{n}[a,b]\). Then there exists a number \(\xi\) between the values of \(x_0, \ldots, x_n\) such that \[f[x_0, \ldots, x_n] = \frac{ f^{(n)}(\xi) }{n!}.\]

Proof. The function \(f-p_n\) has at least \(n+1\) roots (the nodes), so by Rolle's theorem its derivative has at least \(n\) roots, and so on, until the \(n\)-th derivative has at least one root \(\xi\). Since the \(n\)-th derivative of \(p_n\) is the constant \(n!\, f[x_0, \ldots, x_n]\) (that is, \(n!\) times the leading coefficient of \(p_n\)), we get \(f^{(n)}(\xi) = n!\, f[x_0, \ldots, x_n]\), which completes the proof. \(\square\)

Interpolation Error Theorem. Let \(x_0, \ldots, x_n\) be distinct nodes in \([a,b]\) and let \(f \in C^{n+1}[a,b]\). For each \(x \in [a,b]\), there exists a number \(\xi\) between \(x_0, \ldots, x_n\), and \(x\) such that \[f(x)-p_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} (x-x_0) \cdots (x-x_n).\]

Proof. Add \(x\) to the list of nodes and construct the degree \(n+1\) interpolating polynomial \(p_{n+1}\). Using the Newton form for both interpolating polynomials, \[p_{n+1} - p_n = f[x_0,\ldots,x_n,x](x-x_0)\cdots (x-x_n).\] Since \(p_{n+1}(x) = f(x)\), the MVT for Divided Differences (applied to the \(n+2\) nodes \(x_0, \ldots, x_n, x\)) gives a number \(\xi\) between \(x\) and \(x_0, \ldots, x_n\) such that \[f(x)-p_n(x) = p_{n+1}(x)-p_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!} (x-x_0) \cdots (x-x_n).~ \square\]

Note: The book gives a special case upper bound for the interpolation error when the nodes are evenly spaced in the interval \([a,b]\); see Lemma 52 on page 84 for the proof.

Lemma 52. If \(x_0, \ldots, x_n\) are equally spaced in \([a,b]\) and \(x_0 = a\) and \(x_n = b\), then \[|f(x)-p_n(x)| \le \frac{h^{n+1}}{4(n+1)} \max_{\xi \in [a,b]} |f^{(n+1)}(\xi)|,\] where \(h = \frac{b-a}{n}\).

  1. Compute the worst case error in a 5th degree interpolating polynomial for \(\cos x\) on the interval \([0,\pi/2]\) using equally spaced nodes. Desmos graph.
  2. Compute the worst case error in a 6th degree interpolating polynomial for \(\ln x\) on the interval \([1,2]\) using equally spaced nodes. Desmos graph.
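Here is a minimal Python sketch of how Lemma 52's bound can be computed for both exercises. The maxima of the derivatives are found by hand: the 6th derivative of \(\cos x\) is \(-\cos x\), which is at most 1 in absolute value on \([0,\pi/2]\), and the 7th derivative of \(\ln x\) is \(720/x^7\), which is at most 720 on \([1,2]\).

import numpy as np

def lemma52_bound(a, b, n, max_deriv):
    # h^(n+1) / (4(n+1)) * max |f^(n+1)|, with h = (b - a)/n
    h = (b - a)/n
    return h**(n + 1)/(4*(n + 1))*max_deriv

print(lemma52_bound(0, np.pi/2, 5, 1))    # cos x, degree 5: about 4.0e-05
print(lemma52_bound(1, 2, 6, 720))        # ln x, degree 6: about 9.2e-05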

Note: The worst-case error in interpolating polynomials doesn’t always decrease as the degree increases, as the example in Section 3.2 of the book shows.


Wed, Mar 16

Today we looked at some interpolation examples in more detail. We used this Python Notebook to investigate how different interpolating polynomials behave.

  1. \(f(x) = \dfrac{1}{1+x^2}\) on [-5,5]. This function is known as Runge’s function (also known as the Witch of Agnesi). It is an example where the error gets worse as the number of nodes increases (if you use equally spaced nodes); see the sketch after this list.

  2. \(\sin x\) on \([0, 10\pi]\). This example illustrates what goes wrong when you use the Vandermonde matrix approach. Once the number of nodes grows past about 20, the Vandermonde matrix becomes severely ill-conditioned, so the computed coefficients (and hence the interpolating polynomial) are inaccurate.
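Here is a minimal sketch of the Runge example from item 1, measuring how the maximum interpolation error on \([-5,5]\) grows with equally spaced nodes. It uses SciPy's BarycentricInterpolator, which evaluates the interpolating polynomial in a numerically stable way (so the growth you see is interpolation error, not conditioning).

import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    return 1/(1 + x**2)

xx = np.linspace(-5, 5, 2001)            # fine grid for measuring the error
for n in (5, 10, 15, 20):
    nodes = np.linspace(-5, 5, n + 1)    # n+1 equally spaced nodes
    p = BarycentricInterpolator(nodes, runge(nodes))
    print(n, np.max(np.abs(runge(xx) - p(xx))))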

Note: Because large Vandermonde matrices tend to be ill-conditioned, using the Newton basis for interpolation is usually preferred.
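A quick way to see the conditioning problem is to compute the condition number of the Vandermonde matrix as the number of nodes grows; this sketch uses equally spaced nodes on \([0, 10\pi]\) as in item 2.

import numpy as np

for n in (10, 20, 30, 40):
    xs = np.linspace(0, 10*np.pi, n + 1)   # n+1 equally spaced nodes
    V = np.vander(xs, increasing=True)
    print(n, np.linalg.cond(V))            # condition number blows up as n grows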

Next we looked at what happens when two nodes get really close together. In the limit, we get a polynomial that matches both the value and the derivative of \(f\) at that node. This led to a discussion of Hermite interpolating polynomials.

  1. We calculated the Hermite polynomial for \(\cos(\pi x)\) with nodes \(x_0 = 0\), \(x_1 = 1/2\), and \(x_2 = 1\) by hand. Desmos graph.

  2. Calculate the Hermite polynomial for \(e^x\) with nodes \(x_0 = 0\) and \(x_1 = 1\). Desmos graph.
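Here is a sketch of exercise 2 using the sympy library (introduced in Friday's notes below). Since the Hermite polynomial for two nodes is the cubic that matches both the value and the derivative of \(e^x\) at \(x_0 = 0\) and \(x_1 = 1\), one way to find it is to solve directly for its coefficients.

from sympy import symbols, solve, E

x, a0, a1, a2, a3 = symbols('x a0 a1 a2 a3')
p = a0 + a1*x + a2*x**2 + a3*x**3
dp = p.diff(x)
# match the value and derivative of e^x at the nodes 0 and 1
eqs = [p.subs(x, 0) - 1, dp.subs(x, 0) - 1, p.subs(x, 1) - E, dp.subs(x, 1) - E]
coeffs = solve(eqs, [a0, a1, a2, a3])
print(p.subs(coeffs))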


Fri, Mar 18

Instead of covering new material, we used today to recap what we have learned about polynomial interpolation. We did this lab in class:

For the last five minutes I talked about using the sympy symbolic algebra library in Python to solve problem 4 from the lab. With sympy you can create symbolic expressions in Python which you can differentiate, integrate, and simplify algebraically. First you need to import the sympy library. Either enter import sympy, or you can just import all of the functions from sympy, which saves some typing:

from sympy import *

Next you need to tell Python to treat the letter x as a symbolic variable (not like a regular Python variable).

x = Symbol('x')

Now you can enter symbolic expressions like the \(l_3\) polynomial from the lab:

l3 = x*(x-1)*(x-2)/6   # Lagrange basis polynomial for the node x_3 = 3 (assuming the lab's nodes are 0, 1, 2, 3)

Once you have the Lagrange basis polynomials entered, you can start to compute the functions \(h_i\) and \(\tilde{h}_i\). For example:

h3 = (1-2*(x-3)*l3.diff(x).subs({x:3}))*(l3)**2
h3tilde = (x-3)*(l3)**2
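The two lines above implement the Hermite basis functions \[h_i(x) = \bigl(1 - 2(x - x_i)\, l_i'(x_i)\bigr)\, l_i(x)^2, \qquad \tilde{h}_i(x) = (x - x_i)\, l_i(x)^2,\] here with \(i = 3\) and \(x_3 = 3\).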

You’re given the y-values \(y_0\), \(y_1\), \(y_2\), and \(y_3\) in the lab. To find the derivatives \(y_0'\), \(y_1'\), \(y_2'\), and \(y_3'\), you’ll need to differentiate \(f(x) = -x \cos(\pi x)\). For example:

f = -x*cos(pi*x)
y3prime = f.diff(x).subs({x:3}) 

Once you finish computing a complicated expression (like H7 in the lab), you can use the simplify function in sympy to algebraically simplify the result:

# Once you find H7, you can simplify it using:
simplify(H7)

Sometimes that helps make it look a lot nicer!
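Putting all of the pieces together, here is one way the whole computation could be organized in a loop. This is only a sketch: it assumes the lab's nodes are \(0, 1, 2, 3\) and that the given \(y\)-values are the values of \(f\) at those nodes.

from sympy import symbols, cos, pi, simplify

x = symbols('x')
f = -x*cos(pi*x)
nodes = [0, 1, 2, 3]   # assumed nodes from the lab

H7 = 0
for i, xi in enumerate(nodes):
    # Build the Lagrange basis polynomial l_i for the node x_i.
    li = 1
    for j, xj in enumerate(nodes):
        if j != i:
            li = li*(x - xj)/(xi - xj)
    # Hermite basis functions h_i and h_i-tilde.
    hi = (1 - 2*(x - xi)*li.diff(x).subs({x: xi}))*li**2
    hi_tilde = (x - xi)*li**2
    # Value and derivative of f at the node.
    yi = f.subs({x: xi})
    yi_prime = f.diff(x).subs({x: xi})
    H7 = H7 + yi*hi + yi_prime*hi_tilde

print(simplify(H7))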