Wednesday, August 13, 2008

On Approximating Continuous Functions by Polynomials

In both theoretical analysis and numerical work, we often want to approximate a complicated function by one that is easier to work with. It turns out that, given a continuous function $f: [0,1] \rightarrow \mathbb{R}$, the sequence of Bernstein polynomials given by $p_n(x)=\sum_{k=0}^n {n \choose k}f(\frac{k}{n})x^k (1-x)^{n-k}$ converges uniformly to $f$ as $n \rightarrow \infty$. Here ${n \choose k}$ is the familiar binomial coefficient, ${n \choose k} = \frac{n!}{k!(n-k)!}$.
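For readers who like to experiment, here is a minimal Python sketch of this formula; the helper name `bernstein` and the sample function are illustrative choices, not anything prescribed by the result itself.

```python
from math import comb

def bernstein(f, n, x):
    """Evaluate the n-th Bernstein polynomial of f at a point x in [0, 1]."""
    return sum(comb(n, k) * f(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

# Example: approximate f(t) = |t - 1/2| (continuous but not smooth) at x = 0.3.
print(bernstein(lambda t: abs(t - 0.5), 50, 0.3))
```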

Before discussing how this assertion might be proved, it is worthwhile to explain what might have motivated it in the first place. Indeed, the quantity ${n \choose k} x^k (1-x)^{n-k}$ is simply the probability of $k$ successes in $n$ Bernoulli trials, where a success occurs with probability $x$.

We can thus view ${n \choose k} x^k (1-x)^{n-k}$ as the probability of getting exactly $k$ heads in $n$ flips of a (not necessarily fair) coin whose probability of heads is $x$, and $f(\frac{k}{n})$ as the payout received when exactly $k$ heads are flipped. The expected payout is then $p_n(x)=\sum_{k=0}^n {n \choose k}f(\frac{k}{n})x^k (1-x)^{n-k}$.
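To make the expected-payout reading concrete, here is a small Monte Carlo sketch: it simulates many rounds of $n$ coin flips, pays $f(\frac{k}{n})$ each round, and compares the average payout with $p_n(x)$. The function names and the particular choices of $f$, $n$, and $x$ below are our own, used purely for illustration.

```python
import random
from math import comb

def bernstein(f, n, x):
    # Same formula as above: p_n(x) = sum_k C(n, k) f(k/n) x^k (1-x)^(n-k).
    return sum(comb(n, k) * f(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def average_payout(f, n, x, rounds=100_000):
    """Flip n coins with P(heads) = x, pay f(k/n); average the payout over many rounds."""
    total = 0.0
    for _ in range(rounds):
        heads = sum(random.random() < x for _ in range(n))
        total += f(heads / n)
    return total / rounds

f = lambda t: t * (1 - t)
print(bernstein(f, 20, 0.3))       # exact expected payout, p_20(0.3)
print(average_payout(f, 20, 0.3))  # simulated average; should be close
```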

Now, as $n$ grows, the law of large numbers tells us that the fraction of heads $\frac{k}{n}$ concentrates near $x$: with probability approaching $1$, $\frac{k}{n}$ lies within any fixed distance of $x$. Since $f$ is continuous, $f(\frac{k}{n})$ is then close to $f(x)$ for these typical outcomes, so the payout clusters around $f(x)$. For large $n$, we would therefore expect the expected payout $p_n(x)$ to be close to $f(x)$.
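One way to watch this intuition play out numerically (again only a sketch, with an arbitrary test function and grid of our choosing): the worst-case error of $p_n$ over a fine grid of points in $[0,1]$ shrinks as $n$ grows, consistent with uniform convergence.

```python
from math import comb

def bernstein(f, n, x):
    return sum(comb(n, k) * f(k / n) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

f = lambda t: abs(t - 0.5)               # continuous, but not differentiable at 1/2
grid = [i / 500 for i in range(501)]     # 501 evenly spaced points in [0, 1]
for n in (10, 40, 160):
    worst = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
    print(n, worst)                      # the worst-case error decreases as n grows
```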

Our intuition seems reasonable, but what we have written is hardly a rigorous proof. Rather than providing one, we refer the reader to the standard treatment given in Elementary Classical Analysis by Marsden and Hoffman.
