Here is the final analysis question from 2003.
12C. State carefully the formula for integration by parts for functions of a real variable.
Let $f$ be infinitely differentiable. Prove that for all $n\geq 1$ and for all $t$,

$\displaystyle f(t)=f(0)+f'(0)t+\dots+\frac{f^{(n-1)}(0)t^{n-1}}{(n-1)!}+\int_0^t\frac{f^{(n)}(x)(t-x)^{n-1}}{(n-1)!}\,dx.$

By considering the function $f(x)=\log(1+x)$ at $t=1$, or otherwise, prove that the series

$\displaystyle \sum_{n=1}^\infty\frac{(-1)^{n-1}}{n}$

converges to $\log 2$.
What is implied by “state carefully”? It probably means that more is required than just writing

$\displaystyle \int uv'=uv-\int u'v.$
What else can one put? The main thing is the conditions under which the formula is valid. So I think what is required is something like this.
Let $u$ and $v$ be differentiable functions on the interval $[a,b]$. Hmm … I have to confess that I’m not sure what the precise conditions are, or rather what a standard set of precise conditions is. I could go for continuously differentiable since that would guarantee that all the integrals exist. A quick check — that’s the formulation used by Wikipedia, so it’s probably fairly standard. So here’s what you probably need to say (unless you’ve been given some more general statement in lectures, in which case obviously you should use that).
Let $u$ and $v$ be continuously differentiable functions on the closed interval $[a,b]$. Then

$\displaystyle \int_a^b u(x)v'(x)\,dx=u(b)v(b)-u(a)v(a)-\int_a^b u'(x)v(x)\,dx.$
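This is the sort of identity that is easy to sanity-check numerically (not, of course, something to put in an exam script). Here is a quick Python sketch that compares the two sides for one arbitrary choice of $u$ and $v$, with the integrals done by Simpson’s rule; all the names here are my own.

```python
import math

def integrate(f, a, b, n=10000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Check  int_0^1 u v' dx  =  u(1)v(1) - u(0)v(0) - int_0^1 u' v dx
# for the (arbitrary) choice u(x) = x, v(x) = e^x.
u, du = (lambda x: x), (lambda x: 1.0)
v, dv = math.exp, math.exp

lhs = integrate(lambda x: u(x) * dv(x), 0.0, 1.0)
rhs = u(1.0) * v(1.0) - u(0.0) * v(0.0) - integrate(lambda x: du(x) * v(x), 0.0, 1.0)
print(abs(lhs - rhs))  # should be essentially zero
```

Both sides here equal $1$ exactly (since $\int_0^1 xe^x\,dx=[xe^x-e^x]_0^1=1$), so any discrepancy is just quadrature error.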
Now we have to prove Taylor’s theorem with the integral form of the remainder. I remember that at least one version of Taylor’s theorem always gives me trouble, but I think it’s the one with the mean-value-theorem-ish remainder, and a quick look at this one suggests that all we have to do is integrate the remainder term by parts, which is an obvious enough thing to try even without the huge clue that we have just been told to state the formula for integration by parts.
Obviously, integrating the remainder term by parts is done in order to produce a new term, and therefore to prove the statement by induction. So let’s write down the statement first. Here is what I would actually write.
When $n=1$, the statement we are asked to prove is that $f(t)=f(0)+\int_0^t f'(x)\,dx$. This is true by the fundamental theorem of calculus.
Now to the rest of the answer.
Let us now use integration by parts to rewrite the remainder term. Setting $u(x)=f^{(n)}(x)$ and $v(x)=-\frac{(t-x)^n}{n!}$, we have that both $u$ and $v$ are continuously differentiable. Also, $v'(x)=\frac{(t-x)^{n-1}}{(n-1)!}$. Therefore our integral is $\int_0^t u(x)v'(x)\,dx$, which equals

$\displaystyle \Big[-f^{(n)}(x)\frac{(t-x)^n}{n!}\Big]_0^t+\int_0^t\frac{f^{(n+1)}(x)(t-x)^n}{n!}\,dx.$
The first term is equal to $f^{(n)}(0)t^n/n!$, the next term of the Taylor expansion, and the second is the remainder term with $n+1$ in place of $n$, which proves the inductive step.
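Again just as a private check that the identity with the integral remainder really holds, here is a small Python sketch: it evaluates the polynomial part plus the integral remainder (by Simpson’s rule) for $f(x)=e^x$, where every derivative is again $e^x$, and compares with $e$. The function name `taylor_with_remainder` is mine, not anything standard.

```python
import math

def taylor_with_remainder(f_derivs, t, n, steps=10000):
    """Polynomial part up to degree n-1, plus the integral remainder
    int_0^t f^(n)(x) (t-x)^(n-1)/(n-1)! dx computed by Simpson's rule.
    f_derivs[k] should be the k-th derivative of f."""
    poly = sum(f_derivs[k](0.0) * t**k / math.factorial(k) for k in range(n))
    g = lambda x: f_derivs[n](x) * (t - x) ** (n - 1) / math.factorial(n - 1)
    h = t / steps
    s = g(0.0) + g(t)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * g(i * h)
    return poly + s * h / 3

# For f(x) = e^x every derivative is e^x, so the formula should give e^t.
value = taylor_with_remainder([math.exp] * 5, 1.0, 3)
print(abs(value - math.e))  # tiny
```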
It’s obvious what the last part is asking us to do: we must simply plug in $f(x)=\log(1+x)$. That requires us to differentiate $\log(1+x)$ infinitely many times. Fortunately, it’s a function where the result is extremely nice. The first derivative is $(1+x)^{-1}$. Then we get $-(1+x)^{-2}$, then $2(1+x)^{-3}$, then $-6(1+x)^{-4}$. OK, the pattern is clear now, so let’s do a proper proof by induction.
I claim that $f^{(n)}(x)=(-1)^{n-1}(n-1)!(1+x)^{-n}$. This is true when $n=1$, since then the derivative is $(1+x)^{-1}$. If it is true for $n$, then it is true for $n+1$, since the derivative of $(-1)^{n-1}(n-1)!(1+x)^{-n}$ is $(-1)^n n!(1+x)^{-(n+1)}$. [It looks like a bit of a cheat to write that, since I haven't shown my working -- things like noticing that two minus signs cancelled out -- but it's hard to see how I could have reached this answer by accident, so the examiner couldn't reasonably remove marks for that. Maybe it would have been better to write slightly more.]
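If you wanted to reassure yourself about those cancelling minus signs, one way (away from the exam room) is to compare the claimed formula for $f^{(n)}$ against a numerical derivative. The helper `g` below is just my name for the claimed $n$-th derivative.

```python
import math

def g(n, x):
    """The claimed n-th derivative of log(1+x): (-1)^(n-1) (n-1)! (1+x)^(-n)."""
    return (-1) ** (n - 1) * math.factorial(n - 1) * (1 + x) ** (-n)

# Central-difference check that the derivative of g(n, .) really is g(n+1, .).
h = 1e-6
for n in range(1, 5):
    approx = (g(n, 0.5 + h) - g(n, 0.5 - h)) / (2 * h)
    print(n, abs(approx - g(n + 1, 0.5)))  # all tiny
```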
The question really is holding our hand here. Let’s apply Taylor’s theorem with $f(x)=\log(1+x)$ and $t=1$. The one thing not to do is just write down the infinite series. The whole point of the question is to show that you understand that estimating the remainder term is necessary if you want a rigorous proof. Let’s underline that by doing it first.
We shall prove this result by applying Taylor’s theorem. First let us obtain a bound for the remainder term, which is

$\displaystyle \int_0^1\frac{f^{(n)}(x)(1-x)^{n-1}}{(n-1)!}\,dx=\int_0^1\frac{(-1)^{n-1}(1-x)^{n-1}}{(1+x)^n}\,dx.$
How are we going to estimate that? Well, $(1-x)^{n-1}$ looks a lot smaller than $(1+x)^n$ and the $(n-1)!$s cancel out. How can we make that thought precise? Well, $(1+x)^n\geq 1$ when $x\geq 0$. OK, here goes.
Since $(1+x)^n\geq 1$ for all $x$ in the range $[0,1]$, the integrand is at most $(1-x)^{n-1}$ in absolute value, which implies that the remainder term is at most $\int_0^1(1-x)^{n-1}\,dx=1/n$ in absolute value, which tends to zero. Therefore, by Taylor’s theorem,

$\displaystyle f(1)=\sum_{k=0}^\infty\frac{f^{(k)}(0)}{k!}.$
By our earlier calculation, $f^{(k)}(0)=(-1)^{k-1}(k-1)!$ for every $k\geq 1$, and $f(0)=\log 1=0$, so

$\displaystyle f(1)=\sum_{k=1}^\infty\frac{(-1)^{k-1}(k-1)!}{k!}=\sum_{k=1}^\infty\frac{(-1)^{k-1}}{k}.$
But $f(1)=\log 2$, so we are done.
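For what it’s worth, one can watch the remainder estimate doing its work numerically: the partial sums approach $\log 2$ with error at most $1/n$. A quick Python check (names mine):

```python
import math

def partial_sum(n):
    """S_n = sum_{k=1}^n (-1)^(k-1)/k."""
    return sum((-1) ** (k - 1) / k for k in range(1, n + 1))

for n in (10, 100, 1000):
    error = abs(math.log(2) - partial_sum(n))
    print(n, error, error <= 1 / n)  # the bound from the remainder estimate
```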
Not much to say about that question, since it was an easy one. But as I’ve already said, if a question is easy, then the examiner wants you to do it properly, and if you don’t then you may well lose an alpha. In this case, doing it properly means stating some conditions on the functions that appear in the formula for integration by parts, and more importantly it means bothering to prove that the remainder term tends to zero when you apply Taylor’s theorem. People who didn’t do the latter would not have got alphas.
Is there a reasonable “or otherwise” option? That’s a difficult one. If you’re allowed to differentiate a power series term by term, then you can differentiate $\sum_{n=1}^\infty\frac{(-1)^{n-1}x^n}{n}$ to get $\sum_{n=1}^\infty(-x)^{n-1}$, which is a geometric series (when $|x|<1$, as it will be here) that sums to $\frac 1{1+x}$. So the original function is, up to a constant, $\log(1+x)$. Looking at what happens when $x=0$ we see that the constant is 0, and we can now plug in $x=1$ (strictly speaking, since $x=1$ lies on the boundary of the interval of convergence, justifying that last step needs something like Abel’s theorem as well).
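One can also see numerically why the boundary point is the delicate part of this approach: inside the radius of convergence fifty terms of $\sum(-1)^{n-1}x^n/n$ already agree with $\log(1+x)$ to many decimal places, while at $x=1$ they are still off in the second decimal place. A short Python illustration:

```python
import math

def series(x, n):
    """Partial sum of sum_{k=1}^n (-1)^(k-1) x^k / k."""
    return sum((-1) ** (k - 1) * x ** k / k for k in range(1, n + 1))

# Inside the radius of convergence convergence is geometric;
# at the boundary point x = 1 the error decays only like 1/n.
inside = abs(series(0.5, 50) - math.log(1.5))
boundary = abs(series(1.0, 50) - math.log(2.0))
print(inside, boundary)
```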
But was it reasonable to assume that a power series can be differentiated term by term inside its radius of convergence? It's certainly a different part of the course. My guess is that this proof — written out a bit less sketchily than I have written it — would have been accepted even if that result had been merely stated and not itself proved, simply because the examiner would have given some credit for independent thought. But I can't say that with total certainty, because it is a fairly substantial result to assume, whereas the intended approach doesn't ask you to assume anything more than you've just proved.