
jinydu 2008-11-22 07:57

Derivative of Sum not Equal to Sum of Derivatives
 
One of my students complained this week after I took off points for assuming that a power series can be differentiated term-by-term with no further justification. He asked me for an example of why this isn't OK.

I know of one example. The Taylor series for log(1+x) about x = 0 converges when |x| <= 1, except at x = -1. The sum of the series, log(1+x), is certainly differentiable at x = 1, with derivative 1/(1+1) = 1/2. However, differentiating the series term-by-term and evaluating at x = 1 gives a series that does not converge.
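A quick numeric sketch (Python, purely illustrative; the argument doesn't need it) makes the failure visible:

[code]
# Partial sums of the term-by-term derivative of the log(1+x) series at x = 1.
# log(1+x) = sum_{n>=1} (-1)^(n+1) x^n / n, so the differentiated series at
# x = 1 is 1 - 1 + 1 - 1 + ..., whose partial sums never settle.
partial = 0.0
for n in range(1, 11):
    partial += (-1) ** (n + 1)  # the nth term x^(n-1) at x = 1 is +1 or -1
    print(n, partial)           # alternates 1, 0, 1, 0, ...: no limit, even
                                # though log(1+x) has derivative 1/2 at x = 1
[/code]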

I'm looking for further examples. More specifically:

1) Is there an example that only involves positive terms?

2) Is there an example where differentiating the series term-by-term and evaluating at a specific point gives a convergent series that converges to the wrong value (i.e. the derivative of the sum fails to exist at that point, or exists but has a different value)?

Of course, I know these things can only happen on the boundary of the disk of convergence.

Thanks

Kevin 2008-11-23 13:39

The basic idea is that when you differentiate, you multiply the nth term of the power series by n. So you want series that converge, but slowly enough that they no longer do once the terms are multiplied by n.

For the log(1+x) example, the original power series is the sum of (-1)^(n+1)*x^n/n, and when you differentiate it you get the sum of (-1)^(n+1)*x^(n-1). So it goes from an alternating series whose terms shrink like 1/n to something whose terms just alternate without shrinking. If you want an example using only positive terms, figure out what has a power series whose value at x=1 corresponds to zeta(2) and becomes the harmonic series after you differentiate (or just consider the power series sum x^n/n^2 at x=1).
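A small sketch of that positive-term example (Python, purely illustrative):

[code]
# f(x) = sum_{n>=1} x^n / n^2 converges at x = 1 (to zeta(2) = pi^2/6),
# but its term-by-term derivative, sum_{n>=1} x^(n-1) / n, is the harmonic
# series at x = 1, which diverges.
import math

s_orig, s_deriv = 0.0, 0.0
for n in range(1, 100001):
    s_orig += 1.0 / n**2   # original series at x = 1
    s_deriv += 1.0 / n     # differentiated series at x = 1

print(s_orig, math.pi**2 / 6)  # ~1.64492 vs 1.64493...
print(s_deriv)                 # ~12.09, and it grows like log(N) without bound
[/code]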

I'm pretty sure that if the differentiated series converges, it converges to the right thing, but I don't feel like working out which theorem of analysis justifies it (EDIT: try Abel's theorem [url]http://en.wikipedia.org/wiki/Abel%27s_theorem[/url])
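For reference, the statement in the form that matters here (my paraphrase of the linked page, in LaTeX):

[code]
\[
\textbf{Abel's theorem: }\ \sum_{n\ge0} a_n \text{ converges}
\;\Longrightarrow\;
\lim_{x\to1^-}\,\sum_{n\ge0} a_n x^n \;=\; \sum_{n\ge0} a_n .
\]
[/code]

So if the term-by-term derivative converges at x = 1, its sum agrees with lim_{x->1-} f'(x), and (together with continuity of f at 1) that rules out converging to a "wrong" value.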

tmorrow 2008-11-24 10:22

Kevin showed the way; here's an example I made up.

f(x) = sum{1,inf} x^n/[n(n+1)] ...(1)...

Using log(1-x) = - sum{1,inf} x^n/n on |x| < 1 (radius of convergence r = 1) and the fact that 1/[n(n+1)] = 1/n - 1/(n+1), the above power series can be expressed in terms of familiar functions:

f(x) = (1/x)(1-x)log(1-x) + 1 on |x|<1, x!=0 (x=0 works in the limit x->0) ...(2)...
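For completeness, here is the telescoping worked out (standard manipulations, written in LaTeX; valid for 0 < |x| < 1):

[code]
\begin{align*}
f(x) &= \sum_{n\ge1} \frac{x^n}{n(n+1)}
      = \sum_{n\ge1} \frac{x^n}{n} - \sum_{n\ge1} \frac{x^n}{n+1} \\
     &= -\log(1-x) - \frac{1}{x}\sum_{n\ge1} \frac{x^{n+1}}{n+1} \\
     &= -\log(1-x) - \frac{1}{x}\bigl(-\log(1-x) - x\bigr) \\
     &= \frac{1-x}{x}\,\log(1-x) + 1 .
\end{align*}
[/code]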

Formal differentiation of the power series (1) gives:

f'(x) = sum{1,inf} x^(n-1)/(n+1) ...(3)...

which clearly diverges at x = 1 by comparison with the harmonic series.

Routine differentiation of our elementary form (2) gives (after a tidy-up):

f'(x) = -(1/x)[1 + (1/x)*log(1-x)] ...(4)...

From (4) it is clear that we are seeing the same difficulty: non-differentiability at x = 1, since log(1-x) -> -inf as x -> 1-. The equality of (3) and (4) inside the disk is easily established as well.
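A quick check (Python, purely illustrative; the helper names are mine) that (3) and (4) agree inside the disk and both blow up as x -> 1-:

[code]
import math

def f_prime_series(x, terms=2000):
    # formal term-by-term derivative (3): sum_{n>=1} x^(n-1) / (n+1)
    return sum(x ** (n - 1) / (n + 1) for n in range(1, terms + 1))

def f_prime_closed(x):
    # closed form (4): -(1/x) * [1 + (1/x) * log(1-x)]
    return -(1.0 / x) * (1.0 + (1.0 / x) * math.log(1.0 - x))

for x in (0.5, 0.9, 0.99):
    print(x, f_prime_series(x), f_prime_closed(x))  # agree to many digits
# As x -> 1- both expressions tend to +infinity.
[/code]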

Using the above approach, students can relate the problems of formal power-series differentiation to those of routine differentiation of elementary functions; it also illustrates that a power series can represent poorly behaved functions (the a_n may look well behaved on the surface, but looks are deceiving). I'm sure simpler examples can be found if you look hard enough.

