mersenneforum.org > Math > Derivative of Sum not Equal to Sum of Derivatives

2008-11-22, 07:57 #1 jinydu     Dec 2003 Hopefully Near M48 2·3·293 Posts

Derivative of Sum not Equal to Sum of Derivatives

One of my students complained this week after I took off points for assuming, with no further justification, that a power series can be differentiated term by term. He asked me for an example of why this isn't OK.

I know of one example. The Taylor series for log(1+x) about x = 0 converges for |x| <= 1 except at x = -1. The function log(1+x) is certainly differentiable at x = 1, with derivative 1/(1+1) = 1/2. However, differentiating the series term by term and evaluating at x = 1 gives a series that does not converge.

I'm looking for further examples. More specifically:

1) Is there an example that involves only positive terms?
2) Is there an example where differentiating the series term by term and evaluating at a specific point gives a convergent series that converges to the wrong value (i.e. the derivative of the original function fails to exist at the point, or has a different value)?

Of course, I know these things can only happen on the boundary of the disk of convergence.

Thanks
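A quick numerical sketch of the log(1+x) example (partial sums only, purely illustrative):

```python
import math

# log(1+x) = sum_{n>=1} (-1)^(n-1) * x^n / n, valid for -1 < x <= 1.
# At x = 1 the series converges (alternating harmonic series -> log 2),
# but the term-by-term derivative, sum_{n>=1} (-1)^(n-1) * x^(n-1),
# becomes 1 - 1 + 1 - 1 + ... at x = 1: its partial sums never settle,
# even though log(1+x) itself has derivative 1/2 there.

def series_partial(x, N):
    """Partial sum of the Taylor series for log(1+x)."""
    return sum((-1) ** (n - 1) * x ** n / n for n in range(1, N + 1))

def diff_series_partial(x, N):
    """Partial sum of the term-by-term differentiated series."""
    return sum((-1) ** (n - 1) * x ** (n - 1) for n in range(1, N + 1))

print(series_partial(1.0, 100000), math.log(2))       # both near 0.693
print([diff_series_partial(1.0, N) for N in range(1, 7)])  # oscillates 1, 0, 1, 0, ...
print(1 / (1 + 1))                                    # actual derivative at x = 1
```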
2008-11-23, 13:39 #2 Kevin     Aug 2002 Ann Arbor, MI 433 Posts

The basic idea is that when you differentiate, you're multiplying the terms of the power series by n. So you want things that converge, but slowly enough that they won't once the terms are multiplied by n.

For the log(1+x) example, the original power series is the sum of (-1)^(n+1)*x^n/n, and when you differentiate it you get the sum of (-1)^(n+1)*x^(n-1). So it goes from being an alternating series whose terms go to 0 like 1/n to something that just alternates.

If you want an example using only positive terms, figure out what has a power series whose value at x = 1 is zeta(2) and which becomes the harmonic series after you differentiate (or just consider the power series sum x^n/n^2 at x = 1).

I'm pretty sure that if both series converge, the differentiated series converges to the right thing, but I don't feel like thinking about which theorem of analysis would justify it (EDIT: try Abel's theorem, http://en.wikipedia.org/wiki/Abel%27s_theorem).

Last fiddled with by Kevin on 2008-11-23 at 13:42
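A quick sketch of the positive-terms example suggested here (partial sums only, illustrative):

```python
import math

# f(x) = sum_{n>=1} x^n / n^2: at x = 1 this is zeta(2) = pi^2/6.
# The term-by-term derivative is sum_{n>=1} x^(n-1) / n, which at
# x = 1 is the harmonic series -- all terms positive, yet divergent.

def f_partial(N):
    """Partial sum of sum x^n/n^2 evaluated at x = 1."""
    return sum(1.0 / n ** 2 for n in range(1, N + 1))

def harmonic(N):
    """Partial sum of the differentiated series at x = 1."""
    return sum(1.0 / n for n in range(1, N + 1))

print(f_partial(100000), math.pi ** 2 / 6)   # partial sums settle near 1.6449
for N in (10, 100, 1000, 10000):
    print(N, harmonic(N))                    # grows like log(N), no limit
```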
2008-11-24, 10:22 #3 tmorrow     Jan 2004 2×3×17 Posts

Kevin showed the way; here's an example I made up.

f(x) = sum{1,inf} x^n/[n(n+1)] ...(1)

Using log(1-x) = -sum{1,inf} x^n/n on |x| < 1, together with the partial-fraction split 1/[n(n+1)] = 1/n - 1/(n+1), we obtain the elementary form:

f(x) = 1 + [(1-x)/x]*log(1-x), 0 < x < 1 ...(2)

Formal differentiation of the power series (1) gives:

f'(x) = sum{1,inf} x^(n-1)/(n+1) ...(3)

which clearly diverges at x = 1 by comparison with the harmonic series. Routine differentiation of our elementary form (2) gives (after a tidy-up):

f'(x) = -(1/x)*[1 + (1/x)*log(1-x)] ...(4)

From (4) it is clear that we are seeing the same difficulty: f'(x) blows up as x -> 1-, so f is not differentiable at x = 1. The equality between (3) and (4) on 0 < x < 1 can easily be established as well.

Using this approach, students can relate the problem of formally differentiating a power series to that of routinely differentiating the elementary function it represents, and it illustrates that a power series can represent a poorly behaved function (the coefficients a_n may look well behaved on the surface, but looks are deceiving). I'm sure simpler examples can be found if you look hard enough.
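A numerical check of this example (the closed form and the blow-up of the derivative; partial sums only, illustrative):

```python
import math

# tmorrow's example: f(x) = sum_{n>=1} x^n / (n(n+1)).
# Partial fractions 1/(n(n+1)) = 1/n - 1/(n+1) plus log(1-x) = -sum x^n/n
# give the closed form f(x) = 1 + ((1-x)/x) * log(1-x) for 0 < x < 1.
# Its derivative f'(x) = -(1/x) * (1 + (1/x) * log(1-x)) grows without
# bound as x -> 1-, matching the divergence of the differentiated series.

def f_series(x, N=100000):
    """Partial sum of the power series (1)."""
    return sum(x ** n / (n * (n + 1)) for n in range(1, N + 1))

def f_closed(x):
    """Elementary form (2), valid for 0 < x < 1."""
    return 1 + ((1 - x) / x) * math.log(1 - x)

def fprime_closed(x):
    """Derivative (4) of the elementary form."""
    return -(1 / x) * (1 + (1 / x) * math.log(1 - x))

for x in (0.5, 0.9, 0.99):
    print(x, f_series(x), f_closed(x))       # the two forms agree
for x in (0.9, 0.99, 0.999):
    print(x, fprime_closed(x))               # grows without bound near x = 1
```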

