The basic idea is that when you differentiate, you multiply the nth coefficient of the power series by n. So you want a series that converges, but slowly enough that it won't once the terms get multiplied by n.
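Concretely, termwise differentiation looks like this (standard fact, stated here for reference):

```latex
\frac{d}{dx} \sum_{n=1}^{\infty} a_n x^n \;=\; \sum_{n=1}^{\infty} n\, a_n x^{n-1}
```

so whatever decay the coefficients a_n had, the differentiated series has coefficients n*a_n, which decay one power of n more slowly.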

For the log(1+x) example, the original power series is the sum of (-1)^(n+1)*x^n/n, and when you differentiate it you get the sum of (-1)^(n+1)*x^(n-1). So at x=1 it goes from an alternating series whose terms go to 0 like 1/n to one whose terms just alternate between 1 and -1 (which diverges). If you want an example using only positive terms, figure out what has a power series whose value at x=1 corresponds to zeta(2) and becomes the harmonic series after you differentiate (or just directly consider the power series sum x^n/n^2 at x=1).
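A quick numerical sketch of the positive-term example (my own illustration, not from the original): the partial sums of sum x^n/n^2 at x=1 settle toward pi^2/6, while the partial sums of its termwise derivative at x=1, i.e. the harmonic series sum 1/n, keep growing like log(N).

```python
import math

def partial_sums(term, N):
    """Return the list of partial sums term(1) + ... + term(n) for n = 1..N."""
    total = 0.0
    sums = []
    for n in range(1, N + 1):
        total += term(n)
        sums.append(total)
    return sums

N = 10_000
# Original series at x = 1: sum 1/n^2, converges to pi^2/6 = zeta(2).
original = partial_sums(lambda n: 1.0 / n**2, N)
# Termwise derivative at x = 1: sum 1/n, the harmonic series, diverges.
derivative = partial_sums(lambda n: 1.0 / n, N)

print(original[-1], math.pi**2 / 6)  # partial sum is within ~1/N of pi^2/6
print(derivative[-1])                # grows like log(N) + gamma, no limit
```

By N = 10,000 the first partial sum agrees with pi^2/6 to about four decimal places, while the harmonic partial sum is near 9.8 and still climbing.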

I'm pretty sure that if the differentiated series converges at the endpoint, it converges to the right value, but I don't feel like working out which theorem of analysis justifies it (EDIT: Abel's theorem does — http://en.wikipedia.org/wiki/Abel%27s_theorem).
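For reference, the standard statement of Abel's theorem (sketched from memory, so check the link for the precise hypotheses):

```latex
\text{If } \sum_{n=0}^{\infty} a_n \text{ converges, then }
\lim_{x \to 1^-} \sum_{n=0}^{\infty} a_n x^n = \sum_{n=0}^{\infty} a_n .
```

Applied here: the differentiated series defines a function inside the disk, and if that series happens to converge at x=1, Abel's theorem says its value there matches the limit of the function from inside.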