mersenneforum.org  

mersenneforum.org > Great Internet Mersenne Prime Search > Math

Old 2006-12-07, 22:37   #1
Damian
 
Damian's Avatar
 
May 2005
Argentina

2×3×31 Posts
Default Tensor Analysis books

Which book is the best one to start studying tensor analysis with? I have Levi-Civita's "The Absolute Differential Calculus", but it is too abstract for me. I would prefer one with more numerical examples and graphics, if possible.
Thanks in advance,
Damian.
Damian is offline   Reply With Quote
Old 2006-12-11, 19:23   #2
ewmayer
2ω=0
 
ewmayer's Avatar
 
Sep 2002
República de California

5×2,351 Posts
Default

We used this one in my graduate-level differential geometry class, but I suspect you might also find it somewhat "abstract" for your taste, even though Fomenko and Novikov are well-known relativity physicists and try to keep things "physically grounded" whenever possible.

If differential geometry were easy, general relativity would be a high school subject.
ewmayer is offline   Reply With Quote
Old 2006-12-12, 08:57   #3
xilman
Bamboozled!
 
xilman's Avatar
 
"๐’‰บ๐’ŒŒ๐’‡ท๐’†ท๐’€ญ"
May 2003
Down not across

11647₁₀ Posts
Default

Quote:
Originally Posted by ewmayer View Post
We used this one in my graduate-level differential geometry class, but I suspect you might also find it somewhat "abstract" for your taste, even though Fomenko and Novikov are well-known relativity physicists and try to keep things "physically grounded" whenever possible.

If differential geometry were easy, general relativity would be a high school subject.
I very much like Misner, Thorne & Wheeler's "Gravitation", but that text is very much more than a book on tensor calculus.

It does have a lot of pretty pictures and sound physical interpretations of objects which are often treated as very abstract mathematical constructs.


Paul
xilman is online now   Reply With Quote
Old 2006-12-12, 17:03   #4
mfgoode
Bronze Medalist
 
mfgoode's Avatar
 
Jan 2004
Mumbai,India

4004₈ Posts
Lightbulb Tensor Analysis


Damian, I'm not in the league of Ewmayer or Xilman, nor can I ever measure up to them, but I would recommend the Schaum's Outline "Theory and Problems of Vector Analysis and an Introduction to Tensor Analysis". There are about 60 pages at the end devoted to tensor analysis, which I think are sufficient preparation for the books recommended by our colleagues.

It has 480 solved problems, so you can get a better idea of the subject. It is by Murray R. Spiegel, PhD. It is widely available in U.S. libraries, and in New York, I can assure you, as I picked up my copy on sale there for a throwaway price. My copy is collecting dust on my shelves.

I do not profess to have gone through or even understood it, but I know a good book when I see one.

Mally

Last fiddled with by mfgoode on 2006-12-12 at 17:04
mfgoode is offline   Reply With Quote
Old 2006-12-14, 00:16   #5
Damian
 
Damian's Avatar
 
May 2005
Argentina

272₈ Posts
Default

Thanks for the replies. I'm downloading Misner, Thorne and Wheeler's Gravitation.
A newbie question: do the concepts of covariant and contravariant have anything to do with the transpose of a vector?
I ask because I see that a_i*b^i gives the dot product (which is the same as the product of one vector with the transpose of the other, i.e. a row vector times a column vector), and it is also the same result as the contraction of tensors (using the summation convention).
Another question: how can I use tex tags in these posts?
Thanks in advance
Damian.
Damian is offline   Reply With Quote
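[The equivalence Damian describes can be checked numerically. A minimal sketch in Python with numpy, assuming the standard Euclidean metric so that upper and lower index components coincide; the vectors are arbitrary illustrative values, not from the thread:]

```python
import numpy as np

# Two arbitrary 3-vectors. With the Euclidean metric there is no
# numerical difference between a_i and a^i, so the tensor contraction
# a_i b^i is just the ordinary dot product.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# einsum sums over the repeated index i, per the summation convention.
contraction = np.einsum('i,i->', a, b)
# Row vector times column vector: numpy's @ on two 1-D arrays.
dot = a @ b

assert np.isclose(contraction, dot)  # both give 1*4 + 2*5 + 3*6 = 32.0
```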
Old 2006-12-14, 09:42   #6
mfgoode
Bronze Medalist
 
mfgoode's Avatar
 
Jan 2004
Mumbai,India

2²·3³·19 Posts
Lightbulb High Brow.


I may be terribly wrong, Damian.
I think you are jumping the gun, but it all depends on your level.
A buddy of mine was studying this book on gravitation for his PhD thesis. As xilman says, it is more than just tensor calculus, and likewise for what ewmayer says about differential geometry.
The climb to tensors is long and tedious and requires a good foundation in modern geometry. This sounds simple, but I don't want to discourage you.
The covariant curvature tensor is of fundamental importance in Einstein's general theory of relativity. Contravariance has to do with curvilinear coordinate systems. The two are related, with the latter coming before the former.
All the best,
Mally
mfgoode is offline   Reply With Quote
Old 2006-12-14, 15:46   #7
Xyzzy
 
Xyzzy's Avatar
 
Aug 2002

2³×1,069 Posts
Default

Quote:
Another question: how can I use tex tags in these posts?
http://www.mersenneforum.org/showthread.php?t=4576
Xyzzy is offline   Reply With Quote
Old 2006-12-14, 16:08   #8
Damian
 
Damian's Avatar
 
May 2005
Argentina

2·3·31 Posts
Default

Thanks,
what I meant was: if I have two tensors A and B, then the tensor contraction
 A_i  B^i
equals the dot product of two vectors, which itself equals the product of the row vector A^t with the column vector B.
Is this coincidental, or is there a connection between covariance/contravariance and the transpose of matrices?
I guess the answer is that it is coincidental, because I can have a rank-3 tensor, and how would I define its transpose, since it is "similar" to a three-dimensional matrix?
Thanks,
Damian.
Damian is offline   Reply With Quote
Old 2006-12-14, 17:35   #9
xilman
Bamboozled!
 
xilman's Avatar
 
"๐’‰บ๐’ŒŒ๐’‡ท๐’†ท๐’€ญ"
May 2003
Down not across

10110101111111₂ Posts
Default

Quote:
Originally Posted by Damian View Post
Thanks,
what I meant was: if I have two tensors A and B, then the tensor contraction
 A_i  B^i
equals the dot product of two vectors, which itself equals the product of the row vector A^t with the column vector B.
Is this coincidental, or is there a connection between covariance/contravariance and the transpose of matrices?
I guess the answer is that it is coincidental, because I can have a rank-3 tensor, and how would I define its transpose, since it is "similar" to a three-dimensional matrix?
Thanks,
Damian.
In the special case of the Euclidean metric (the Lorentz metric in GR) and Cartesian coordinates, the process of raising indices is the same as transposition. This is because the metric tensor, g, has an especially simple form --- the Euclidean metric is just the identity matrix, and the Lorentz metric is the Euclidean metric with a single sign change in the x_0 component.

Paul
xilman is online now   Reply With Quote
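[Paul's point about the two metrics can be made concrete. A small Python/numpy sketch, with illustrative component values not taken from the thread: lowering an index means multiplying by the metric tensor, a_i = g_{ij} a^j, and for the Euclidean metric this leaves the components unchanged, which is why it looks like mere transposition:]

```python
import numpy as np

# Euclidean metric in 4 dimensions: the identity matrix.
g_euclid = np.eye(4)
# Lorentz (Minkowski) metric: Euclidean with a sign flip in the x_0 slot.
g_lorentz = np.diag([-1.0, 1.0, 1.0, 1.0])

# Contravariant components a^j (arbitrary example values).
a = np.array([2.0, 1.0, 0.0, 3.0])

# Lowering the index: a_i = g_{ij} a^j.
a_lower_euclid = g_euclid @ a    # components unchanged
a_lower_lorentz = g_lorentz @ a  # time component changes sign

assert np.array_equal(a_lower_euclid, a)
assert np.array_equal(a_lower_lorentz, np.array([-2.0, 1.0, 0.0, 3.0]))
```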
Old 2006-12-14, 19:57   #10
Damian
 
Damian's Avatar
 
May 2005
Argentina

2×3×31 Posts
Default

Quote:
Originally Posted by xilman View Post
In the special case of the Euclidean metric (the Lorentz metric in GR) and Cartesian coordinates, the process of raising indices is the same as transposition. This is because the metric tensor, g, has an especially simple form --- the Euclidean metric is just the identity matrix, and the Lorentz metric is the Euclidean metric with a single sign change in the x_0 component.

Paul
Ok, but take for example this tensor formula:
A_{ij}x^iy^j

The corresponding vector/matrix formula would be
 x^t A y

because it would be inconsistent to write
A x^t y^t

Why does that happen? (Why do I have to transpose only one vector and put it before the matrix?)
Damian is offline   Reply With Quote
Old 2006-12-14, 20:19   #11
ewmayer
2ω=0
 
ewmayer's Avatar
 
Sep 2002
Repรบblica de California

5·2,351 Posts
Default

Quote:
Originally Posted by Damian View Post
Ok, but take for example this tensor formula:
A_{ij}x^iy^j

The corresponding vector/matrix formula would be
 x^t A y

because it would be inconsistent to write
A x^t y^t

Why does that happen? (Why do I have to transpose only one vector and put it before the matrix?)
The difference is that matrix-vector multiplication has conventions about how to loop over the rows and columns. To get a scalar result from the vector-vector product of two length-n vectors x and y (one a row, one a column --- the product could give either an nxn result or a 1x1, i.e. a scalar, depending on the order of the operands), it is necessary to have the row vector on the left of the product and the column vector on the right. If one's convention is that vectors without transpose superscripts denote column vectors, that means x^t y gives a scalar. Similarly, for a 3-way product of x, y and an nxn matrix A to yield a scalar, one must order things as (row vector)*A*(column vector). In your example, x^t A y is the only way for a matrix-vector product of A, x^t (a row vector), and y (a column vector) to both be well-defined and yield a scalar result. Note that even though matrix multiplication does not commute in general, it *is* associative: you could first calculate (x^t A) and right-multiply the resulting row vector by y, or first calculate (A y) and then left-multiply the resulting column vector by x^t; in either case the result is the same scalar.

The tensor index notation replaces this row/column-based convention with a different one, based on implied summation over a repeated index. This leads to a less visually intuitive procedure than the above, but again it is unambiguous and (at least for vectors and matrices) completely equivalent to conventional matrix multiplication. In A_{ij}x^iy^j, the fact that you can do the index sum over either i (equivalent to x^t A) or j (== A y) first simply reflects the associativity of matrix multiplication.
ewmayer is offline   Reply With Quote
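[The equivalence ewmayer describes between the index-notation sum and the matrix product, including the two association orders, can be verified numerically. A short Python/numpy sketch with randomly generated illustrative data:]

```python
import numpy as np

# Arbitrary 3x3 matrix and 3-vectors for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# Index notation: A_{ij} x^i y^j, with implied summation over both
# repeated indices, yielding a scalar.
scalar_einsum = np.einsum('ij,i,j->', A, x, y)

# Matrix notation: (row vector) * A * (column vector).
scalar_matrix = x @ A @ y

# Associativity: sum over i first (x^t A, a row vector) or over j
# first (A y, a column vector); either way the same scalar results.
sum_i_first = (x @ A) @ y
sum_j_first = x @ (A @ y)

assert np.isclose(scalar_einsum, scalar_matrix)
assert np.isclose(sum_i_first, sum_j_first)
```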