mersenneforum.org rotating a shear

 2019-10-13, 11:16 #1 wildrabbitt   Jul 2014 1C2₁₆ Posts rotating a shear

Hi, my nephew recently asked me a question about a 2 by 2 matrix with real entries. The question was: what is the shear factor of the matrix $$\begin{bmatrix} -1 & 1 \\ -4 & 3 \end{bmatrix}$$? We worked out that it was 5 by taking the coordinates of a point transformed by the matrix and dividing the distance the point moves due to the shear by the perpendicular distance of the point from the invariant line of the shear, which was $$y=2x$$. My nephew thought there must be an easier way, which got me thinking.

AFAIK, an $$\textbf{x-shear}$$ is a shear with matrix $$\begin{bmatrix} 1 & a \\ 0 & 1 \end{bmatrix}$$, for which a is the shear factor. The shear in the question is like an x-shear except that its invariant line is not the x-axis but the line $$y=2x$$. So I thought the shear transformation could be composed with a rotation which effectively renders it a shear about the x-axis, and then the shear factor would be the entry in row 1, column 2.

So I worked out the rotation matrix $$R_{-\theta}$$, using the triangle with vertices $$(0,0), (1,0) \text{ and } (1,2)$$ to get the angle $$\theta$$ between the x-axis and the invariant line. The matrix I got was $$\begin{bmatrix} \frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}} \\ \frac{-2}{\sqrt{5}} & \frac{1}{\sqrt{5}} \end{bmatrix}$$

The trouble is, if I multiply the shear matrix on the left by the rotation matrix I get $$\begin{bmatrix} \frac{-9}{\sqrt{5}} & \frac{7}{\sqrt{5}} \\ \frac{-2}{\sqrt{5}} & \frac{1}{\sqrt{5}} \end{bmatrix}$$ which isn't what I expected. Can someone explain how to do the matrix multiplication to get an x-shear with shear factor 5?
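The point-based method described above can be checked numerically. A minimal Python sketch (the sample point (1, 0) and the helper names are my own choices, not from the post): transform a point, split its displacement into components along and perpendicular to y = 2x, and divide.

```python
import math

# Matrix from the question; the invariant line of the shear is y = 2x.
A = [[-1, 1], [-4, 3]]

def apply(M, p):
    """Apply a 2x2 matrix M to a point p = (x, y)."""
    return (M[0][0]*p[0] + M[0][1]*p[1],
            M[1][0]*p[0] + M[1][1]*p[1])

p = (1.0, 0.0)                         # any point not on y = 2x
q = apply(A, p)                        # image of p under the shear
d = (q[0] - p[0], q[1] - p[1])         # displacement, parallel to y = 2x

u = (1/math.sqrt(5), 2/math.sqrt(5))   # unit vector along y = 2x
slide = d[0]*u[0] + d[1]*u[1]          # signed distance moved along the line
perp = (2*p[0] - p[1]) / math.sqrt(5)  # signed distance of p from the line

print(abs(slide / perp))               # shear factor (5, up to float error)
```

The sign of slide/perp depends on the orientation conventions chosen for the line and its normal, which is why the absolute value is taken here.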
 2019-10-13, 13:18 #2 Nick     Dec 2012 The Netherlands 726₁₆ Posts Let t be the (smaller) angle between the line y=2x and the x-axis. You want to rotate clockwise through the angle t, then perform the shear with the x-axis as its invariant line, then rotate anticlockwise through the angle t, so that the combined effect is to leave the line y=2x invariant.
 2019-10-13, 13:37 #3 wildrabbitt   Jul 2014 702₈ Posts I see, thanks. So if I didn't already know the shear factor, I'd need to find the angle t and then solve $$R_{t}\begin{bmatrix} 1 & a \\ 0 & 1\end{bmatrix}R_{-t}=A$$ to get the value of a. Last fiddled with by wildrabbitt on 2019-10-13 at 13:45
2019-10-13, 13:42   #4
Nick

Dec 2012
The Netherlands

2×3×5×61 Posts

Quote:
 Originally Posted by wildrabbitt Thanks. So if I didn't know previously what the shear factor is I need to find the angle t and then solve $$\textbf{R}_{t}\begin{bmatrix} 1 & a \\ 0 & 1\end{bmatrix}\textbf{R}_{-t}=\textbf{A}$$ to get the value of a.
Yes, and that is equivalent to
$\left(\begin{array}{cc} 1 & a \\ 0 & 1 \end{array}\right) =R_{-t}AR_t$
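Nick's rearranged identity can be checked numerically. A small Python sketch (plain lists, no library assumed; the thread itself uses no Python, so the helper names here are mine):

```python
import math

def matmul(X, Y):
    """Product of two 2x2 matrices given as lists of rows."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[-1, 1], [-4, 3]]
c, s = 1/math.sqrt(5), 2/math.sqrt(5)  # cos t and sin t for the angle t with tan t = 2
Rt = [[c, -s], [s, c]]                 # anticlockwise rotation through t
Rmt = [[c, s], [-s, c]]                # clockwise rotation through t

S = matmul(Rmt, matmul(A, Rt))         # R_{-t} A R_t, an x-shear
a = S[0][1]                            # shear factor: the row 1, column 2 entry
print(a)                               # 5, up to floating-point error
```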

 2019-10-13, 13:46 #5 wildrabbitt   Jul 2014 2×3²×5² Posts Brilliant. Thanks very much, Nick.
 2019-10-14, 16:26 #6 Dr Sardonicus     Feb 2017 Nowhere 13×487 Posts

I tried the following: I obtained the eigenvectors of the matrix, which I called A. There was (up to scalar multiples) only one. Luckily, the eigenvalue is 1.

I then used the obvious orthogonal vector of the same magnitude to create a matrix for a similarity transformation which expresses A in a coordinate system with the eigenvector pointing along one of the coordinate axes.

Because the vectors in the transformation matrix are (by construction) orthogonal and of the same magnitude, the matrix is, up to a nonzero scalar factor, an orthogonal matrix (a matrix that defines a rotation and/or a reflection). Since the similarity transformation involves multiplying by the matrix and its inverse, the scalar factor cancels out, so you can avoid expressions with messy square roots.

Since I can never remember which order to multiply in -- inverse first? Matrix first? -- the first time I tried I did it the wrong way, and got a messy, obviously wrong answer. That told me I'd gotten the order of multiplication wrong, so I did it the other way.

Code:
? A=[-1,1;-4,3]
%1 =
[-1 1]
[-4 3]
? v=mateigen(A)
%2 =
[1/2]
[1]
? B=concat(v,[-1,1/2]~)
%3 =
[1/2 -1]
[1 1/2]
? B*A*B^(-1)
%4 =
[17/5 9/5]
[-16/5 -7/5]
? B^(-1)*A*B
%5 =
[1 5]
[0 1]
Last fiddled with by Dr Sardonicus on 2019-10-14 at 16:27 Reason: clarification
2019-10-14, 16:43   #7
wildrabbitt

Jul 2014

1C2₁₆ Posts

Thanks for all that.

Quote:
 I then used the obvious orthogonal vector of the same magnitude to create a matrix for a similarity transformation which expresses A in a coordinate system with the eigenvector pointing along one of the coordinate axes.

Can you tell me how you did that, giving the actual matrix?

2019-10-15, 01:44   #8
LaurV
Romulan Interpreter

"name field"
Jun 2011
Thailand

5×11²×17 Posts

Quote:
 Originally Posted by wildrabbitt Can you tell me how you did that, giving the actual matrix?
This series about linear algebra may help. What you ask for is in the fourth video, called matrix multiplication, and also in the thirteenth video, called change of basis, but to understand it better you have to watch the whole series.

Last fiddled with by LaurV on 2019-10-15 at 03:49

 2019-10-15, 10:03 #9 wildrabbitt   Jul 2014 2·3²·5² Posts Thanks.
2019-10-15, 13:11   #10
Dr Sardonicus

Feb 2017
Nowhere

13·487 Posts

Quote:
 Originally Posted by wildrabbitt Thanks for all that. Can you tell me how you did that, giving the actual matrix?
First, I mention the fact that not every 2x2 matrix can be transformed into a shear matrix by a rotation, or even by a similarity transformation. Among the things about a square matrix that remain invariant under a similarity transformation are its characteristic polynomial, and therefore all the coefficients of this polynomial (which include, up to sign, the trace and the determinant of the matrix). Now a shear matrix

M = [1,a;0,1]

has characteristic polynomial x^2 - 2*x + 1 = (x-1)^2. The trace is 2 and the determinant is 1. Therefore any matrix that is similar to a shear matrix has characteristic polynomial (x-1)^2, trace 2, and determinant 1.
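These invariants are quick to verify for the A in question; a small Python check (mine, not part of the original post):

```python
A = [[-1, 1], [-4, 3]]

trace = A[0][0] + A[1][1]                  # -1 + 3 = 2
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]    # (-1)(3) - (1)(-4) = 1

# Characteristic polynomial: x^2 - trace*x + det = x^2 - 2x + 1 = (x-1)^2,
# so A passes the necessary condition for being similar to a shear matrix.
print(trace, det)   # 2 1
```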

Next, a rotation is accomplished by a similarity transformation using an orthogonal matrix with determinant +1. An orthogonal matrix is a matrix whose inverse is equal to its transpose. It follows from this definition that a matrix is orthogonal when its rows (or its columns) are mutually orthogonal vectors of magnitude 1.

Luckily, a similarity transformation

$A \;\rightarrow \; B^{-1}AB$

is unaffected if B is multiplied by a nonzero scalar factor. So as long as the columns (or rows) of B are mutually orthogonal and all have the same magnitude, the similarity transformation gives the same result as if B were orthogonal.
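This cancellation is easy to demonstrate: conjugating by B and by the scalar multiple 3B gives identical results. A sketch in exact rational arithmetic (the B used here is a scalar multiple of the matrix from the Pari-GP session above; the helpers are my own):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix, exact over the rationals."""
    d = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    return [[M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d, M[0][0]/d]]

A = [[F(-1), F(1)], [F(-4), F(3)]]
B = [[F(1), F(-2)], [F(2), F(1)]]        # columns orthogonal, equal magnitude
B3 = [[3*x for x in row] for row in B]   # B multiplied by the scalar 3

conj = matmul(inv2(B), matmul(A, B))
conj3 = matmul(inv2(B3), matmul(A, B3))
print(conj == conj3)                     # True: the scalar factor cancels out
```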

Now, where in the blue tunket do eigenvectors come from? Well, if A is an nxn matrix, the (nonzero) nx1 vector v is an eigenvector of A if A*v = k*v for some scalar k. The scalar k is called an eigenvalue of A. If f(x) is the characteristic polynomial of A, then f(k) = 0 [and, if f(k) = 0, then k is an eigenvalue of A.]

So, suppose v is an eigenvector of A, and A*v = k*v for a scalar k. Suppose B is an invertible nxn matrix having v as its first column. We compute the first column of B^(-1)*A*B as follows (remember, matrix multiplication is associative). The first column of A*B is A*v, which is kv.

So the first column of B^(-1)*A*B is B^(-1)*kv = kB^(-1)*v (because k is a scalar factor). Now v is the first column of B, so B^(-1)*v = [1;0;0;...;0], the column with 1 in the first row and 0 in all the other rows.

So, the first column of B^(-1)*A*B is [k;0;0...;0], the column with the eigenvalue k in the first row, and 0's everywhere else.

(If you had another eigenvalue k2, with eigenvector w, linearly independent from v, and used it in the second column of B, then the same rigamarole shows that

the second column of B^(-1)*A*B would then be [0;k2;0;0;...;0], and so on.)

So, if you have real eigenvalues, and use their eigenvectors as the initial columns of an invertible matrix B, you can at least partially "diagonalize" the matrix A by a similarity transformation.
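That argument can be watched in action: put the eigenvector in the first column of B and any independent vector in the second (the choice [0, 1] below is an arbitrary one of mine), and the first column of B^(-1)*A*B comes out as [k; 0]:

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix, exact over the rationals."""
    d = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    return [[M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d, M[0][0]/d]]

A = [[F(-1), F(1)], [F(-4), F(3)]]
v = [F(1), F(2)]                       # eigenvector: A*v = v, so k = 1

# Any second column independent of v works; [0, 1] is an arbitrary choice.
B = [[v[0], F(0)], [v[1], F(1)]]
C = matmul(inv2(B), matmul(A, B))
print([int(C[0][0]), int(C[1][0])])    # first column of B^(-1)*A*B is [1, 0]
```

Note that this B is not (a scalar multiple of) an orthogonal matrix, so the conjugation is not a rotation; the first column still comes out as [k; 0], but the rest of the matrix differs from the rotated form.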

Hey -- this is great! By going through this carefully, which I haven't done in at least 40 years, I think I'll be able to remember it again!

Now, in the case at hand, A = [-1,1;-4,3] is 2x2, and its characteristic polynomial is (x-1)^2, so there is a single real eigenvalue 1, which is repeated. There is, up to scalar multiples, a single eigenvector. The one Pari-GP spit out was [1/2;1]. The scalar multiple [1;2] would work just as well. You can check that A*v = v.

Now, for this A, we're out of eigenvectors. We can use any vector we like (other than a scalar multiple of v) for the second column of B, and we will be guaranteed that the first column of B^(-1)*A*B will be [1;0].

In order for the transformation to act like an orthogonal transformation, we need a vector that is orthogonal to v, and has the same magnitude.

No problem! If v = [a;b] is a nonzero 2x1 vector, the vector w = [-b;a] fills the bill. Even better, the matrix B = [a,-b;b,a] has positive determinant, so the "normalized" orthogonal matrix has determinant +1, so is a rotation matrix.

Here, then, with A = [-1,1;-4,3] the matrix B = [1,-2;2,1] gives a similarity transformation identical to a rotation, and transforms A into a matrix with first column [1;0] and (since the determinant remains 1) 2,2 entry equal to 1.
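Putting the pieces together, here is a Python port of the Pari-GP session (my own translation, using the scalar multiple [1; 2] of the eigenvector so everything stays integral):

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Product of two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix, exact over the rationals."""
    d = M[0][0]*M[1][1] - M[0][1]*M[1][0]
    return [[M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d, M[0][0]/d]]

A = [[F(-1), F(1)], [F(-4), F(3)]]
B = [[F(1), F(-2)], [F(2), F(1)]]   # columns: eigenvector [1;2] and orthogonal [-2;1]

S = matmul(inv2(B), matmul(A, B))   # B^(-1)*A*B
print([[int(x) for x in row] for row in S])   # [[1, 5], [0, 1]]
```

The result is the x-shear with shear factor 5, obtained without any square roots.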

 2019-10-15, 19:11 #11 wildrabbitt   Jul 2014 2·3²·5² Posts Thanks very much. I understand now. Am I right in thinking that this method would work for any shear? Last fiddled with by wildrabbitt on 2019-10-15 at 19:11 Reason: forgot the question mark
