Question: Is rotation about X a linear transform in 2D? (Hint: what's a quick justification?)

lucida

This isn't a quick justification, but it helped me better understand the linear algebra behind the matrix used for rotation in 2D:

[ cos(theta)  -sin(theta) ]
[ sin(theta)   cos(theta) ]

The matrix contains these values because its columns are the actual x, y coordinates taken by the orthonormal basis vectors (1,0) and (0,1), respectively, after a counterclockwise rotation by theta.
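As a quick sanity check (a minimal NumPy sketch; the helper name `make_rotation` is my own), we can confirm that each column of the matrix is exactly where the corresponding basis vector lands:

```python
import numpy as np

def make_rotation(theta):
    """2D counterclockwise rotation matrix for angle theta (radians)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

theta = np.pi / 3  # 60 degrees, chosen arbitrarily
R = make_rotation(theta)

# Column 0 is the image of e1 = (1, 0); column 1 is the image of e2 = (0, 1).
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
assert np.allclose(R @ e1, R[:, 0])
assert np.allclose(R @ e2, R[:, 1])
```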

Since any vector in 2D can be represented as a linear combination of these orthonormal basis vectors, whenever we are asked to rotate a vector by theta, we can imagine that the vector is the hypotenuse of a right triangle whose legs are scaled versions of those basis vectors.

Rotating this vector is thus the same as rotating the entire triangle, and when we rotate the triangle, we also rotate its two legs by the same angle.

This means we can rotate any vector in 2D by decomposing it into a linear combination of the orthonormal basis vectors, rotating each basis vector individually, and adding up the scaled results.

Mathematically, this is done by taking the columns of the matrix above, scaling the left column by the x component of the vector we want to rotate and the right column by its y component, and then adding the two resulting vectors.

That is exactly the same as multiplying the original vector by the matrix: the components of the vector supply the weights for a weighted combination of the matrix's two columns.
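That decompose-rotate-recombine argument can be checked numerically (a sketch using NumPy; the angle and vector are arbitrary choices of mine):

```python
import numpy as np

theta = 0.7  # arbitrary angle in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([3.0, -2.0])  # any 2D vector: v = x*e1 + y*e2
x, y = v

# Rotate each basis vector, scale by the vector's components, and add.
rotated_e1 = R @ np.array([1.0, 0.0])  # this is just R's left column
rotated_e2 = R @ np.array([0.0, 1.0])  # this is just R's right column
piecewise = x * rotated_e1 + y * rotated_e2

# Same result as the plain matrix-vector product.
assert np.allclose(piecewise, R @ v)
```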
