Another reason why matrices with matching columns and rows can be multiplied could be as follows:
If we think of matrices as transforms in space, we need to read them from right to left. So the first matrix in the figure above would transform a 1-dimensional vector to an n-dimensional one,
and the corresponding matrix on the left would transform the n-dimensional vector back to 1 dimension.
So these transformations only make sense if the rows of the 2nd matrix match the columns of the 1st one.
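As a quick dimension check (a generic example, not necessarily the exact shapes in the figure): if $B$ maps $\mathbb{R}^p \to \mathbb{R}^n$ and $A$ maps $\mathbb{R}^n \to \mathbb{R}^m$, the composition $AB$ only makes sense when the inner dimensions agree:
$$
\underbrace{A}_{m \times n}\,\underbrace{B}_{n \times p}:\ \mathbb{R}^p \xrightarrow{\ B\ } \mathbb{R}^n \xrightarrow{\ A\ } \mathbb{R}^m .
$$
The case described above would be $m = p = 1$, where the product collapses to a $1 \times 1$ scalar.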
Misaka-10032
$A$ is symmetric because of the symmetry of the inner product.
$$
\langle u, v \rangle = \langle v, u \rangle \\
u^TAv = v^TAu
$$
As both sides of the equation are scalars, we can transpose the right-hand side.
$$
u^TAv = u^TA^Tv
$$
This should hold for all $u$ and $v$, so $A = A^T$.
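(One way to make that last step concrete: plugging in standard basis vectors $u = e_i$, $v = e_j$ gives
$$
e_i^T A e_j = e_i^T A^T e_j \quad\Longrightarrow\quad A_{ij} = A_{ji},
$$
so every entry of $A$ equals its mirror entry, i.e. $A = A^T$.)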
keenan
@Misaka-10032. Bingo. In general, it's useful to double check that matrices capture properties of the objects they represent. A very good way to debug is to do "sanity checks" that verify these properties. For instance, if you're building a matrix and you know it's supposed to represent an inner product, a simple but very useful debug check is to print out the difference $A - A^T$.
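A minimal sketch of that debug check in NumPy (the function and variable names here are just placeholders, not part of any course code):
```python
import numpy as np

def check_symmetry(A, tol=1e-8):
    # Sanity check: a matrix representing an inner product should satisfy A == A^T.
    max_err = np.abs(A - A.T).max()
    print("max |A - A^T| entry:", max_err)
    return max_err < tol

# Example: a Gram matrix X X^T is symmetric by construction, so the check passes.
X = np.random.rand(5, 3)
assert check_symmetry(X @ X.T)
```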