rgrao

A Deep Learning course I took included an introduction to linear algebra that nicely explained the concept of eigenvectors/eigenvalues of a matrix A. Normally, applying a matrix A to a vector both changes its direction and scales it. But every square matrix A has special vectors that do not change direction when A is applied; these are the eigenvectors, satisfying Ax = kx, where k is a scalar (the eigenvalue). They are useful for analyzing several properties of matrices. https://drive.google.com/file/d/10AGgYq-Jq7zWv0gPPnISYC661LcjuL-u/view?usp=sharing
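A minimal sketch of that idea, assuming NumPy is available (the matrix A here is just an illustrative example, not one from the slides): compute the eigenpairs and check that applying A to each eigenvector only rescales it.

```python
import numpy as np

# Example symmetric matrix; its eigenvectors are the directions that
# A merely scales (by the corresponding eigenvalue) instead of rotating.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]   # i-th eigenvector (column of the returned matrix)
    k = eigenvalues[i]       # matching eigenvalue
    # Verify Ax = kx: A scales v by k without changing its direction.
    assert np.allclose(A @ v, k * v)
    print(f"eigenvalue {k:.3f}, eigenvector {v}")
```

For this A the eigenvalues come out as 3 and 1, with eigenvectors along (1, 1) and (1, -1); any other vector gets both rotated and stretched by A.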