To elaborate on why linear maps are computationally easy: any linear map can be applied with a single matrix multiplication, and on modern computers matrix multiplication is done very efficiently, both through hardware design and through better algorithms.
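For example, here's a minimal NumPy sketch (just illustrative, with a made-up matrix): applying a linear map to a vector is a single matrix-vector product.

```python
import numpy as np

# A linear map f(x) = Ax is fully described by its matrix A.
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])   # example map: scale x by 2, y by 1/2

x = np.array([1.0, 4.0])
y = A @ x                    # applying the map is one matrix-vector product
print(y)                     # [2. 2.]
```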
keenan
Right. Does anybody know how much it costs to multiply two matrices?
Apart from matrix multiplication, another key algorithm is solving a linear system $Ax=b$. Any idea how much it costs to solve a linear system? (There are lots of different answers here!)
graphicnerd
Multiplying two $n \times n$ matrices using the regular definition of matrix multiplication requires $O(n^3)$ running time.
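As a sketch, the "regular definition" is just three nested loops over $n$ (written here in Python for clarity, not speed), which is where the $O(n^3)$ comes from:

```python
import numpy as np

def matmul_naive(A, B):
    """Textbook matrix multiplication of two square matrices: O(n^3) scalar operations."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):          # row of the result
        for j in range(n):      # column of the result
            for k in range(n):  # inner product over the shared dimension
                C[i, j] += A[i, k] * B[k, j]
    return C
```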
jellybean
While $O(n^3)$ is the complexity of the standard approach, there are algorithms with slightly better complexities, such as the Strassen algorithm, which is roughly $O(n^{2.81})$. For very large matrices, even a seemingly small improvement in complexity can make it much faster. https://en.wikipedia.org/wiki/Matrix_multiplication_algorithm has some good explanations of the various algorithmic approaches.
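A rough sketch of the Strassen recursion (assuming for simplicity that $n$ is a power of two, and with none of the tuning a real implementation would need): each level of recursion does 7 block multiplications instead of 8, which is where $O(n^{\log_2 7}) \approx O(n^{2.81})$ comes from.

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Strassen multiplication sketch; assumes square matrices with n a power of two."""
    n = A.shape[0]
    if n <= cutoff:                      # fall back to the standard algorithm on small blocks
        return A @ B

    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    # Seven block products instead of eight -- the source of the n^(log2 7) bound.
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)

    C = np.empty((n, n))
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

# e.g. strassen(np.random.rand(256, 256), np.random.rand(256, 256))
```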
just_render
Yes, multiplying matrices takes polynomial time.
So to solve a linear system we can write
$x = A^{-1}b$.
Computing the inverse of $A$ costs $O(n^3)$ (and multiplying $A^{-1}$ by $b$ costs $O(n^2)$), so the total cost is $O(n^3)$.
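As a side note, in practice one usually avoids forming $A^{-1}$ explicitly and instead uses a factorization-based solve (e.g. LU), which is also $O(n^3)$ but cheaper and more numerically stable. A quick NumPy sketch comparing the two (random data, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

x_inv   = np.linalg.inv(A) @ b   # explicit inverse: O(n^3), but wasteful and less stable
x_solve = np.linalg.solve(A, b)  # LU-based solve: also O(n^3), the preferred route

print(np.allclose(x_inv, x_solve))  # both give (essentially) the same answer
```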
keenan
@jellybean Right—for general matrix multiplication, you can do a fair bit better than cubic, though many of these algorithms start to become practical only for (very) large matrices. The lowest upper bound I know of (due to Virginia Williams) is $O(n^{2.373})$, though there may be more recent results.
keenan
...More importantly, many of the large matrices we work with in computer graphics are sparse, meaning that only around $O(n)$ of the entries are nonzero. For matrices like these, there are much more efficient algorithms for multiplication and for solving systems like $Ax = b$. For standard two-dimensional problems (like the Poisson equation, for instance), one can basically get the job done in linear ($O(n)$) time. Lots of results on this kind of stuff from Dan Spielman and colleagues.
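To give a flavor of the sparse case, here's a small SciPy sketch (the grid size and solver choice are just illustrative, not the near-linear-time solvers mentioned above): a 5-point Poisson matrix on a $k \times k$ grid has only about $5n$ nonzeros for $n = k^2$ unknowns, and an iterative solver like conjugate gradient only ever touches those nonzeros.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 5-point Laplacian on a k x k grid: n = k^2 unknowns, roughly 5n nonzeros.
k = 100
I = sp.identity(k)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(k, k))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsr()   # 2D Poisson matrix, n x n with n = k^2

b = np.ones(A.shape[0])
x, info = spla.cg(A, b)                       # each CG iteration costs O(nnz) = O(n) work
print(info, np.linalg.norm(A @ x - b))        # info == 0 means the solver converged
```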