jefftan

What is the time complexity of Gram-Schmidt? It seems like it should be at least O(k^2) or O(k^3), since you have to repeatedly subtract components of the first few vectors from all the remaining vectors, and each of these subtractions is a vector computation. Are there any more efficient algorithms for finding an orthonormal basis in a high-dimensional space?
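
For reference, here is a minimal classical Gram-Schmidt sketch in NumPy (my own illustration, not code from the lecture; `gram_schmidt` is just a placeholder name). For k vectors of dimension n, the nested loops perform O(k^2) projections, and each projection is an O(n) dot product and subtraction, so the total is roughly O(k^2 n), i.e. O(k^3) when k is close to n.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a set of row vectors.

    For k vectors of dimension n, the nested loops do O(k^2) projections,
    each an O(n) dot product and subtraction, so roughly O(k^2 * n) work
    (O(k^3) when k is on the order of n).
    """
    basis = []
    for u in vectors:
        v = np.array(u, dtype=float)
        for e in basis:
            # Remove the component of u along each basis vector found so far.
            v -= np.dot(u, e) * e
        norm = np.linalg.norm(v)
        if norm > 1e-12:  # drop (near-)linearly dependent inputs
            basis.append(v / norm)
    return np.array(basis)

vectors = np.array([[1.0, 1.0, 0.0],
                    [1.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0]])
E = gram_schmidt(vectors)
print(np.round(E @ E.T, 10))  # identity matrix: the rows are orthonormal
```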

dl123

Is this algorithm still used in production? It sounds like the other algorithm can be generalized to more cases.

blahaj

What algorithms are used besides Gram-Schmidt?

WhaleVomit

What's an intuitive explanation for why <u2, e1> is the magnitude of the vector we should subtract from u2? I'm having trouble seeing why the dot product is being used here.
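
One way to see it (a standard projection argument, not specific to this slide): because e1 is a unit vector, the coefficient of u2 along e1 falls straight out of the inner product.

```latex
% Split u_2 into a multiple of e_1 plus a remainder w orthogonal to e_1:
\begin{align*}
  u_2 &= c\, e_1 + w, \qquad \langle w, e_1 \rangle = 0 \\
  \langle u_2, e_1 \rangle
      &= c\, \langle e_1, e_1 \rangle + \langle w, e_1 \rangle
       = c \cdot 1 + 0 = c
\end{align*}
% So c = <u_2, e_1> is exactly the length of u_2's component along e_1,
% and u_2 - <u_2, e_1> e_1 has nothing left in the e_1 direction.
```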

bunnybun99

When talking about the disadvantage of the Gram-Schmidt algorithm when dealing with a large number of vectors, you mentioned the QR decomposition method. How much faster is the QR decomposition method compared to the Gram-Schmidt method?
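
For a rough sense of scale (my numbers, not from the lecture): Householder-based QR and Gram-Schmidt both cost on the order of O(k^2 n) floating-point operations for k vectors of dimension n, so QR is not asymptotically faster; its advantage is numerical stability plus the availability of optimized library routines. A minimal sketch of getting an orthonormal basis from a library QR, assuming NumPy and made-up sizes:

```python
import numpy as np

# Hypothetical sizes: 50 vectors in R^1000, stacked as the columns of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 50))

# Reduced QR: the columns of Q form an orthonormal basis for the column
# space of A (NumPy's qr is Householder-based via LAPACK).
Q, R = np.linalg.qr(A)

print(Q.shape)                                # (1000, 50)
print(np.max(np.abs(Q.T @ Q - np.eye(50))))   # ~1e-15, i.e. orthonormal
```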

large_monkey

If the vectors are nearly parallel, what makes Gram-Schmidt a poor choice? Would this be due to precision errors? Would an alternative approach involve finding a different basis first, and then applying Gram-Schmidt to that?
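
The usual issue is indeed precision: with nearly parallel vectors, the subtractions cancel catastrophically and the computed basis drifts away from orthogonal. A small NumPy sketch (my own illustration; `classical_gs` is just a helper name) on a standard nearly-parallel test case shows classical Gram-Schmidt losing orthogonality while a library QR stays near machine precision:

```python
import numpy as np

def classical_gs(A):
    """Orthonormalize the columns of A with classical Gram-Schmidt."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= np.dot(A[:, j], Q[:, i]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

# Nearly parallel columns: a standard ill-conditioned test case.
eps = 1e-8
A = np.array([[1.0, 1.0, 1.0],
              [eps, 0.0, 0.0],
              [0.0, eps, 0.0],
              [0.0, 0.0, eps]])

Q_gs = classical_gs(A)
Q_qr, _ = np.linalg.qr(A)

# Distance of Q^T Q from the identity (0 would be perfectly orthonormal):
print(np.linalg.norm(Q_gs.T @ Q_gs - np.eye(3)))   # large: orthogonality lost
print(np.linalg.norm(Q_qr.T @ Q_qr - np.eye(3)))   # ~machine precision
```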

BlueCat

I do not get what the relationship is between Gram-Schmidt and an orthonormal basis. How do we choose between them?

coolpotato

This algorithm seems to be highly inefficient for a basis of dimension 3 or higher. Will we go over more efficient algorithms for finding an orthonormal basis?