What are some other popular choices of basis for functions?
I don't fully understand this idea. What would the vector space be? How many basis vectors are there for the space?
Are there orthonormal basis functions for functions on higher-dimensional domains?
This is pretty interesting: thinking of Fourier transforms in terms of orthonormal changes of basis.
I don't really understand how the sine and cosine functions are "orthonormal." For example, if we have sin(2x) and cos(4x) as part of our basis, how are we defining orthonormal in this case? What if it was sin(4x) and cos(2x) instead?
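One way to make "orthonormal" concrete is to pick an inner product on functions. A common choice (an assumption here, not stated in the question) is ⟨f, g⟩ = (1/π)∫₀^{2π} f(x)g(x) dx, under which the pairs above can be checked numerically:

```python
import numpy as np

# Assumed inner product: <f, g> = (1/pi) * integral over [0, 2*pi] of f(x)*g(x) dx,
# approximated by a Riemann sum on a fine grid over one full period.
x = np.linspace(0, 2 * np.pi, 100000, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    return np.sum(f(x) * g(x)) * dx / np.pi

print(inner(np.sin, np.sin))                                 # ≈ 1: sin(x) has unit norm
print(inner(lambda t: np.sin(2 * t), lambda t: np.cos(4 * t)))  # ≈ 0: orthogonal
print(inner(lambda t: np.sin(4 * t), lambda t: np.cos(2 * t)))  # ≈ 0: also orthogonal
```

So with this inner product, swapping the frequencies (sin(4x) and cos(2x) instead of sin(2x) and cos(4x)) makes no difference: every sine is orthogonal to every cosine over a full period.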
Can you explain more about how to find the basis representation of a given function?
Are there other bases that can be used to represent functions other than the set of sinusoids/Fourier transform?
Also, are there any limitations on the set of functions that can be represented this way, or is it truly the full R^R?
Are the sinusoids the only relevant repeating functions that can be used as a basis?
Intuitively, simple polynomials (x^n) also seem like they should make sense as basis vectors for other functions. Is there some intuition to why we would prefer sin and cos? Is it primarily because they are orthonormal?
Can we still have a norm for functions after the Fourier transform?
If sinusoid functions are used as the basis, then what does it mean for functions to be orthogonal? Does this relate to how sine and cosine are shifted by 90 degrees? Also, do the different values of m and n mean that the function is decomposed into more dimensions?
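On the m and n question: the 90-degree shift is only part of the story, because sines of *different* frequencies are also orthogonal to each other, and each frequency contributes its own dimension. A numerical sketch (assuming the inner product ⟨f, g⟩ = (1/π)∫₀^{2π} f(x)g(x) dx):

```python
import numpy as np

# Assumed inner product over one full period [0, 2*pi], Riemann-sum approximation.
x = np.linspace(0, 2 * np.pi, 100000, endpoint=False)
dx = x[1] - x[0]

def inner(f, g):
    return np.sum(f(x) * g(x)) * dx / np.pi

# <sin(m x), sin(n x)> is ~1 when m == n and ~0 when m != n,
# so each frequency m behaves like an independent coordinate axis.
for m in range(1, 4):
    for n in range(1, 4):
        val = inner(lambda t: np.sin(m * t), lambda t: np.sin(n * t))
        print(m, n, round(val, 6))
```

So yes: every distinct frequency (each value of m or n) is its own orthogonal direction, and decomposing a function over more frequencies means expressing it in more of these dimensions.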