I feel like there is a point where you can end up sampling too much, and it doesn't end up helping recreate the wave anymore. So what is the point at which there are too many sampling points? Is there some sort of standard that people go by, or is there some process people go through for each problem to get the optimal number of sampling points, so that you aren't doing extra work?

sjip

I read somewhere:

To accurately measure the frequency of a signal, we need a sampling rate of at least twice the highest frequency in the signal. This is known as the Nyquist–Shannon sampling theorem.

And to get the shape of the signal to come through visually, you will want a sampling rate of around ten times the highest frequency in the signal.
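The 2x minimum can be checked numerically. Here's a quick sketch (assuming NumPy): a 9 Hz tone sampled at only 10 Hz produces samples that are literally identical to those of a 1 Hz tone, so the frequency can no longer be recovered — this is aliasing.

```python
import numpy as np

# Sample a 9 Hz cosine at 10 Hz -- below the 18 Hz required by Nyquist.
fs = 10.0
n = np.arange(20)
t = n / fs
high = np.cos(2 * np.pi * 9.0 * t)   # 9 Hz tone
low = np.cos(2 * np.pi * 1.0 * t)    # its 1 Hz alias (10 - 9 = 1 Hz)
print(np.allclose(high, low))        # True: the sample sets are identical
```

Since cos(2π·9·n/10) = cos(2πn − 2π·n/10) = cos(2π·1·n/10), the two tones are indistinguishable at this sampling rate.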

keenan

@ljelenak Nothing fundamentally breaks if you have too many samples; you're just using more compute/data than you really need. If you sample above the Nyquist rate (twice the highest frequency present), then you can exactly reconstruct the original bandlimited signal using a sinc filter. (Not sure what @sjip means about the 10x rate; 2x is the theoretical minimum.)
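To make the sinc-reconstruction claim concrete, here's a sketch in Python/NumPy (signal frequency and sampling rate are arbitrary choices for illustration): sample a 5 Hz sine at 12 Hz — just above its 10 Hz Nyquist rate — then rebuild it on a dense grid from the samples alone via Whittaker–Shannon interpolation.

```python
import numpy as np

f = 5.0                    # signal frequency, Hz
fs = 12.0                  # sampling rate, Hz (> 2 * f, so above the Nyquist rate)
T = 1.0 / fs               # sampling interval, s

n = np.arange(96)          # 8 seconds' worth of sample indices
samples = np.sin(2 * np.pi * f * n * T)

# Dense grid, kept away from the record edges where the truncated sum is inaccurate.
t = np.linspace(1.0, 7.0, 1000)

# Whittaker-Shannon: x(t) = sum_n x[n] * sinc((t - n*T) / T).
# np.sinc is the normalized sinc sin(pi x)/(pi x), which is what this formula needs.
recon = np.sum(samples * np.sinc((t[:, None] - n * T) / T), axis=1)

truth = np.sin(2 * np.pi * f * t)
print("max reconstruction error:", np.max(np.abs(recon - truth)))
```

With an infinite sample record the reconstruction would be exact; here the sum is truncated, so the error is small but nonzero, shrinking as you add samples or move further from the edges.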