Originally Posted by tclune
My guess is that 2.3 is just "Kentucky windage" to accommodate real-world data.
Now, to reasonably interpret real-world data, which is not infinitely repetitive, does not consist only of overtones of the sampling window's base frequency, and is not strictly limited to frequencies below the Nyquist limit, we need to back off from the Nyquist limit a bit. For practical data I have often treated 1/2.5 of the sampling frequency as the highest usable frequency, but the right factor depends on various things (how sharp the rolloff of your anti-aliasing filter is, for example).
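To make that "back off a bit" point concrete, here's a quick NumPy sketch (my own illustration, not from the quoted post; the 40 Hz tone, one-second record, and zero starting phase are arbitrary choices): a sine sampled at exactly twice its frequency can land entirely on its zero crossings and effectively vanish, while a 2.3x or 2.5x sampling rate still records nearly its full amplitude.

```
import numpy as np

f_signal = 40.0   # highest frequency of interest, in Hz (arbitrary choice)
duration = 1.0    # seconds of simulated data
phase = 0.0       # worst case: at exactly 2x, every sample hits a zero crossing

def observed_peak(sample_rate):
    """Sample a unit-amplitude sine and return the largest magnitude actually captured."""
    t = np.arange(0.0, duration, 1.0 / sample_rate)
    x = np.sin(2.0 * np.pi * f_signal * t + phase)
    return np.abs(x).max()

for factor in (2.0, 2.3, 2.5):
    rate = factor * f_signal
    print(f"sampling at {factor:.1f} x f_signal ({rate:5.1f} Hz): peak seen = {observed_peak(rate):.3f}")
```

This only shows the worst-case phase at exactly 2x; the finite record length and the anti-aliasing filter rolloff mentioned above are further reasons to keep some margin.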
So, after all is said and done, I think they're just trying to accommodate the messiness of reality. If you dig into the microscopy literature, you'll probably find that some influential lab manual used that 2.3 factor as the limit, and it became the norm. FWIW