Monday, December 1, 2014

Sample rate vs. audio bandwidth - misconceptions?

I always assumed that the purpose of high sample rates was to "oversample" the audio, but another post here by FabienTDR and some experiments with the iZotope SRC have me wondering - is that REALLY how it works?



In the early days of digital audio, the A/D antialiasing and D/A reconstruction filters were ANALOG filters with fixed cutoff frequencies. When you selected the 48 kHz sample rate, you were actually oversampling the audio relative to that fixed cutoff (AFAIK).



However, most of the digital audio converters in use today are "delta-sigma" types that use digital filters. Since the filters are digital, is it possible their bandwidths actually SCALE with the sample rate? In other words, can some 96 kHz systems record and reproduce a 40 kHz audio signal?
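To put numbers on the two behaviours, here is a rough sketch in plain Python comparing the usable bandwidth of a hypothetical converter whose digital filter scales with the sample rate against one that always limits at 20 kHz. The 0.45 × fs passband edge is an assumed, illustrative figure, not taken from any particular converter's datasheet.

# Hypothetical comparison: bandwidth of a converter whose digital filter
# scales with the sample rate vs. one with a fixed ~20 kHz limit.
# The 0.45 * fs passband edge is an illustrative assumption only.

FIXED_CUTOFF_HZ = 20_000.0   # "always 20 kHz" converter
SCALE_FACTOR = 0.45          # scaling filter: passband ~ 0.45 * fs

for fs in (44_100, 48_000, 96_000, 192_000):
    nyquist = fs / 2.0
    scaled_bw = SCALE_FACTOR * fs
    fixed_bw = min(FIXED_CUTOFF_HZ, nyquist)
    print(f"fs = {fs / 1000:5.1f} kHz | Nyquist = {nyquist / 1000:5.1f} kHz | "
          f"scaling filter ~ {scaled_bw / 1000:5.1f} kHz | fixed filter = {fixed_bw / 1000:4.1f} kHz")

A scaling filter would indeed pass a 40 kHz signal at a 96 kHz sample rate (0.45 × 96 kHz ≈ 43 kHz), while the fixed design would not.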



If that's the case, then I submit that high sample rates (as FabienTDR pointed out) can actually cause problems. If there is any ultrasonic noise in the audio (EMI, harmonics, etc.), then that noise can intermodulate with or modulate the desired audio during processing - creating garbage tones in the rendered audio.
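To make that concrete, here is a minimal sketch (Python with numpy; the 25/27 kHz tone pair and the x + 0.2·x² nonlinearity are arbitrary choices for illustration, standing in for ultrasonic junk plus any saturating process) showing the difference tone that such content can dump back into the audible band at a 96 kHz sample rate.

import numpy as np

fs = 96_000                        # sample rate in Hz
t = np.arange(fs) / fs             # one second of samples
# Two ultrasonic tones - inaudible on their own, but present in the file.
x = 0.5 * np.sin(2 * np.pi * 25_000 * t) + 0.5 * np.sin(2 * np.pi * 27_000 * t)

# A mild even-order nonlinearity standing in for "processing"
# (saturation, clipping, a modelled analog stage, etc.).
y = x + 0.2 * x ** 2

# The x**2 term creates a difference tone at 27 kHz - 25 kHz = 2 kHz,
# squarely in the audible band, even though the inputs were ultrasonic.
spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
audible = (freqs > 100) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"Strongest in-band product: {peak:.0f} Hz")   # prints ~2000 Hz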



Is this really what's going on? Is it possible that some high-end converters sound good because they actually limit the audio to 20 kHz - whereas maybe some cheap converters simply SCALE their filters?
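And if that is what distinguishes them, the "good" behaviour is easy to sketch: band-limit to roughly 20 kHz before anything nonlinear happens. Continuing the example above (Python with numpy/scipy; the 8th-order Butterworth at 20 kHz is an arbitrary illustrative choice, not a claim about what any real converter does):

import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 96_000
t = np.arange(fs) / fs
x = 0.5 * np.sin(2 * np.pi * 25_000 * t) + 0.5 * np.sin(2 * np.pi * 27_000 * t)

# Band-limit to ~20 kHz before the nonlinear stage, as a converter or
# processor that deliberately restricts bandwidth might.
sos = butter(8, 20_000, btype="low", fs=fs, output="sos")
x_limited = sosfiltfilt(sos, x)

y = x_limited + 0.2 * x_limited ** 2     # same nonlinearity as before

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
audible = (freqs > 100) & (freqs < 20_000)
print(f"Largest in-band component: {spectrum[audible].max():.2e}")
# With the ultrasonic tones attenuated, the 2 kHz difference tone drops
# by orders of magnitude - no audible garbage, at the cost of bandwidth.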




