Head-Fi.org › Forums › Equipment Forums › Sound Science › The Optimal Sample Rate for Quality Audio

The Optimal Sample Rate for Quality Audio

post #1 of 32
Thread Starter 

Interested in the facts?

 

One of the world’s top converter designers, Dan Lavry, has written a new paper in simple language to demystify the subject.

 

http://www.lavryengineering.com/pdfs/lavry-white-paper-the_optimal_sample_rate_for_quality_audio.pdf

 

See why many professional engineers still work at 96kHz years after 192kHz became available.

 

Find out why “more” is not always “better!”

 

Dan Lavry is the designer of the LavryBlack DA10 and DA11.

post #2 of 32

I find it interesting that somebody who manufactures D/A converters doesn't fully understand the Nyquist sampling theorem.

post #3 of 32
Quote:
Originally Posted by scud80 View Post

I find it interesting that somebody who manufactures D/A converters doesn't fully understand the Nyquist sampling theorem.

http://lavryengineering.com/pdfs/lavry-sampling-theory.pdf

 

I find it amusing that you think we can perceive 48 kHz. 


Edited by firev1 - 5/17/12 at 6:10am
post #4 of 32
Thread Starter 

Well, Scud80;

Seeing as the Forum is the "Science of Sound," it would be appropriate to back up your assertion with some factual evidence, as well as some mention of your qualifications.

 

Until that time, it would only be logical to assume that you are the one who does not understand the Nyquist Theorem.
 

post #5 of 32
Quote:
Originally Posted by firev1 View Post

http://lavryengineering.com/pdfs/lavry-sampling-theory.pdf

I find it amusing that you think we can perceive 48 kHz. 
I loved that paper.

Although the mathematics seemed a bit irrelevant at times, it was very interesting overall.


The Nyquist theorem should be enough evidence to support that we don't need sample rates higher than 44.1 kHz in terms of frequency extension.
But isn't it true that operating at 96 kHz usually provides better performance than both 44.1 and 192? At least, that is what can readily be noticed in the measurements of both DACs and DAC chips.
Edited by Tilpo - 5/17/12 at 10:09am
post #6 of 32
Quote:
Originally Posted by Lavry Tech View Post

Well Scud80;

Seeing as the Forum is the "Science of Sound," it would be appropriate to back up your assertion with some factual evidence, as well as some mention of your qualifications.

 

Until that time, it would only be logical to assume that you are the one who does not understand the Nyquist Theorem.
 

Most people believe that the analog signal can be reconstructed one sample at a time.  For example, just using a sample-and-hold at the output, then brickwall filtering it.

 

Unfortunately, reconstructing perfectly requires knowledge of the stream before and after the instant.  If insufficient information is available, the absolute phase and amplitude may not reconstruct correctly.

 

The same basic problem occurs when doing an FFT.  Window size and type are important to the outcome.

 

j

post #7 of 32
Quote:
Originally Posted by jnjn View Post

Most people believe that the analog signal can be reconstructed one sample at a time.  For example, just using a sample-and-hold at the output, then brickwall filtering it.

Unfortunately, reconstructing perfectly requires knowledge of the stream before and after the instant.  If insufficient information is available, the absolute phase and amplitude may not reconstruct correctly.

The same basic problem occurs when doing an FFT.  Window size and type are important to the outcome.

j
Isn't that pretty much part of the Nyquist theorem, though?

One of the conditions for perfect reconstruction is an infinite number of samples. Of course this is not possible in real life, but an incredibly close approximation is possible with far fewer samples. I do not know how many samples that would be, though.
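To put a rough number on that, here is a sketch of my own in Python (not from the thread or from Lavry's paper): it reconstructs instants that fall midway between samples of a 1 kHz sine via Whittaker-Shannon (sinc) interpolation truncated to a finite number of neighboring samples, and shows the worst-case error shrinking as more samples are included.

```python
import math

fs = 44100.0                      # CD sample rate
f = 1000.0                        # 1 kHz test tone
sig = lambda t: math.sin(2 * math.pi * f * t)

def reconstruct(t, taps):
    # Whittaker-Shannon (sinc) interpolation, truncated to `taps`
    # samples on each side of the instant t.
    n0 = int(round(t * fs))
    total = 0.0
    for n in range(n0 - taps, n0 + taps + 1):
        x = t * fs - n            # offset in sample periods
        total += sig(n / fs) * (math.sin(math.pi * x) / (math.pi * x) if x else 1.0)
    return total

def max_error(taps):
    # Worst-case error over a batch of instants midway between samples.
    return max(abs(reconstruct((n + 0.5) / fs, taps) - sig((n + 0.5) / fs))
               for n in range(500, 600))

for taps in (1, 10, 100):
    print(f"{taps:3d} samples per side -> worst error {max_error(taps):.2e}")
```

The exact numbers depend on the tone and the truncation, but the trend matches Lavry's later point in this thread: a few hundred samples already get very close to perfect reconstruction.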
post #8 of 32
Quote:
Originally Posted by Tilpo View Post


Isn't that pretty much part of the Nyquist theorem, though?
One of the conditions for perfect reconstruction is an infinite number of samples. Of course this is not possible in real life, but an incredibly close approximation is possible with far fewer samples. I do not know how many samples that would be, though.

Yes.

 

If reconstruction occurs one sample at a time with no regard to past or future, then higher than Nyquist rates may be required to retain signal information.

 

If one were to sample a 1 kHz sine at a 2 ksps rate, the outcome is phase dependent.  You might find only the zero crossings, or you might find only the peaks.  If you reduce the measured sine to 999 Hz, the digital stream will vary at a 1 Hz rate, and the one-sample reconstruction will change amplitude from zero to full scale at a 1 Hz rate.  Brickwall filtering of that output will not recover from that modulation artifact.
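That phase dependence is easy to demonstrate; here is a minimal sketch of my own in Python, using the thread's numbers: a 1 kHz sine sampled at 2 ksps either lands squarely on the peaks or disappears entirely, depending only on its starting phase.

```python
import math

fs = 2000.0   # 2 ksps
f = 1000.0    # a 1 kHz sine: exactly the Nyquist limit, fs / 2

# Sampling at exactly twice the signal frequency: what the samples
# capture depends entirely on the signal's phase at the sample instants.
on_peaks = [math.sin(2 * math.pi * f * n / fs + math.pi / 2) for n in range(16)]
on_zeros = [math.sin(2 * math.pi * f * n / fs) for n in range(16)]

print(max(abs(s) for s in on_peaks))   # ~1.0: every sample lands on a peak
print(max(abs(s) for s in on_zeros))   # ~0.0: every sample lands on a zero crossing
```

Shift the starting phase anywhere in between and the captured amplitude takes any value from zero to full scale, which is the modulation artifact described above.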

 

j

post #9 of 32
Just throwing this out there, but is the finite sample rate the cause of the high-frequency roll-off of most DACs?
Would these same DACs have a brickwall frequency response when using an infinite sampling rate? Or is the roll-off caused by analog parts?
post #10 of 32
Quote:
Originally Posted by Tilpo View Post

Just throwing this out there, but is the finite sample rate the cause of the high-frequency roll-off of most DACs?
Would these same DACs have a brickwall frequency response when using an infinite sampling rate? Or is the roll-off caused by analog parts?

The brickwall gets rid of all the nasty high-frequency stuff that is a consequence of the sampling/reconstruction.  Without it, there would probably be as much ultrasonic imaging noise as music.

 

An infinite sampling rate is just another word for analog...

 

Perhaps lavrytech can elaborate for us on the tradeoffs between sampling rate, reconstruction word depth, aliasing, and brickwalls.

 

j

post #11 of 32
Quote:
Originally Posted by jnjn View Post

The brickwall gets rid of all the nasty high-frequency stuff that is a consequence of the sampling/reconstruction.  Without it, there would probably be as much ultrasonic imaging noise as music.

An infinite sampling rate is just another word for analog...

Perhaps lavrytech can elaborate for us on the tradeoffs between sampling rate, reconstruction word depth, aliasing, and brickwalls.

j
I didn't mean sample rate; I meant sample size, as in how many samples the DAC looks at in order to do its mathemagic and reconstruct the analog waveform.

Given that the Nyquist theorem assumes an infinite number of samples to reconstruct the waveform perfectly, I wonder what the effect would be of changing the number of samples at a constant sample rate.

The effects of word length are obvious, though: it defines the noise floor. At 16 bits this is -96 dBFS; at 24 bits it's -144 dBFS (which is impossible to achieve in the real world). Or did you mean something else by 'reconstruction word depth'?
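Those noise-floor figures follow from the dynamic range of an N-bit word, 20·log10(2^N), i.e. roughly 6.02 dB per bit; a quick sketch of my own to check:

```python
import math

def dynamic_range_db(bits):
    # Ratio between full scale and one quantization step, in dB.
    # This is the common "6 dB per bit" rule: 20 * log10(2**bits).
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16)))   # 96  -> the -96 dBFS floor of 16-bit audio
print(round(dynamic_range_db(24)))   # 144 -> the -144 dBFS floor of 24-bit audio
```

(The exact values are 96.33 dB and 144.49 dB; quantization-noise refinements such as the 1.76 dB sine-wave correction are left out here.)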
post #12 of 32
Thread Starter 
Quote:
Originally Posted by jnjn View Post

The brickwall gets rid of all the nasty high-frequency stuff that is a consequence of the sampling/reconstruction.  Without it, there would probably be as much ultrasonic imaging noise as music.

 

An infinite sampling rate is just another word for analog...

 

Perhaps lavrytech can elaborate for us on the tradeoffs between sampling rate, reconstruction word depth, aliasing, and brickwalls.

 

j


I contacted Dan and asked him to take a look at the ongoing discussion. He is busy with new designs, but took a minute to address the question of how many samples are required for good results:

 

"At 44.1 kHz, 1 second contains 44100 samples, so 0.1 sec contains 4410 samples. There are 441 samples in 10 milliseconds.

At 96 kHz, there are 960 samples in 10 milliseconds…

 

Clearly, to reach perfection we need infinite time. But to stay practical, one needs to define an acceptable performance goal. The answer depends on what one may want a filter to do. One may wish for characteristics good enough to satisfy, say, 24-bit accuracy (144 dB, which is around 0.00000596%). One may need, say, 120 dB (around 20 bits, thus 0.0001%), or what not.

 

There are as many answers as there are possible filter designs. But in all cases, a good DSP designer can accomplish even a very demanding task with a few hundred samples, or even less.

 

Dan Lavry"
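Dan's figures are easy to verify; here is a small sketch of my own (not his) reproducing the sample counts and the dB-to-percentage conversions:

```python
import math

# Samples available in a 10 ms window at each rate
for fs in (44100, 96000):
    print(fs, "Hz ->", round(fs * 0.010), "samples in 10 ms")   # 441 and 960

# Accuracy targets: an N-bit error floor expressed in dB and as a
# percentage of full scale (2**-N, times 100)
for bits in (24, 20):
    db = 20 * math.log10(2 ** bits)       # ~144 dB and ~120 dB
    pct = 2.0 ** -bits * 100              # ~0.00000596% and ~0.0001%
    print(f"{bits} bits: {db:.0f} dB, {pct:.8f}%")
```

The 20-bit case comes out as 0.0000954%, which rounds to the 0.0001% figure quoted above.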

post #13 of 32

I have the qualification of being able to read.  From the top of the sampling theory whitepaper: "Nyquist Sampling Theory: A sampled waveforms contains ALL the information without any distortions, when the sampling rate exceeds twice the highest frequency contained by the sampled waveform."  This is simply incorrect, as mentioned by a couple of other posts.  I'm not going to argue that most of a signal can be reproduced with a reasonable number of samples, but saying that all of it is reproduced (which is repeated, again in caps, later on the same page) is misleading.

Quote:
Originally Posted by Lavry Tech View Post

Well Scud80;

Seeing as the Forum is the "Science of Sound," it would be appropriate to back up your assertion with some factual evidence, as well as some mention of your qualifications.

 

Until that time, it would only be logical to assume that you are the one who does not understand the Nyquist Theorem.
 

post #14 of 32
Quote:
Originally Posted by Tilpo View Post

Just throwing this out there, but is the finite sample rate the cause of the high-frequency roll-off of most DACs?
Would these same DACs have a brickwall frequency response when using an infinite sampling rate? Or is the roll-off caused by analog parts?

 

The roll-off you see in both amps and DACs is the result of filtering. Leaving ultrasonics to play in the DACs or amps would cause increased distortion and noise; filtering them out is standard design.

Quote:
Originally Posted by Tilpo View Post


Isn't that pretty much part of the Nyquist theorem, though?
One of the conditions for perfect reconstruction is an infinite number of samples. Of course this is not possible in real life, but an incredibly close approximation is possible with far fewer samples. I do not know how many samples that would be, though.

I have never tried 192, but indeed, 96 kHz does give noticeably better measurements than 44.1 (on graphs, and for me audibly, due to my DAC), notably in terms of THD and IMD. I think 96 kHz is the optimal sampling frequency; at 192 kHz, physics kicks the DACs in their balls, giving worse measurements. Of course, in audio the pseudo-scientific "more is better!" mentality kicks in together with other factors, giving the misconception that 192 kHz is audibly better, when people might just be hearing increased distortion or expectation bias.  

post #15 of 32

I would be pretty surprised if any human can actually hear a difference between 96 and 192 kHz sampling rates.  Even if there's somebody who can, I doubt the difference is worth doubling the storage and processing requirements (or more; I'm not really familiar with whether or not decoding/processing scales linearly with sample rate).
