Benchmark talked about headroom for intersample peaks in DACs, does it really matter?
Feb 11, 2017 at 7:16 AM Post #16 of 90
It can indeed be a problem, but it should be relatively easy to solve by lowering the digital volume on your computer to 80-90% so the DAC never sees 0 dBFS samples.
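For scale, here's roughly what a linear volume control buys you in headroom (a quick sketch, assuming the slider scales sample values linearly, which not every OS mixer does):

```python
import numpy as np

# dB headroom gained from a linear digital volume setting
for fraction in (1.0, 0.9, 0.8):
    print(f"{fraction:.0%} volume -> {20 * np.log10(fraction):+.2f} dB")
# 100% volume -> +0.00 dB, 90% volume -> -0.92 dB, 80% volume -> -1.94 dB
```

Note that even 80% is under 2 dB of headroom, less than the +3 dB a worst-case full-scale sine can reach between samples.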


The heresy! :D :p
 
Feb 11, 2017 at 8:15 AM Post #17 of 90
omg, looks like a pretty bad situation... this is madness...

 
As ISPs are such a common thing and have been for such a long time, I would expect any oversampling DAC to provide enough headroom to cope with them. If it doesn't, I wouldn't consider it competently designed!
 
G
 
Jun 6, 2017 at 9:44 AM Post #18 of 90
The term "inter-sample peak" is misleading. If the converter is designed properly, there will be no "peak" (or "over") that is missed by the conversion process. There is no signal that is missed "in between" samples.. There may be peaks (or overs) outside the Nyquist limit, but in a properly designed converter, those peaks will be ignored. Besides, in real world music, we rarely encounter "overs" in the range beyond 20kHz. The peaky energy in music is typically below 5kHz (yes, there are exceptions). Best I can tell, this "inter-sample peak" issue is related to other issues common to converter design (analog and digital filtering, oversampling artifacts, etc.), which of course can be a real issue when signals get really close to full-scale. If anyone is mixing music that hot, suggest you back off a few tenths of a dB.
 
Jun 7, 2017 at 1:36 AM Post #19 of 90
[1] The term "inter-sample peak" is misleading. If the converter is designed properly, there will be no "peak" (or "over") that is missed by the conversion process. There is no signal that is missed "in between" samples.
[2] The peaky energy in music is typically below 5kHz (yes, there are exceptions).
[3] If anyone is mixing music that hot, suggest you back off a few tenths of a dB.

1. I think you may have misunderstood. It's not that a DAC will "miss" the intersample peaks, it's that it may not have enough headroom to represent them without clipping. This potential clipping would typically occur in the oversampling stage of the DAC but could occur in the analogue stage of a NOS DAC (see the sketch after this list). I don't see how the term is misleading.
2. While it's true that most of the energy in a music mix is typically well below 5kHz, there are also typically many very brief (transient) peaks at higher frequencies, particularly in popular music genres rather than classical, from such things as cymbal hits.
3. Firstly, a few tenths of a dB would often not be enough, and secondly, good luck with that suggestion; it's going to take more than just a suggestion to reverse decades of loudness wars.
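To make point 1 concrete, here's a minimal sketch using SciPy's polyphase resampler as a stand-in for a DAC's oversampling filter; the saturation at full scale is an assumed fixed-point behaviour, not any particular chip's:

```python
import numpy as np
from scipy.signal import resample_poly

# Samples of a full-scale 12 kHz sine at 48 kHz, taken 45 degrees off-peak:
# every sample is at 0 dBFS, but the sine's true peak is +3.01 dBFS.
x = np.tile([+1.0, +1.0, -1.0, -1.0], 384)

y = resample_poly(x, 2, 1)        # 2x oversampling interpolation filter
y_sat = np.clip(y, -1.0, 1.0)     # a fixed-point stage saturating at full scale

print(f"oversampled peak: {20 * np.log10(np.max(np.abs(y))):+.2f} dBFS")
print(f"after saturation: {20 * np.log10(np.max(np.abs(y_sat))):+.2f} dBFS")
```

The interpolated peak lands between the original samples, so the clipping happens inside the oversampling stage even though no input sample ever exceeds full scale.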

G
 
Jul 31, 2017 at 5:25 PM Post #20 of 90
We believe this is the single most important innovation in the DAC2, and we believe it is one of the largest contributors to the overall sound quality of the DAC2. The ability to reproduce intersample peaks trumps all other digital processing enhancements.
...
This is probably the single most important improvement in D/A technology in the past 10 years!

Well, that was written by someone in marketing.

The quote from Benchmark is, "If the peaks of the sine wave are precisely positioned between the samples, the sine wave can reach a level of +3.01 dBFS before the digital meters will show a clipping event." Do you disagree with that?

Well, it can go a lot higher than 3 dB if you try. If you use a signal like [..., +1, -1, +1, +1, -1, +1, ...] (the sign of a sinc function centred between two samples) you can extend the pattern to make the intersample peak go arbitrarily high.

[Plot: intersample peaks, 2.4 kHz test signal]

[Plot: peak level vs. original signal length]

This is highly contrived, though. I think it unlikely that you'd see more than a few dB with real music.
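For the curious, the ideal reconstruction at the midpoint of that pattern is the sum of the |sinc| values, which grows without bound (roughly logarithmically). A minimal sketch, NumPy assumed:

```python
import numpy as np

# Worst-case +/-1 pattern: the sign of sinc() centred between two samples.
# The band-limited reconstruction at that midpoint equals sum(|sinc|),
# which grows roughly as (2/pi)*ln(N), i.e. without bound.
for N in (8, 64, 1024, 65536):
    offsets = np.arange(N) - (N - 1) / 2     # half-integer distances to the midpoint
    peak = np.sum(np.abs(np.sinc(offsets)))  # reconstructed value at the midpoint
    print(f"N = {N:6d}: intersample peak = {20 * np.log10(peak):5.2f} dBFS")
```

The growth is slow, which supports the point: even this deliberately hostile signal only climbs a few dB per order of magnitude of length.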

The term "inter-sample peak" is misleading. If the converter is designed properly, there will be no "peak" (or "over") that is missed by the conversion process. There is no signal that is missed "in between" samples..

The reconstructed analog signal exists between the samples, so if the samples are already at FS, the reconstructed signal can exceed this in level.

There may be peaks (or overs) outside the Nyquist limit

This has nothing to do with Nyquist. The Nyquist limit for 48 kHz is 24 kHz.
  • A sine wave at 12 kHz with samples [+1, +1, -1, -1, +1, +1] will have intersample peaks of +3 dBFS top and bottom.
  • A DC-shifted sine wave at 16 kHz of [+1, +1, -1, +1, +1, -1] will have intersample peaks of +4.5 dB on top side only.
Both are under the Nyquist limit.
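Both examples are easy to verify numerically. In the 12 kHz case the samples land 45° away from the sine's peaks, so samples at ±1 imply a true amplitude of 1/sin(45°) = √2, i.e. +3.01 dBFS. Here's a sketch using FFT-based resampling as a stand-in for ideal reconstruction (SciPy assumed; the patterns are tiled so they're exactly periodic):

```python
import numpy as np
from scipy.signal import resample

# The two 48 kHz patterns from above, tiled out to whole periods
sig_12k = np.tile([+1.0, +1.0, -1.0, -1.0], 384)  # period 4 -> 12 kHz
sig_16k = np.tile([+1.0, +1.0, -1.0], 512)        # period 3 -> 16 kHz + DC

for name, x in (("12 kHz", sig_12k), ("16 kHz + DC", sig_16k)):
    y = resample(x, len(x) * 16)                  # 16x FFT-based oversampling
    print(f"{name}: top-side true peak = {20 * np.log10(np.max(y)):+.2f} dBFS")
# Expected: roughly +3.01 dBFS and +4.44 dBFS (the "+4.5" above, rounded)
```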

it's that it may not have enough headroom to represent them without clipping

Or worse. I found this thread because I'm designing a product with a DAC that doesn't clip for intersample peaks, but instead becomes "uncontrolled" and produces random signals in that region instead. So if you're playing at 96 kHz, and play a 32 kHz signal as described above [+1, +1, -1, +1, +1, -1], it should be completely inaudible ultrasound, but the intersample peaks produce very audible white noise/distortion instead. So I'm trying to figure out how much I need to attenuate the digital signal to make sure this never happens in realistic situations. -3.5 dB is probably good enough?

This experiment found 3% of songs with >4 dB intersample peaks, though they couldn't say which songs they were, or whether those songs' samples were already at FS due to clipping.
 
Jul 31, 2017 at 6:32 PM Post #21 of 90
Sigh..... If a DAC, or reconstruction filter, or analog filter or IV converter (or whatever) can't handle valid data within the Nyquist limit, then the DAC is poorly designed and/or improperly specified. Inter-sample peaking is not a physical law of conversion; it's simply bad conversion design. In a properly designed DAC, at any frequency within the Nyquist limit, a waveform will be properly reconstructed, and an MSB clip (+1 +1) will give a corresponding analog clip. There is no other "law" governing this, and there is no magic. There are no exceptions to Nyquist's sampling theorem.

That said, at sufficiently low levels, a DAC will reach its un-dithered linearity limits and the output waveform starts looking like noisy crap. Stereophile reviews test DACs at -90dBFS (which is audible), and every single DAC looks terrible, easily 20-30% THD. Personally, I think this is profoundly more important to the "DAC quality" conversation than near-ultrasonic peak energy that rarely exists in music, and rarely causes a negative listening experience.

If a DAC goes unstable at some high-freq, high-level valid Nyquist state, then the mfr needs to reduce their maximum output specification to a level where this doesn't happen (or ... fix the underlying reason that causes the design instability in the first place). For example, an incoming digital signal would be shifted downward, before the actual DAC (IC or otherwise) input, until said instability was no longer an issue.
 
Jul 31, 2017 at 7:53 PM Post #22 of 90
Sigh..... If a DAC, or reconstruction filter, or analog filter or IV converter (or whatever) can't handle valid data within the Nyquist limit, then the DAC is poorly designed and/or improperly specified.

Agreed, but it's too late to change the chip now.

a waveform will be properly reconstructed, and an MSB clip (+1 +1) will give a corresponding analog clip.

Analog clipping at 0 dBFS is perfectly valid, and better than becoming unstable, but reconstructing the wave above 0 dBFS is even better.

If a DAC goes unstable at some high-freq, high-level valid Nyquist state, then the mfr needs to reduce their maximum output specification to a level where this doesn't happen ... shifted downward, before the DAC input, until said instability was no longer an issue.

As I showed above, "valid" states can include arbitrarily high intersample peaks, so how much attenuation is "no longer an issue"?
 
Jul 31, 2017 at 8:04 PM Post #23 of 90
"reconstructing the wave above 0 dBFS is even better."

Unless the recording engineer, mix engineer, and/or mastering engineer decided to intentionally record hotter than digital-FS (either by design or by poorly-designed equipment), there is no such thing as "the wave above 0dBFS."

"As I showed above, "valid" states can include arbitrarily high intersample peaks, so how much attenuation is "no longer an issue?"

You're asking the wrong question. The right question is: "why is my DAC manufacturer lying to me?" If Benchmark admits that their DACs exhibit problems with valid Nyquist data, then they need to adjust their specifications downward.
 
Jul 31, 2017 at 8:32 PM Post #24 of 90
Unless the recording engineer, mix engineer, and/or mastering engineer decided to intentionally record hotter than digital-FS (either by design or by poorly-designed equipment), there is no such thing as "the wave above 0dBFS."

It doesn't matter who put the signal there or why. There is such a thing as a signal above 0 dBFS, between the samples, and it either needs to be clipped off, or reconstructed correctly.
 
Jul 31, 2017 at 8:34 PM Post #25 of 90
Sigh..... If a DAC, or reconstruction filter, or analog stage or analog filter (or whatever) can't handle valid data within the Nyquist limit, then the DAC is poorly designed. Inter-sample peaking is not a physical law of conversion; it's simply bad conversion design. In a properly designed DAC, at any frequency within the Nyquist limit, a waveform will be properly reconstructed, and an MSB clip will give a corresponding analog clip. There is no other "law" governing this, and there is no magic. There are no exceptions to Nyquist's sampling theorem.

That said, at really low levels, a DAC will typically reach its un-dithered linearity limits and the output waveform starts looking like noisy crap. Stereophile reviews test DACs at -90dBFS (which is audible), and every single DAC looks terrible, easily 20-30% THD.
it has nothing to do with Nyquist. good guy Nyquist did great on his math, and nobody is calling digital-to-analog reconstruction wrong. you're fighting the wrong fight, friend. ^_^
it's a mastering issue that isn't addressed properly, and the potential loss of information from clipping happens to manifest at the DAC. you're thinking in terms of sampled amplitude that is then reconstructed, so you can't imagine why anything would go over 0dB. in doing so, you're missing how the master is pushed up close to 0dB on the final release, without regard to the level used to encode the tracks being mixed. and as a sample point isn't necessarily located at the analog peak, we end up with a signal reconstruction trying to go above 0dB. it's not Nyquist or digital audio or DACs, it's the damn loudness war and masters with no headroom for no legit reason.

@gjfs, do you wish to reduce most occurrences or really make it impossible? because 3dB made clipping anecdotal on my computer when I was testing this (checking with dpmeter2 and my EQ, as they both have a clipping light that shows if it has happened since the last launch/reset). but while super rare with my library, it still happened.
 
Jul 31, 2017 at 8:40 PM Post #26 of 90
do you wish to reduce most occurrences or really make it impossible? because 3dB made clipping anecdotal on my computer when I was testing this (checking with dpmeter2 and my EQ, as they both have a clipping light that shows if it has happened since the last launch/reset). but while super rare with my library, it still happened.

Well, it can't be made impossible, but very rare for realistic recordings is the goal. I guess I should analyze my entire library and see how high the peaks go for real recordings.
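One way to do that scan is BS.1770-style true-peak metering: oversample by 4x or more and take the maximum. A sketch, assuming the soundfile and scipy packages; the music folder and file pattern are placeholders:

```python
import pathlib

import numpy as np
import soundfile as sf
from scipy.signal import resample_poly

def true_peak_dbfs(path, oversample=4):
    """Estimate a file's intersample (true) peak by oversampling."""
    x, fs = sf.read(str(path))                   # float samples in [-1.0, 1.0)
    y = resample_poly(x, oversample, 1, axis=0)  # polyphase upsampling filter
    return 20 * np.log10(np.max(np.abs(y)))

# hypothetical library scan
for f in sorted(pathlib.Path("music").rglob("*.flac")):
    print(f"{true_peak_dbfs(f):+6.2f} dBTP  {f}")
```

Note that finite oversampling slightly underestimates the true peak, which is why BS.1770 allows a small tolerance on its meters.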
 
Jul 31, 2017 at 9:52 PM Post #27 of 90
I've found that encoding music normalized up to zero on CD as AAC in iTunes pushes it into clipping. It doesn't clip on the CD, but it clips when you run it through iTunes. I think some ways of processing or playing digital music increase the volume a hair. That can put borderline stuff over the border. Whenever I make CDs, I always normalize to 85% instead of 100%.
 
Jul 31, 2017 at 10:10 PM Post #28 of 90
it's a mastering issue that isn't addressed properly

Which is exactly what I said earlier. Someone in the chain must violate Nyquist, or misuse their gear, to cause this phenomenon.

as a sample point isn't necessarily located at the analog peak

No. That would violate Nyquist theory. Any high-level sample sub-Nyquist will be reconstructed properly. If it's not, there can only be two reasons: (1) an MSB clip (embedded in the PCM), or (2) a DAC that's not meeting its own specification.

it's not Nyquist or digital audio or DACs, it's the damn loudness war and masters with no headroom for no legit reason.

If a mastering engineer pushes the program to just below 0dBFS, then a properly specified DAC will reconstruct that master properly. If the mastering engineer purposely clips the program, then a properly specified DAC will clip the master in the same manner. If something else happens, then, yes, the DAC is at fault.
 
Jul 31, 2017 at 10:20 PM Post #29 of 90
It doesn't matter who put the signal there or why. There is such a thing as a signal above 0 dBFS, between the samples, and it either needs to be clipped off, or reconstructed correctly.

No, there isn't. In a properly band-limited modulator, it's impossible to have lost information "between the samples." Really, go back and read Nyquist. If "inter-sample information" is valid, then the modulator is broken. This becomes a DAC design issue, nothing more.
 
Jul 31, 2017 at 10:31 PM Post #30 of 90
No, there isn't. In a properly band-limited modulator, it's impossible to have lost information "between the samples." Really, go back and read Nyquist. If "inter-sample information" is valid, then the modulator is broken. This becomes a DAC design issue, nothing more.

Again, this has nothing to do with Nyquist. Intersample peaks above full scale occur for correctly-sampled, correctly-bandlimited sine waves below the Nyquist frequency. The Nyquist sampling theorem is about discrete-time analog systems; it has nothing to do with digital quantization, and doesn't even have a concept of full-scale amplitude.
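For reference, the Whittaker-Shannon reconstruction formula makes that concrete: the continuous waveform is a sum of sinc functions through the samples, and nothing in it caps the output at the largest sample value:

```latex
x(t) \;=\; \sum_{n=-\infty}^{\infty} x[n]\,
\operatorname{sinc}\!\left(\frac{t - nT}{T}\right),
\qquad
\max_t\,\lvert x(t)\rvert \;\ge\; \max_n\,\lvert x[n]\rvert .
```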
 
