Benchmark talked about headroom for intersample peaks in DAC, does it really matter?
Aug 2, 2017 at 12:16 PM Post #61 of 90
.. it can happen to anybody with a little self confidence. TBH it happens to me rather often. :sweat_smile:

Agreed, it happens on occasion to me too. I said pretty much the same thing quite a few posts ago, that we all have "bad days" ... But how many consecutive bad days does it take to conclude that it's not just a bad day any more but a normal day for someone who's nuts? :)

TBH, I can even overlook a number of consecutive "bad days" but the fact that he also peppers his posts with insults (when he's the one being ignorant), that's going well beyond what's acceptable.

G

EDIT: I've edited my previous post and taken out the "clearly insane" references.
 
Last edited:
Aug 2, 2017 at 12:51 PM Post #62 of 90
I read something I find silly, start responding and by the end of my post I'm on fire. I go seek the guy's quote to really rub his nose in it, except there is no quote to be found because I misread his sentence. :sweat_smile:

Great example of cognitive bias and subjective social reality!
 
Aug 2, 2017 at 4:41 PM Post #63 of 90
Thanks so much for all the information.

He certainly had me confused enough that I edited my first post in this thread like an idiot, after re-reading the OP.
 
Aug 3, 2017 at 10:23 AM Post #64 of 90
My last sentence in my last comment was incorrect (now edited), and I apologize. It got me thinking about Nyquist math, which shows elements of both power law and voltage law. I tentatively believe we can draw a general rule that "any two contiguous Nyquist-valid samples under -3dBFS define non-clipped original program" (at least for sine waves; I'm not positive about other signals). But I can envision synthetically-generated bit patterns designed to confuse normal filters, resulting in clips with contiguous samples under -3dBFS.

The practical takeaway here (at least for me) is that audio producers using sample-point metering are playing with fire. Any non-Nyquist-reconstructive digital metering system can give wrong data, and should never be trusted when working anywhere near FS. Another takeaway which I hope has been drilled into gregorio's head is that valid Nyquist data is always 100% reconstructable. If there is a clip in the original signal, it will clip in reconstructed Nyquist space. If there is no clip in the original signal, there will be no clip in a valid reconstructed Nyquist space unless something in the path is broken (filters, etc.).
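To illustrate the metering gap, here's a minimal numpy/scipy sketch (my own toy example, not anyone's product; the tone frequency, phase and 8x oversampling factor are arbitrary choices). A full-scale sine at fs/4 with a 45° phase offset lands every sample at -3dBFS, yet the reconstructed waveform peaks at 0dBFS, which a sample-value meter never sees:

```python
import numpy as np
from scipy.signal import resample

fs = 44100
n = np.arange(4096)
# fs/4 tone with a 45-degree phase offset: every sample lands at +/-0.7071 (-3.01dBFS)
x = np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)

sample_peak = 20 * np.log10(np.max(np.abs(x)))   # what a sample-value meter reports
x_os = resample(x, 8 * len(x))                   # 8x FFT-based oversampling ~ reconstruction
true_peak = 20 * np.log10(np.max(np.abs(x_os)))  # estimate of the reconstructed peak

print(f"sample peak: {sample_peak:+.2f} dBFS")   # ~ -3.01 dBFS
print(f"true peak:   {true_peak:+.2f} dBFS")     # ~  0.00 dBFS
```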
 
Last edited:
Aug 3, 2017 at 11:44 AM Post #65 of 90
The practical takeaway here (at least for me) is that audio producers using sample-point metering are playing with fire. Any non-Nyquist-reconstructive digital metering system can give wrong data, and should never be trusted when working anywhere near FS.
I"m not sure where this would occur. Even the basic Adobe Audition has means of detecting reconstructed clipping (at least two methods). And that's pretty entry-level software as far as professional DAWs go.
Another takeaway which I hope has been drilled into Grigorio's head is that valid Nyquist data is always 100% reconstructable.
Nope, wrong. 100% reconstruction applies only to signals with total spectrum below Nyquist.
If there is a clip in the original signal, it will clip in reconstructed Nyquist space.
…with the effects of the reconstruction filter added. That could easily be an overshoot.
If there is no clip in the original signal, there will be no clip in a valid reconstructed Nyquist space
unless something in the path is broken (filters, etc.).
This contradicts your argument. You can have a non-"broken" filter and still create a clip during reconstruction. Simply band-limiting a signal with super-Nyquist content creates an overshoot even with a "perfect" filter, if the system after the filter doesn't provide headroom above 0dBFS for the overshoot. And then there's the inter-sample issue. Apparently, that does happen.

Isn't that the entire point of the argument?
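To put a rough number on the band-limiting case (a contrived numpy sketch of my own; the 1kHz tone, clip depth and filter length are arbitrary): hard-clip a sine exactly at full scale, then remove its clip harmonics with a perfectly ordinary linear-phase low-pass, and the result overshoots the rails with nothing "broken" anywhere:

```python
import numpy as np
from scipy.signal import firwin, filtfilt

fs = 44100
t = np.arange(fs) / fs
x = np.clip(1.5 * np.sin(2 * np.pi * 1000 * t), -1.0, 1.0)  # 1 kHz sine, hard-clipped at full scale

# Band-limit below the 3rd harmonic with a clean linear-phase FIR:
# nothing defective here, just bandwidth reduction.
lp = firwin(501, 2000, fs=fs)
y = filtfilt(lp, [1.0], x)

print(f"clipped input peak: {np.max(np.abs(x)):.2f}")  # 1.00, pinned at the rails
print(f"filtered peak:      {np.max(np.abs(y)):.2f}")  # ~1.17, above the rails
```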
 
Aug 3, 2017 at 12:39 PM Post #66 of 90
gregorio: The EBU/AES papers are referring to sample-based (discrete-sample) metering. As I said back around page 2 of this thread, if someone in the chain is not paying attention (i.e., mastering engineer using discrete-sample-based metering), then unexpected/unregistered clips can occur in the reconstructed program. But ... a clip is either embedded into Nyquist space, or it is not (if the signal path is not broken). There's no "0.01% exception" to Nyquist theory. I thought we got that out of the way days ago? By the way, yes, of course, I am an insane, idiotic, deluded know-nothing (did I miss any insult?). Thanks for the reminders. I believe the only disparaging remark I hurled at you was that you don't understand Nyquist theory, which was clear. I hope we've cleared up some misconceptions.

pinnaheartz: "Even the basic Adobe Audition has means of detecting reconstructed clipping (at least two methods)." Again, that's not the point. The point is that any non-clipped original signal (Analog => ADC) will remain unclipped in valid Nyquist space and resultant reconstruction, unless a processing stage is causing a clip (SRC, anti-aliasing, etc.). You disagree? Tell me why. Keep in mind, I said "valid" Nyquist space. Not "out of band" or "band-limited," a result of poor SRC or filter design, etc. For instance, if there is an "over" as a result of out-of-band information, that's a bad aliasing or imaging filter.

castleoffargh: thanks.
 
Last edited:
Aug 3, 2017 at 1:49 PM Post #67 of 90
I"m not sure where this would occur. Even the basic Adobe Audition has means of detecting reconstructed clipping (at least two methods). And that's pretty entry-level software as far as professional DAWs go.
gregorio: The EBU/AES papers are referring to sample-based (discrete-sample) metering. As I said back around page 2 of this thread, if someone in the chain is not paying attention (i.e., mastering engineer using discrete-sample-based metering), then unexpected/unregistered clips can occur in the reconstructed program.

It is still standard practice in the music industry to use sample value metering and it always has been. It's only in the TV broadcast industry where we use True Peak metering as standard, and that's because delivery requirements in many/most countries specify peak limits in dBTP: -1dBTP in most of Europe and -2dBTP in North America, for example. And it's nothing to do with mastering engineers not paying attention; pro DACs (and some/many consumer DACs) typically have headroom so that ISPs exceeding 0dBFS are not clipped.

BTW Pinnahertz, most pro DAWs do not include true peak metering as standard, just sample value metering. Many (optional) plugins are available for true peak metering.
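For anyone curious what such a plugin boils down to, here's a bare-bones sketch (function names are mine, and I'm glossing over the filter details that a real ITU-R BS.1770-style meter specifies). The idea is just: oversample, then peak-pick:

```python
import numpy as np
from scipy.signal import resample_poly

def sample_peak_dbfs(x):
    """What a conventional sample-value meter reports."""
    return 20 * np.log10(np.max(np.abs(x)) + 1e-12)

def true_peak_dbtp(x, oversample=4):
    """Rough true-peak estimate: polyphase-upsample, then take the sample peak.
    4x oversampling is the usual broadcast-meter choice; higher factors give a
    slightly tighter estimate of the reconstructed waveform's peak."""
    y = resample_poly(x, oversample, 1)
    return 20 * np.log10(np.max(np.abs(y)) + 1e-12)
```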

[1] I tentatively believe we can draw a general rule that "any two contiguous Nyquist-valid samples under -3dBFS define non-clipped original program" (at least for sine waves; I'm not positive about other signals).
[2] The practical takeaway here (at least for me) is that audio producers using sample-point metering are playing with fire.
[3] Another takeaway which I hope has been drilled into gregorio's head is that valid Nyquist data is always 100% reconstructable.
[4] If there is no clip in the original signal, there will be no clip in a valid reconstructed Nyquist space unless something in the path is broken (filters, etc.).

1. It is possible, though rare, to have ISP clipping occur even when -3dBFS is not exceeded.
2. Pretty much all music producers are playing with fire then, and have been for several decades.
3. With the exception of ISPs and not enough headroom in a DAC or SR converter to deal with them, I already know that valid Nyquist data is always 100% reconstructable and have known that for more than two decades. It's not my head which needs drilling!
4. Except in the case of ISPs and insufficient headroom to deal with them!

G
 
Aug 3, 2017 at 9:18 PM Post #69 of 90
I know that valid Nyquist data is always 100% reconstructable ... except in the case of ISPs and insufficient headroom to deal with them!

"Insufficient headroom" which causes a clip ... that falls into the broad category of "broken." I think you get this, but your statement #4 above leaves me uncertain.

If there is no clip in the original signal (Analog => ADC = clean), there can be no (nada) inter-sample clip in a valid reconstructed Nyquist space ... unless something in the path is broken (filters, headroom, etc.). If you disagree, give me an example.
 
Aug 4, 2017 at 2:19 AM Post #70 of 90
[1] "Insufficient headroom" which causes a clip ... that falls into the broad category of "broken." I think you get this, but your statement #4 above leaves me uncertain.

[2] If there is no clip in the original signal (Analog => ADC = clean), there can be no (nada) inter-sample clip in a valid reconstructed Nyquist space ... unless something in the path is broken (filters, headroom, etc.). If you disagree, give me an example.

1. I'm not sure I'd say "broken". The problem is, what is "insufficient headroom"? Typically about 3dB should be enough but that would still leave a small percentage of commercial audio clipped. Theoretically 30dB is about the max ISP which could occur, but 30dB of headroom is impractical, although I doubt there is any commercial content which gets anywhere near 30dB ISPs (the sketch below point 2 shows the kind of pathological sample pattern it takes). I have seen ISPs of about +6dB but not in music. If a DAC (or SR converter) had no headroom above 0dBFS and therefore clipped quite frequently, I would consider that incompetently designed, which I suppose is a type of "broken".

2. In theory I suppose there could be, but in practice I would think extremely rarely to never. Pretty much all commercial recording is done in 24-bit, which provides plenty of headroom and therefore no need to get anywhere near the point where ISPs could cause clipping. However, your statement is pretty much irrelevant as there is almost no commercial audio content available which hasn't been mixed and therefore an "original signal" is not what gets distributed to consumers.
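Here's the sketch referred to above: a deliberately pathological pattern where every sample sits at exactly 0dBFS but all of them "pull" the reconstruction the same way at one instant between two samples. This is my own toy construction; the buffer length and resampler are arbitrary, and the exact figure depends on both, but it comes out far beyond the ~3dB a sine can manage:

```python
import numpy as np
from scipy.signal import resample

N = 256
n = np.arange(N)
# Full-scale samples whose signs follow a half-sample-shifted sinc: each sample
# is exactly +/-1 (0dBFS), but every one reinforces the reconstruction at the
# same inter-sample instant near the middle of the buffer.
x = np.sign(np.sinc(n - N / 2 + 0.5))

y = resample(x, 16 * N)                      # 16x oversampled reconstruction
isp_db = 20 * np.log10(np.max(np.abs(y)))
print(f"inter-sample peak: {isp_db:+.1f} dB above full scale")  # roughly +10 dB or more here
```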

G
 
Aug 4, 2017 at 2:28 AM Post #71 of 90
Have we determined "whether ir really matters" yet? I decided the answer was no in the very beginning of the thread, but what do I know? I just normalize with a bit of headroom and don't worry about it.
 
Aug 4, 2017 at 7:05 AM Post #72 of 90
Have we determined "whether ir really matters" yet? I decided the answer was no in the very beginning of the thread, but what do I know? I just normalize with a bit of headroom and don't worry about it.
1. At a consumer level your approach is probably the most appropriate.
2. At a pro level, I am expecting more care.
The good thing is that nowadays it is quite easy to get a good estimate of the True Peak Level of your digital audio file/stream alone (DAC & following elements not included).
 
Aug 4, 2017 at 9:28 AM Post #73 of 90
Have we determined "whether ir really matters" yet?

Yes, it definitely does matter. The question, though, is: does it matter to you personally? If you have a DAC which has already been designed with a reasonable amount of headroom for ISPs, then no, it doesn't matter. Even if you don't have such a DAC it still might not matter, depending on what you listen to and how much digital clipping bothers you. You did say that you noticed some clipping when converting to AAC though. AAC, like MP3, over-samples as part of its encoding, and with significantly more sample points you're more likely to run into ISP clipping, although I believe AAC does have some headroom and is generally more forgiving than MP3 as far as ISPs are concerned. If you are getting ISP clipping (and it bothers you), there is no solution except lowering the peak sample level of whatever you're encoding, until the ISPs no longer clip when you encode.
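A crude way to do that lowering in software before handing the file to an encoder (a sketch only; the function name and the -1dBTP target are my own choices, and the 4x-oversampled estimate is approximate):

```python
import numpy as np
from scipy.signal import resample_poly

def make_encoder_safe(x, target_dbtp=-1.0, oversample=4):
    """Attenuate x so its estimated true peak sits at or below target_dbtp."""
    upsampled = resample_poly(x, oversample, 1)              # rough reconstruction
    tp = 20 * np.log10(np.max(np.abs(upsampled)) + 1e-12)    # estimated true peak in dBTP
    gain_db = min(0.0, target_dbtp - tp)                     # only ever trim, never boost
    return x * 10 ** (gain_db / 20)
```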

G
 
Aug 4, 2017 at 9:50 AM Post #74 of 90
Reading the thread from the beginning... it looks like we are back to the age-old "practical vs. theoretical" argument.

cloggins: "Insufficient headroom" which causes a clip ... that falls into the broad category of "broken."

Depends on your definition of "broken". Typically that means a thing is defective relative to its original design and construction because of some failure, damage, or defect in construction. What you mean by "broken" is that the device is operating without defect relative to the original design, but the original design is flawed. True, perhaps, but that includes millions, possibly most, of the DACs out in the field. So your definition of "broken" is also the definition of the norm. We need to be aware of the average and the norm. That means, "broken" or not, we must work with it if we don't want our audio clipped at the DAC, ever.

Arpiben: "The good thing is that nowadays it is quite easy to have a good estimation of True Peak Levels of your digital audio file/stream only (DAC & following elements not included)."
The problem with metering, even true peak, is we don't know how "broken" the norm is on an individual level, which makes accurate compensation difficult, so we have to resort to a general compensation. We also are working against expectations of peak and loudness levels. Yes, the ideal solution is to build a DAC that won't clip on an ISP, but that's not going to happen, even for the top quarter of the bell curve, for quite a while, if ever. You can't pre-compensate accurately for an unknown degree of problem, so the compensation is a compromise based on understanding the application.

So the binary answer is "yes". It matters. There's a question of degree as it relates to application and even audibility.
 
Aug 4, 2017 at 10:15 AM Post #75 of 90
1. I'm not sure I'd say "broken". The problem is, what is "insufficient headroom"? Typically about 3dB should be enough but that would still leave a small percentage of commercial audio clipped.

"Broken" = "insufficient." Different terminology describing design deficiency. In a properly designed audio chain (Analog => ADC => processing => DAC => amp) signal integrity is maintained, and sufficient headroom is designed-in. If the original analog signal is not clipped, and the ADC properly converts the signal, the resultant PCM data will be unclipped 100% of the time. Yes, PCM processing (DAW, plugins, etc.) can screw anything up via user error, or poor design, or not having proper metering to assess PCM clipping. That's poor design or operator error, and has nothing to do with "headroom above 0dBFS." In PCM, there is no such thing as "headroom above 0dBFS." Any attempted signal above true 0dBFS in an ADC or DAC is a clip, therefore "undefined" or "invalid" in Nyquist space (caveat: if the ADC or DAC designer arbitrarily set 0dBFS at some level below true rail-rail (Vpp) clipping, then yes, there could be "headroom above 0dBFS" -- which is actually a common design technique, more about the historical migration from analog to digital signal paths than an objective design requirement). If an operator has maintained unclipped program through the processing stage, then a properly designed DAC will maintain unclipped data back to analog, 100% of the time. Inter-sample clipping is a result of poor product design and/or operator misuse.

your statement is pretty much irrelevant as there is almost no commercial audio content available which hasn't been mixed and therefore an "original signal" is not what gets distributed to consumers.

My statement is entirely relevant throughout the entire audio sampling and processing chain (see above). A properly designed digital mixing engine, or any digital processing system, will maintain unclipped signal integrity unless the operator pushes the program level beyond full scale. In a properly designed digital metering system, all peaks are identified. In a poorly designed digital metering system, some peaks are missed, which is missed inter-sample information. Poorly designed tools, nothing more. We also know that most mixing engines are based on a 32-bit IEEE float topology, which assures that all 24 audio bits are maintained perfectly (the actual mixing algorithm is a different story, which is why different DAWs sound different after a mix), hence any "missed clipping" after mixing results from either insufficient tools or operator error.
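A trivial illustration of the float-headroom point (values are arbitrary, and this is only a sketch of the principle, not how any particular DAW's mix bus is implemented): 32-bit float happily carries intermediate values above 1.0, so a hot mix bus doesn't clip until someone converts it back to fixed point or feeds a DAC without trimming it first:

```python
import numpy as np

# Two near-full-scale tracks summed on a 32-bit float mix bus.
t = np.linspace(0, 100, 48000)
a = (0.9 * np.sin(2 * np.pi * t)).astype(np.float32)
b = (0.9 * np.sin(2 * np.pi * t)).astype(np.float32)

bus = a + b                      # peaks around 1.8, i.e. "above 0dBFS"
print(np.max(np.abs(bus)))       # ~1.8: float simply stores it, nothing clips

out = bus * np.float32(0.5)      # trim the master fader before export / the DAC
print(np.max(np.abs(out)))       # back under 1.0 with the waveform intact
```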
 
Last edited:
