Possible Explanations.
Core Audio has a variety of features, all of which can interfere with the sound output if you are looking for 1-to-1 (and 0-to-0) bit-perfect output. I'm not clear whether you tested iTunes and VLC running concurrently, but any time hog mode is off, Core Audio's mixer -- which combines the output of multiple programs together before sending it to the device -- can get involved, and that's a whole can of worms. Heck, ANYTHING can interfere at that point, even something outputting the simplest beep! That's why hog mode is an important feature. That alone could explain the screw-ups. (I have more to say on iTunes in a different post.)
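If you want to check whether hog mode is actually engaged, you can take it (and verify it) yourself through the HAL. This is only a minimal sketch: it assumes you've already looked up the AudioDeviceID of your output device, and per the HAL headers the value you pass when setting kAudioDevicePropertyHogMode is ignored -- setting the property simply requests exclusive access for the calling process.

    import CoreAudio
    import Foundation

    // Request exclusive (hog) access to an output device so the system mixer
    // and other programs cannot touch the stream. deviceID is assumed to be
    // an AudioDeviceID you have already looked up.
    func takeHogMode(on deviceID: AudioDeviceID) -> Bool {
        var address = AudioObjectPropertyAddress(
            mSelector: kAudioDevicePropertyHogMode,
            mScope: kAudioObjectPropertyScopeGlobal,
            mElement: kAudioObjectPropertyElementMaster)

        // The value passed in is ignored by the HAL; setting the property
        // toggles exclusive access for the calling process.
        var pid: pid_t = getpid()
        guard AudioObjectSetPropertyData(deviceID, &address, 0, nil,
                                         UInt32(MemoryLayout<pid_t>.size), &pid) == noErr
        else { return false }

        // Read it back: the property now holds the pid of the owning process,
        // or -1 if the device is free for everyone (i.e. hog mode is off).
        var owner: pid_t = -1
        var size = UInt32(MemoryLayout<pid_t>.size)
        guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &owner) == noErr
        else { return false }
        return owner == getpid()
    }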
It may be that iTunes et al. are engaging some additional Core Audio feature such as the Equalizer, which can be active even if you don't see its interface. One other issue is the constant recommendation to turn the System volume and the iTunes volume to maximum to achieve bit-for-bit output. I believe this is specific to USB, but I'm not sure. When I use SPDIF output, the System volume is greyed out, but I can still adjust the volume in iTunes. (Decibel does not allow you to adjust volume unless it's enabled in Preferences!)
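As a side note, you can query whether the OS even has a volume control for a device; pure digital outputs like SPDIF usually don't expose one, which is why the slider is greyed out. A rough sketch, again assuming you already have the device's AudioDeviceID:

    import CoreAudio

    // Returns true if the device exposes a settable master output volume.
    // Digital outputs like SPDIF typically do not, which is why the
    // System volume slider is greyed out for them.
    func hasSettableVolume(_ deviceID: AudioDeviceID) -> Bool {
        var address = AudioObjectPropertyAddress(
            mSelector: kAudioDevicePropertyVolumeScalar,
            mScope: kAudioObjectPropertyScopeOutput,
            mElement: kAudioObjectPropertyElementMaster)
        guard AudioObjectHasProperty(deviceID, &address) else { return false }
        var settable = DarwinBoolean(false)
        return AudioObjectIsPropertySettable(deviceID, &address, &settable) == noErr
            && settable.boolValue
    }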
Also remember that Core Audio deals in floating point. Anything that does math on the floating-point samples -- volume is just a multiplication, for example -- will introduce error. Your DAC cannot deal with floating point, so there must be a conversion back to integer before output, and that involves truncation or rounding error if you have done any math or bit manipulation on the samples inside Core Audio. Once again, the Equalizer or the mixer can be major culprits here. One engineer said the conversion to floating point is analogous to lossless compression; that's only true if you don't touch the data after conversion. Once you do any math or mix or transform on the floating-point data, you are dealing with something analogous to lossy compression. That means if you're not in hog mode, or if your volume is not at unity, you have introduced potential error.
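You can see the loss in a few lines of Swift. The numbers are made up for illustration, and Core Audio's own converters may round or dither rather than truncate, but the information loss is the same in kind:

    import Foundation

    // Conversion alone is lossless: any 16-bit sample fits exactly in a
    // Float32 (24-bit mantissa), so an untouched round-trip is bit-for-bit.
    let sample: Int16 = 12345
    let asFloat = Float32(sample)
    print(Int16(asFloat) == sample)              // true

    // One multiply ruins that. At a volume of 0.5, two different source
    // samples collapse onto the same output word once truncated back to
    // integer -- the original bits are gone, just like lossy compression.
    let gain: Float32 = 0.5
    print(Int16(Float32(Int16(100)) * gain))     // 50
    print(Int16(Float32(Int16(101)) * gain))     // 50: distinct inputs, same output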
On "Bit Perfect."
One last comment, this on "bit perfect." Here is an oft-cited paper on Computer Audiophile "proving" bit-perfect output.
http://www.acourate.com/OperatingSystemsHandlingOfSampleRates.pdf
There is one HUGE problem, though.
They never compared any bits! Comparing waveforms does not prove bit-perfect output. The only way to do that is to compare the bits from the file (losslessly converted to LPCM) directly against the bits being output. From what I can tell of your test, you've fallen into the same trap. Sure, the waveform looks identical in some pictures, but there could still be bits that were lost or flipped and that error correction took care of.
I appreciate your work and do not want to diminish it, but for accuracy's sake, you proved they are "bit similar" rather than "bit perfect." They may actually be bit perfect, but you have to compare the bits, not a graphical representation of the waveform.
Bit perfect is only bit perfect if you compare the digital data bit-for-bit.
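The comparison itself is the easy part once you have both bitstreams. Here's a rough sketch -- the filenames are hypothetical, and it assumes you've already decoded the file to raw LPCM and captured the digital output (say, over an SPDIF loopback) with sample-accurate alignment, which is the genuinely hard part:

    import Foundation

    // Compare the decoded source LPCM against a capture of the device output.
    // Both paths are placeholders; the capture must already be aligned to the
    // first sample of the source for a byte-for-byte test to mean anything.
    let source = try Data(contentsOf: URL(fileURLWithPath: "source_lpcm.raw"))
    let capture = try Data(contentsOf: URL(fileURLWithPath: "captured_output.raw"))

    let n = min(source.count, capture.count)
    let bitPerfect = n > 0 && source.prefix(n) == capture.prefix(n)
    print(bitPerfect ? "bit perfect" : "bit similar at best")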
-Pie