I would love to see some experimentation around that and the hypotheses driving those experiments.
Some thoughts:
- Inter-sample timing differences? (caused possibly by more "effort" to decode FLAC vs WAV?)
- FLAC clipping and WAV not clipping, or vice versa?
- Differences in hardware decoding effects/floating-point math/etc.?
- Electrical noise caused by the CPU working harder when decoding FLAC?
What eventually leaves your computer and travels to the DAC is in both cases raw PCM, i.e. the same audio data a WAV file contains. FLAC doesn't get sent out to your DAC; it gets decompressed first.
The compression algorithm used for FLAC is mathematically lossless: the bits you put in are precisely the bits you get out. You can think of FLAC like a ZIP file. It uses a bunch of relatively simple math to reduce the number of bits required to store the information, without losing the ability to restore the original data bit-perfectly by applying the inverse of the math that compressed it in the first place.
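Here's a toy sketch of that idea in Python. This is not FLAC's actual algorithm (FLAC uses linear prediction plus Rice coding), just an illustration of how an invertible transform can shrink data without destroying any of it:

```python
# Toy illustration only, not FLAC's real algorithm: delta-encode a few
# hypothetical PCM samples, then reconstruct them exactly by inverting
# the math.
samples = [1000, 1004, 1010, 1011, 1009]

# "Compress": keep the first sample, then store only the differences.
# The deltas are small numbers, so they need fewer bits to store.
deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
# deltas == [1000, 4, 6, 1, -2]

# "Decompress": a running sum is the exact inverse of the delta step.
restored = []
total = 0
for d in deltas:
    total += d
    restored.append(total)

assert restored == samples  # bit-for-bit identical to what we started with
```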
You can easily test that by comparing the original WAV file with the uncompressed data from a FLAC file that was created from that same WAV file. They will be precisely the same.
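If you want to try that yourself, here's a minimal sketch, assuming the `soundfile` Python library is installed and using `original.wav` and `encoded.flac` as hypothetical stand-ins for your own file pair (the FLAC encoded from exactly that WAV):

```python
# Minimal round-trip check: decode both files to integer PCM and
# compare sample for sample. Requires: pip install soundfile numpy
import numpy as np
import soundfile as sf

wav_data, wav_rate = sf.read("original.wav", dtype="int32")
flac_data, flac_rate = sf.read("encoded.flac", dtype="int32")

assert wav_rate == flac_rate, "sample rates differ"
assert np.array_equal(wav_data, flac_data), "samples differ"
print("Bit-identical: the FLAC round trip lost nothing.")
```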
This is in contrast to lossy compression formats, which first filter actual information out of your audio data, information the format's developers deemed unnecessary for one reason or another, using a variety of digital signal processing methods. That reduced dataset is then compressed mathematically to take up even less space. You can still use the inverse math to decompress the data, but what you end up with is what was left after the digital signal processing was done, not the original.
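Extending the toy sketch from above: here a crude quantization step stands in for the psychoacoustic DSP of a real lossy codec. The lossless math still inverts perfectly, but only back to the reduced data:

```python
# Toy illustration of lossy compression: throw away detail first
# (crude quantization standing in for real psychoacoustic processing),
# then apply the same invertible delta encoding as before.
samples = [1000, 1004, 1010, 1011, 1009]

quantized = [s // 4 * 4 for s in samples]  # irreversible information loss
deltas = [quantized[0]] + [b - a for a, b in zip(quantized, quantized[1:])]

# Invert the lossless stage.
restored, total = [], 0
for d in deltas:
    total += d
    restored.append(total)

assert restored == quantized  # the lossless stage inverts perfectly...
assert restored != samples    # ...but the discarded detail is gone for good
```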
Decompressing FLAC puts a negligible load on modern CPUs. Even CPUs from 20 years ago barely broke a sweat doing it. So I'd be surprised if you "heard" a difference there.
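You can put a number on that yourself. A rough sketch, again assuming `soundfile` is installed and using a hypothetical `music.flac`:

```python
# Measure how long a full FLAC decode takes compared to the length of
# the audio itself. Requires: pip install soundfile
import time
import soundfile as sf

start = time.perf_counter()
data, rate = sf.read("music.flac")  # decodes the entire file to PCM
elapsed = time.perf_counter() - start

duration = len(data) / rate  # audio length in seconds
print(f"Decoded {duration:.1f}s of audio in {elapsed:.3f}s "
      f"({duration / elapsed:.0f}x faster than real time)")
```

On any remotely recent machine, expect that ratio to land in the hundreds or thousands.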
Unless your DAC is built into your computer and sharing the same power source as your CPU, you won't hear any electrical noise from your CPU, or any other difference for that matter.
UAC, the USB Audio Class protocol used to send audio information over a USB cable, typically runs in asynchronous mode for gear like this. That means the audio data is not sent as a continuous "real-time stream" but in chunks, which are then buffered on the receiving end. And remember, this is uncompressed PCM audio information in all cases; there's no difference whether the originating file was FLAC or WAV or even MP3.
That buffer then feeds a continuous stream of audio data to the actual DAC chip.
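A highly simplified sketch of that producer/consumer idea (names and sizes here are illustrative, not from any real driver):

```python
# Packets arrive in irregular bursts; the DAC side drains the buffer at
# a perfectly steady, clock-driven rate. The buffer absorbs the timing
# mismatch between the two.
from collections import deque

buffer = deque()

def on_usb_packet(samples):
    """Called whenever a chunk of PCM arrives; arrival timing may be irregular."""
    buffer.extend(samples)

def dac_tick():
    """Called by the DAC's own clock, exactly once per sample period."""
    if buffer:
        return buffer.popleft()  # steady output, paced by the DAC clock
    return 0  # underrun; in practice the buffer is sized so this never happens
```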
This is also why the quality of your USB connection doesn't really matter, and why the clock inside your DAC is what makes or breaks the quality of the audio signal that you get out of your DAC.
The only things that matter here are the cleanliness of the data stream from the buffer of the USB receiver to the DAC chip, and how constant the clock is that feeds the DAC chip. If the data stream gets interfered with badly enough, the 1s and the 0s (the highs and lows within the signal, whereby the lows are never perfectly at 0V and the highs never perfectly at 3.3 or 5V, but somewhere in between, and with rather noticeable slew rates and ringing on the leading edges) may no longer be perfectly identifiable and thus become prone to misinterpretation.

And if the clock that controls that data stream and the DAC chip doesn't remain perfectly constant at all times, you end up with inconsistencies in the frequencies the DAC puts out, somewhat similar in concept to the inconsistencies in frequency you get from the slight variations in the speed at which the needle travels through the groove of a warped LP, even if the rotational speed of the platter the LP sits on is perfectly constant.
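To get a feel for how little clock wander it takes, here's a small numeric sketch. It samples a 1 kHz sine at 48 kHz twice, once with perfect timing and once with the sampling instants randomly displaced, then measures the resulting error. The 2 ns jitter figure is purely illustrative:

```python
# Illustrative only: show how timing jitter alone corrupts a signal
# even when every sample value is computed perfectly.
import numpy as np

fs, f = 48_000, 1_000.0
n = np.arange(fs)  # one second of sample indices
t_ideal = n / fs
t_jitter = t_ideal + np.random.normal(0.0, 2e-9, size=n.size)  # ~2 ns RMS

ideal = np.sin(2 * np.pi * f * t_ideal)
jittered = np.sin(2 * np.pi * f * t_jitter)

rms_error = np.sqrt(np.mean((jittered - ideal) ** 2))
print(f"RMS error: {20 * np.log10(rms_error):.0f} dB relative to full scale")
```

Even nanosecond-scale timing errors put a measurable noise floor under the signal, and the damage grows with signal frequency.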
This buffering, by the way, is also the main reason why Unison is able to outperform all the other ways you can connect your sources to your DAC. Thanks to Unison, Schiit has full control over every part of the signal path that actually matters, including the buffer itself and the stretch between the buffer and the DAC. Toslink, AES, and SPDIF are real-time streams; any fluctuation or interruption there directly affects the quality of the information the DAC can work with. And off-the-shelf USB controllers do their own buffering and give you little to no control over what they do and how they do it.
And no, the buffer used for USB doesn't introduce any meaningful delay. That buffer is large enough to bridge the gaps between UAC audio data packets, and short enough to still count as real-time as far as a human brain is concerned. The exact delay varies by manufacturer but is seldom longer than a few milliseconds. For context: the human brain has been shown to be incapable of reliably and repeatably detecting an offset in synchronization between audio and visual information as long as that offset stays below a few tens of milliseconds.
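The arithmetic is simple enough to sanity-check. The buffer size here is made up for illustration:

```python
# Back-of-the-envelope buffer latency: frames buffered divided by the
# sample rate. 192 frames is an illustrative figure, not any specific
# product's buffer size.
sample_rate = 48_000   # samples per second
buffer_frames = 192    # roughly a couple of UAC packets' worth

latency_ms = buffer_frames / sample_rate * 1000
print(f"{latency_ms:.1f} ms")  # 4.0 ms, far below the tens of milliseconds
                               # a human can reliably detect
```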
So, if you heard any differences between your FLAC and WAV files, and you weren't using hardware from two decades ago, I can think of three potential causes for what you experienced:
One: The FLAC and WAV files you compared were not created from the same source material.
Two: One of them was DSP'd somewhere along the way, or they didn't take the same route through your computer's audio pipelines. (As in: you used your computer's built-in audio player to play the WAV file and Roon, with its own audio pipeline, to play the FLAC.)
Three: Imagination. Your brain is hardwired to interpret sensory input differently depending on the expectations set before it receives that input. Biases are a bitch to avoid; that's just human.