Schiit Happened: The Story of the World's Most Improbable Start-Up
Jul 15, 2023 at 7:10 PM Post #121,561 of 150,628
I would love to see some experimentation around that and the hypotheses driving those experiments.

Some thoughts:
  1. Inter-sample timing differences? (caused possibly by more "effort" to decode FLAC vs. WAV?)
  2. FLAC clipping and WAV not clipping, or vice versa?
  3. Differences in hardware decoding effects, floating-point math, etc.?
  4. Electrical noise caused by the CPU working harder when decoding FLAC?

What eventually leaves your computer and travels to the DAC is in both cases uncompressed PCM, the same raw sample data a WAV file contains. FLAC doesn't get sent out to your DAC; it gets decompressed first.

The compression algorithm that is used for FLAC is mathematically absolutely lossless. The bits you put in are precisely the same you will get out. You can think of FLAC like a ZIP file, it simply uses a bunch of relatively simple math to reduce the bits required to store the information without losing the ability to restore the original data bit-perfectly through essentially the inverse of the math that was used to compress it in the first place.
You can easily test that by comparing the original WAV file with the uncompressed data from a FLAC file that was created from that same WAV file. They will be precisely the same.
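That round trip can be sketched in a few lines, with zlib (the DEFLATE library behind ZIP files) standing in for FLAC's own compression math; the "audio" here is a synthetic sine tone rather than a real WAV file, but the principle is identical: a lossless codec must reproduce its input bit-for-bit.

```python
import math
import struct
import zlib

# zlib stands in for FLAC's compression math: both are lossless,
# so compress -> decompress must reproduce the input bit-for-bit.
rate = 44_100
samples = [int(20_000 * math.sin(2 * math.pi * 440 * n / rate)) for n in range(rate)]
pcm = struct.pack(f"<{len(samples)}h", *samples)   # 1 s of 16-bit mono "audio"

compressed = zlib.compress(pcm, level=9)
restored = zlib.decompress(compressed)

assert restored == pcm   # bits out == bits in, just like WAV -> FLAC -> WAV
print(f"compressed to {len(compressed) / len(pcm):.0%} of original size")
```

The assertion is the whole point: if even one bit differed, lossless compression would be broken.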

This is in contrast to lossy compression formats, which first use a variety of digital signal processing methods to filter out actual information that the developer of the format deemed unnecessary for one reason or another. That reduced dataset is then compressed mathematically to take up even less space. You can still use the inverse math to uncompress the data, but what you end up with is the post-DSP data, not the original recording.

The processing power required to decompress FLAC is negligible on modern CPUs. Even on CPUs from 20 years ago, it barely makes a dent. So I'd be surprised if you "heard" a difference there.
Unless your DAC is built into your computer and sharing the same power source as your CPU, you won't hear any electrical noise from your CPU, or any other difference for that matter.

UAC (USB Audio Class), the protocol used to send audio information over a USB wire, is asynchronous: the audio data is not sent as a "real-time stream" but in chunks, which are then buffered on the receiving end. And remember, this is uncompressed PCM audio information in all cases; there's no difference whether the originating file was FLAC, WAV, or even MP3.
That buffer is then used to feed a continuous stream of audio data from the buffer to the actual DAC chip.
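A minimal sketch of that receive-in-chunks, play-out-steadily idea, using a plain Python deque as a stand-in for the receiver's hardware buffer (packet sizes and sample values are made up for illustration):

```python
from collections import deque

# Toy model of asynchronous USB audio: packets arrive in bursts,
# the receiver buffers them, and the DAC side drains at a steady rate.
buffer = deque()

def receive_packet(samples):
    """Host sends a chunk whenever it likes (bursty, variable timing)."""
    buffer.extend(samples)

def dac_tick():
    """DAC side pulls exactly one sample per clock tick, steadily."""
    if not buffer:
        raise RuntimeError("buffer under-run: source fell behind")
    return buffer.popleft()

# Bursty input: two packets of different sizes arriving back to back...
receive_packet([0.1, 0.2, 0.3])
receive_packet([0.4, 0.5])
# ...still yields one perfectly ordered sample per tick on the way out.
out = [dac_tick() for _ in range(5)]
print(out)  # [0.1, 0.2, 0.3, 0.4, 0.5]
```

Notice that nothing about the input timing survives on the output side; only running dry (an under-run) would be audible.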
This is also why the quality of your USB connection doesn't really matter, and why the clock inside your DAC is what makes or breaks the quality of the audio signal that you get out of your DAC.
The only things that matter here are the cleanliness of the data stream from the USB receiver's buffer to the DAC chip, and how steady the clock is that feeds the DAC chip. If the data stream gets interfered with badly enough, the 1s and 0s may no longer be perfectly identifiable and thus become prone to misinterpretation. (The highs and lows within the signal are never perfectly at 3.3 or 5 V and 0 V respectively, but somewhere in between, with rather noticeable slew rates and ringing on the leading edges.) And if the clock that controls that data stream and the DAC chip doesn't remain perfectly constant at all times, you end up with inconsistencies in the frequencies the DAC puts out, somewhat like the pitch wobble you get from the slight variations in needle speed through the groove of a warped LP, even when the platter's rotational speed is perfectly constant.
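To put a rough number on the clock part, here is a toy simulation that samples a test tone at slightly jittered instants instead of perfectly regular ones. The tone, sample rate, and jitter figure are all illustrative, not measurements of any real DAC:

```python
import math
import random

random.seed(0)

rate, tone = 48_000, 1_000      # sample rate and test tone in Hz (illustrative)
jitter_rms = 2e-9               # 2 ns RMS of clock jitter, a made-up figure

# Samples taken at ideal instants vs. instants nudged by random clock error.
ideal = [math.sin(2 * math.pi * tone * n / rate) for n in range(rate)]
jittered = [math.sin(2 * math.pi * tone * (n / rate + random.gauss(0.0, jitter_rms)))
            for n in range(rate)]

worst = max(abs(a - b) for a, b in zip(ideal, jittered))
print(f"worst-case sample error from jitter: {worst:.2e}")
```

The error scales with both the jitter and the signal frequency, which is why a steady clock matters more the higher the frequencies a DAC has to reproduce.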

This buffering, by the way, is also the main reason why Unison is able to outperform all other ways you can connect your sources to your DAC. Thanks to Unison, Schiit has full control over the parts of the signal path that actually matter, including the buffer itself and the bit between the buffer and the DAC. Toslink, AES, and SPDIF are real-time streams; any fluctuations or interruptions there directly affect the quality of information the DAC can work with. And off-the-shelf USB controllers do their own buffering and give you little to no control over what they do or how they do it.

And no, the buffer used for USB doesn't introduce any meaningful delay. It is large enough to bridge the gaps between UAC audio data packets, yet short enough to still count as real-time as far as a human brain is concerned. The exact delay varies by manufacturer but is seldom longer than a few milliseconds. For context: the human brain has been demonstrated to be incapable of reliably and repeatably detecting an offset in synchronization between audio and visual information as long as that offset remains below a few tens of milliseconds.
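The few-milliseconds claim is easy to sanity-check with back-of-the-envelope arithmetic; the buffer depth below is a hypothetical figure, since real receivers differ by manufacturer:

```python
# Back-of-the-envelope: how much delay a USB audio buffer actually adds.
sample_rate = 44_100        # samples per second (Redbook rate)
buffer_frames = 256         # a plausible, hypothetical receiver-side buffer depth

latency_ms = buffer_frames / sample_rate * 1000
print(f"{latency_ms:.1f} ms")  # ≈ 5.8 ms, well under the tens-of-ms A/V threshold
```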

So, if you heard any differences between your FLAC and WAV files, and you didn't use hardware from two decades ago, I could think of three potential causes for what you experienced:
One: The FLAC and WAV files you compared were not created from the same source material.
Two: One of them was DSP'd in some way along the way or they didn't take the same routes through your computer's different audio pipelines. (…as in: You used your computer's built-in audio player application to play the WAV file and Roon with its own audio pipeline to play the FLAC.)
Three: Imagination. Your brain is hardwired to interpret sensory information differently depending on the parameters that were set before it experienced the sensory information. Biases are a bitch to avoid, that's just human.
 
Jul 15, 2023 at 7:19 PM Post #121,563 of 150,628
One of my personal favorites was a company I once contracted with that rated the quality* of the work their internal and external programmers did by the number of lines of code they committed to a project by the end of each day. You can probably imagine what kind of behavior this fostered and the quality of the product they eventually ended up with.
So did you score high on your rating? :relaxed:

For software geeks only...
https://www.reddit.com/r/Programmer...ound_this_at_work_someone_padded_a_repo_with/
 
Jul 15, 2023 at 7:22 PM Post #121,564 of 150,628
Please, can you comment on whether the URD improves the USB signal in comparison to a set-up without URD.
I mean Macbook > URD > DAC vs Macbook > DAC.
Thanks

Matt
IMO it improves the sound considerably. It noticeably lowers the noise floor, and the quality of the sound got a huge boost. I am using Apple Music lossless. Urd is made for someone like myself who prefers CD and supplements that with streaming. I used to own a Roon Nucleus, and streaming through the Urd sounds much, much better than when I streamed through a computer or the Roon Nucleus.
 
Jul 15, 2023 at 7:24 PM Post #121,565 of 150,628
So did you score high on your rating? :relaxed:

That is an extreme example of what eventually happened, but 100% accurate in principle. Whenever someone had the choice between copy-pasting code into two or more sections and wrapping it in a separate function, people always chose the copy-paste method to pad their quotas. That, and other more or less creative ways of doing something in ten lines of code that could just as well be done in one or two.
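For illustration, the two styles side by side; the function names and the validation logic are invented for the example, not taken from that project:

```python
# The padded style: the same validation copy-pasted at every call site.
def save_user(name):
    if name is None:
        raise ValueError("missing name")
    if not name.strip():
        raise ValueError("blank name")
    return f"saved {name}"

def greet_user(name):
    if name is None:
        raise ValueError("missing name")
    if not name.strip():
        raise ValueError("blank name")
    return f"hello {name}"

# The honest style: one shared helper, fewer lines, one place to fix bugs.
def require_name(name):
    if name is None or not name.strip():
        raise ValueError("invalid name")
    return name

def save_user_v2(name):
    return f"saved {require_name(name)}"

def greet_user_v2(name):
    return f"hello {require_name(name)}"
```

Both versions behave identically; one of them just "produces" twice as many lines per day.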

And I eventually asked to be let out of the contract because I was indeed asked to "up my efforts" on a bi-weekly basis — and refused to do so.
 
Jul 15, 2023 at 7:29 PM Post #121,567 of 150,628
You may not "hear" noise from your cpu directly, but it can make its way into your dac, and induce jitter. In digital systems, everything matters.
 
Jul 15, 2023 at 7:38 PM Post #121,568 of 150,628
Incorrect. Re-read the bit about the buffer.
As long as the source is not so slow that it causes a buffer under-run on the DAC's side, any variations in transmission speed or in the length of gaps between UAC packets simply get ironed over by that buffer. And what happens after that buffer is independent of your source and entirely up to how it was implemented by the DAC manufacturer.
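The under-run condition can be put in concrete terms: the buffer only fails when a gap between packets exceeds the playback time the buffered samples cover. All figures below are hypothetical:

```python
# When does a USB audio buffer under-run? Only when the gap between
# packets exceeds the playback time the buffer can cover.
sample_rate = 48_000
buffer_frames = 480                      # 10 ms of audio buffered (hypothetical)

max_gap_ms = buffer_frames / sample_rate * 1000
for gap_ms in (1, 5, 9, 12):
    ok = gap_ms <= max_gap_ms
    print(f"{gap_ms:>2} ms gap -> {'fine, ironed over' if ok else 'UNDER-RUN'}")
```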

And please don't conflate electrical noise and jitter, those are two entirely different things.
 
Jul 15, 2023 at 7:43 PM Post #121,569 of 150,628
Maybe you should reread my post. I didn't say noise and jitter are the same thing; I said noise induces jitter. A buffer is not a magic bullet against noise, and it's been widely reported that people hear differences in USB sources even with Unison.

Or, maybe I should have loaded my post down with lots of facts probably already known by 80% of the people here, injected a serious tone, and thus improved the truthiness of my statements.
 
Jul 15, 2023 at 7:43 PM Post #121,570 of 150,628
Listening to the 24/192 remaster of Tom Waits' Rain Dogs, I think I may have found the limitations of my gear (Or perhaps my ears?). It doesn't sound any better than 16/44 Redbook. This seems like it was a waste of time, though of course, it will probably make money for the record company and Mr. Waits, so it could be argued it wasn't.
The new Waits remasters are exactly why I won't be able to fully switch to streaming ever… they are a disaster. The original CD releases are the way to go and sound exceptional.
 
Jul 15, 2023 at 8:02 PM Post #121,571 of 150,628
Yes, I'm listening to the 16/44 right now and it IS better.
 
Jul 15, 2023 at 8:03 PM Post #121,572 of 150,628
I once heard a quote that I don't know the source of, "Numbers aren't facts." It kind of blew my mind at the time because of how obviously true it is, but also because of how long I'd gone through life as a good little citizen of the western world without ever having really been exposed to the idea so directly.

The IT department where I work lives and dies by what I would call bullschiit metrics. Things like call abandonment rate and percentage of tickets updated within a day. We even see graphs of who performs best on these metrics at our all hands meetings. Some of the people who rate best are the same ones who I've had subjectively terrible experiences dealing with when trying to get help. Go figure!

Oh my - Triggering!!! Wow.
 
