Just a thought about Lossless vs. WAV
Sep 15, 2006 at 7:20 PM Thread Starter Post #1 of 34

Gnus (100+ Head-Fier, joined Apr 21, 2006, 294 posts, 10 likes)
I know this topic has been done to death but I just had a thought about maybe why people hear differences when using lossless vs. wav files.

When used in a DAP, wouldn't the decoding process of the codec affect the sound quality in some way? For example, let's look at Rockbox since it supports FLAC, ALAC, and WAV. The ALAC decoder is still being worked out, while the FLAC decoder is probably the most used and most refined, so could this affect the quality of the music that comes out of it?

Another way to look at it is support for a codec in a DAP that just came out and is still in beta, versus one that is stable and finalized. Would this contribute to changes in SQ?
 
Sep 15, 2006 at 7:54 PM Post #3 of 34
Quote:

Originally Posted by Kees
I think (and posted) that this is obviously true.
There were some people around who can't see the logic of it.
They like to ridicule this idea.
So put on your flame suit.....



the flaming isn't because of this idea, which is reasonable. it's because those who purport to hear a difference ALWAYS say that WAV sounds better than AL, which is dubious. so there's never a problem with WAV decoding...the problem's always with AL decoding? if the files have both been ripped and encoded properly and if the decoding is done properly, there is no difference in how the two filetypes sound. simple as that. so the onus is on the listener to tell us whether the AL decoding is somehow crippled and how they've determined that. unless that information is provided, placebo is all i can say.
 
Sep 15, 2006 at 7:55 PM Post #4 of 34
Quote:

Originally Posted by Gnus
I know this topic has been done to death but I just had a thought about maybe why people hear differences when using lossless vs. wav files.

When used in a DAP, wouldn't the decoding process of the codec affect the sound quality in some way? For example, let's look at Rockbox since it supports FLAC, ALAC, and WAV. The ALAC decoder is still being worked out, while the FLAC decoder is probably the most used and most refined, so could this affect the quality of the music that comes out of it?



No. This is not how computer software works. An exact decoder either works or it doesn't. How long they've been working on it is immaterial. They could have been working on it for an evening and gotten it to work, or they could have been working on it for months and still not have gotten it right. It is simple to write test cases for a lossless decoder; you just compare what you should have gotten with what the decoder produced. Rockbox and Foobar are open source, people have verified that the output is exact, and yet some people still claim to hear differences. It comes down to the placebo effect.

The situation is a little bit different with lossy decoders. For various reasons, the lossy decoding standards do not specify exact output. There is a range of acceptable output, ways to deal with overflow, etc. Thus, people may tweak lossy decoders for a while, and each tweak may produce slightly different output.
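Wodgy's test-case point can be sketched in a few lines. A real FLAC or ALAC decoder isn't needed to show the principle, so this hypothetical example uses Python's zlib (also a lossless compressor) as a stand-in codec; the assertion is exactly the "compare what you should have gotten with what the decoder produced" test.

```python
import os
import zlib

# Stand-in "codec": zlib is lossless, like FLAC/ALAC, so a correct
# decoder must reproduce the original input bit-for-bit.
def encode(pcm: bytes) -> bytes:
    return zlib.compress(pcm)

def decode(blob: bytes) -> bytes:
    return zlib.decompress(blob)

# Fake PCM data: any byte string works for an exactness test.
pcm = os.urandom(44100 * 2 * 2)  # ~1 second of 16-bit stereo audio

decoded = decode(encode(pcm))
# A lossless codec either passes this check or it is simply broken;
# there is no "partially working" middle ground that degrades SQ.
assert decoded == pcm
print("round trip is bit-exact:", decoded == pcm)
```

Real decoders are tested the same way, just against reference output files instead of an in-memory buffer.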
 
Sep 15, 2006 at 8:10 PM Post #7 of 34
Quote:

Originally Posted by Kees
I think (and posted) that this is obviously true.
There were some people around who can't see the logic of it.
They like to ridicule this idea.
So put on your flame suit.....



When people attempt to apply "common sense" to specialized subjects, rarely does the result make any sense. In this case it still does not.
 
Sep 15, 2006 at 8:16 PM Post #8 of 34
Quote:

Originally Posted by K2Grey
When people attempt to apply "common sense" to specialized subjects, rarely does the result make any sense. In this case it still does not.


The problem is that some people here like to apply analog audio concepts to audio software because, hey, "it's audio." It doesn't work that way.
 
Sep 15, 2006 at 9:45 PM Post #9 of 34
Quote:

Originally Posted by kugino
the flaming isn't because of this idea, which is reasonable. it's because those who purport to hear a difference ALWAYS say that WAV sounds better than AL, which is dubious. so there's never a problem with WAV decoding...the problem's always with AL decoding? if the files have both been ripped and encoded properly and if the decoding is done properly, there is no difference in how the two filetypes sound. simple as that. so the onus is on the listener to tell us whether the AL decoding is somehow crippled and how they've determined that. unless that information is provided, placebo is all i can say.


If the idea is reasonable, why is it dubious that people state they can hear the difference?
Nobody EVER said anything about problems with decoding. They probably work as designed. Which MAY be flawed in certain situations. And there is reason (see this idea, which you yourself say is reasonable) to believe that there is a possible difference in the quality of the signal delivery of the diverse codecs. That's all.
And I don't feel I need to prove anything to anybody. I just would like somebody to explain why the idea is not true, or does not result in audible differences.
And I would like a better explanation than "it just doesn't work that way" or "lossless = lossless, there just can't be a difference".
If this were true, there couldn't be an audible difference between DACs either. Because these are just CPUs performing the same algorithm. With the exact same outcome every time.
 
Sep 15, 2006 at 10:02 PM Post #10 of 34
Quote:

Originally Posted by Kees
If the idea is reasonable, why is it dubious that people state they can hear the difference?
Nobody EVER said anything about problems with decoding. They probably work as designed. Which MAY be flawed in certain situations. And there is reason (see this idea, which you yourself say is reasonable) to believe that there is a possible difference in the quality of the signal delivery of the diverse codecs. That's all.
And I don't feel I need to prove anything to anybody. I just would like somebody to explain why the idea is not true, or does not result in audible differences.
And I would like a better explanation than "it just doesn't work that way" or "lossless = lossless, there just can't be a difference".
If this were true, there couldn't be an audible difference between DACs either. Because these are just CPUs performing the same algorithm. With the exact same outcome every time.



i love guys who bold and underline things
makes it so much easier to see.

the dubious part is the extreme asymmetry of the preference. if it is the case that decoding might be in error (which wodgy has already explained isn't the case for lossless), why must it always be the AL decoder? why does WAV always sound better than AL? well, it doesn't. it's a bias that goes into the test. expectations are difficult to overcome.

show me that AL decoders are systematically more prone to errors than WAV decoders...then we'll talk.

and no, that is not exactly how DACs work. DACs come in many flavors (pulse-width modulation, oversampling, etc.) and can sound different depending on the specific algorithms used. DACs can also add noise to the analog signal, resulting in differing THD values, which can make various DACs sound different, too. so no, DAC is not a good example. NEXT!
 
Sep 15, 2006 at 10:02 PM Post #11 of 34
Quote:

Originally Posted by Kees
If the idea is reasonable, why is it dubious that people state they can hear the difference?
Nobody EVER said anything about problems with decoding. They probably work as designed. Which MAY be flawed in certain situations. And there is reason (see this idea, which you yourself say is reasonable) to believe that there is a possible difference in the quality of the signal delivery of the diverse codecs. That's all.
And I don't feel I need to prove anything to anybody. I just would like somebody to explain why the idea is not true, or does not result in audible differences.
And I would like a better explanation than "it just doesn't work that way" or "lossless = lossless, there just can't be a difference".
If this were true, there couldn't be an audible difference between DACs either. Because these are just CPUs performing the same algorithm. With the exact same outcome every time.



Here you go. The actual analog sound is produced by the DAC chip in your digital audio player, which translates the digital information into an analog audio signal; before the digital information hits the DAC it is simply 0s and 1s. The fact that the file is lossless means that after decoding, the 0s and 1s of the lossless file and the wav file are IDENTICAL (which can be easily verified and has been confirmed to be true many times). That means that in the case of a lossless file and a wav file the DAC chip is being fed THE SAME information and thus produces THE SAME analog output. There is nothing complicated about it.

Your argument about all DACs sounding the same is not accurate because different DACs go about translating the digital signal into an analog signal in different ways. Delivery of digital signal to a DAC and translation of digital information into analog signal are not analogous things, they are entirely different processes.

If you hear differences between wav files and lossless files that means that either something is wrong with the way you encode your lossless files (like if DSP processing is enabled, etc.) or you are experiencing a placebo effect (which is something very, very real - psychologically speaking).
 
Sep 15, 2006 at 10:33 PM Post #12 of 34
Quote:

Originally Posted by Wodgy
No. This is not how computer software works. An exact decoder either works or it doesn't. How long they've been working on it is immaterial. They could have been working on it for an evening and gotten it to work, or they could have been working on it for months and still not have gotten it right. It is simple to write test cases for a lossless decoder; you just compare what you should have gotten with what the decoder produced. Rockbox and Foobar are open source, people have verified that the output is exact, and yet some people still claim to hear differences. It comes down to the placebo effect.

The situation is a little bit different with lossy decoders. For various reasons, the lossy decoding standards do not specify exact output. There is a range of acceptable output, ways to deal with overflow, etc. Thus, people may tweak lossy decoders for a while, and each tweak may produce slightly different output.



Thanks for clarifying things, because all this talk about how there IS a difference between ALAC and WAV is unsettling, especially as I go about re-ripping my collection into ALAC.

ABX'ing on foobar and they both sound the same to me, so that's good.
 
Sep 15, 2006 at 11:06 PM Post #13 of 34
Quote:

Originally Posted by fwojciec
Here you go. The actual analog sound is produced by the DAC chip in your digital audio player, which translates the digital information into an analog audio signal; before the digital information hits the DAC it is simply 0s and 1s. The fact that the file is lossless means that after decoding, the 0s and 1s of the lossless file and the wav file are IDENTICAL (which can be easily verified and has been confirmed to be true many times). That means that in the case of a lossless file and a wav file the DAC chip is being fed THE SAME information and thus produces THE SAME analog output. There is nothing complicated about it.

Your argument about all DACs sounding the same is not accurate because different DACs go about translating the digital signal into an analog signal in different ways. Delivery of digital signal to a DAC and translation of digital information into analog signal are not analogous things, they are entirely different processes.

If you hear differences between wav files and lossless files that means that either something is wrong with the way you encode your lossless files (like if DSP processing is enabled, etc.) or you are experiencing a placebo effect (which is something very, very real - psychologically speaking).



Thanks for trying to actually explain.
You say there are tests that prove the digital signal from different lossless codecs to be always identical. Could you please point me to one? I would like to know how they measure (binary compare?) that.
I am not sure how in PCs the digital signal (the output of the codec algorithm) is presented to the DAC. How is this buffered? How fast, how big is the buffer, what sort of buffer space is allocated, how consistent is the speed with which the codec delivers the signal? What sort of subroutine is actually reading the buffer and feeding the DAC? How accurate is this routine? How do these functions differ for different codecs or on different platforms? I just don't seem to be able to find much info on this.
I know from experience that computers often don't perform as accurately as we would like to think. (Never seen any unintentional video effects in a video game?) And I think that different algorithms/codecs can cause different loads/problems for (subroutines of) certain platforms they are using.
This makes me skeptical about the supposed infallibility of the playback of lossless-coded audio.

But if nobody can hear the audible differences this could make: That's fine by me. I can live with that and my "placebo".

Thanks for your effort.
 
Sep 15, 2006 at 11:18 PM Post #14 of 34
Quote:

Originally Posted by Gnus
Thanks for clarifying things, because all this talk about how there IS a difference between ALAC and WAV is unsettling, especially as I go about re-ripping my collection into ALAC.

ABX'ing on foobar and they both sound the same to me, so that's good.



If you don't hear a difference there is nothing for you to worry about, I'd think.
 
Sep 15, 2006 at 11:45 PM Post #15 of 34
Quote:

Originally Posted by Kees
Thanks for trying to actually explain.
You say there are tests that prove the digital signal from different lossless codecs to be always identical. Could you please point me to one? I would like to know how they measure (binary compare?) that.
I am not sure how in PCs the digital signal (the output of the codec algorithm) is presented to the DAC. How is this buffered? How fast, how big is the buffer, what sort of buffer space is allocated, how consistent is the speed with which the codec delivers the signal? What sort of subroutine is actually reading the buffer and feeding the DAC? How accurate is this routine? How do these functions differ for different codecs or on different platforms? I just don't seem to be able to find much info on this.
I know from experience that computers often don't perform as accurately as we would like to think. (Never seen any unintentional video effects in a video game?) And I think that different algorithms/codecs can cause different loads/problems for (subroutines of) certain platforms they are using.
This makes me skeptical about the supposed infallibility of the playback of lossless-coded audio.



You can actually compare wave file and lossless file yourself:

1) Make a wav file
2) Encode it to a lossless file (whichever format)
3) Decode the lossless file into a new wav file
4) Compare the wav file from 1 with the wav file from 3 (I think you can use Total Commander to do it on a PC) - they should be bit-for-bit identical.
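For anyone who would rather script step 4 than use Total Commander, here is a rough Python sketch. The file names are placeholders, and the "round trip" is simulated with a plain copy since no lossless codec ships with Python; in practice the second file would come from step 3. It compares the whole files byte-for-byte, and also just the PCM frames, since a decoder can write slightly different header or metadata chunks even when the audio data itself is identical.

```python
import filecmp
import shutil
import wave

def pcm_frames(path: str) -> bytes:
    """Return only the raw PCM audio data, ignoring header/metadata."""
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

# Self-contained demo: write a small wav file, then "round-trip" it with
# a plain copy (stand-in for encode-to-lossless-and-decode-back).
with wave.open("original.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)        # 16-bit samples
    w.setframerate(44100)
    w.writeframes(bytes(range(256)) * 400)  # arbitrary PCM data
shutil.copyfile("original.wav", "roundtrip.wav")

# Whole-file comparison, like a file manager's binary compare:
print("files identical:",
      filecmp.cmp("original.wav", "roundtrip.wav", shallow=False))

# Audio-only comparison, robust to differing metadata chunks:
print("PCM identical:",
      pcm_frames("original.wav") == pcm_frames("roundtrip.wav"))
```

If the PCM comparison passes, the DAC is fed the same bits either way, which is the whole argument in this thread.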

Think of a lossless file as essentially a zipped/RARed (or otherwise compressed) wav file - that is what they are, except decompression is performed by the codec on-the-fly when the file is played.

As for your other questions - I am not an engineer so I can't answer any of them authoritatively. However, given that the time necessary to decode a lossless file into a wav file is a small fraction of the time it takes to play the file as music, I don't think that the additional step of decoding is going to pose significant challenges during playback. A wav file is also buffered prior to being sent to the DAC, and I wouldn't be surprised if lossless files were buffered in exactly the same way as wav files (or other audio formats): it would simply take slightly longer to fill the buffer in the initial milliseconds of playback in the case of lossless files as compared to wav files. The way the files are buffered will be different in different players - for example, in foobar you can set the length of the playback buffer (it's a single setting for all file formats).

The speed, or rather the regularity, with which the digital bits are delivered to the DAC chip is another issue, quite independent, as far as I know, from codec issues (buffering ensures that, I guess) - this depends on the way the DAC chip is implemented in the circuit, the type of transport used (CD, hard drive, etc.), whether the signal is first converted into an optical signal (as in TosLink), and so on. Irregularities in the way the bits are delivered to the DAC cause jitter, which can decrease SQ. Again, I have never heard anything about lossless vs. wav being a factor in how much jitter is produced.
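The buffering point can be illustrated with a toy producer/consumer sketch (purely illustrative, not any real player's internals): a decoder thread fills a fixed-size buffer as fast as it can, while the playback side drains it at its own pace. As long as the decoder keeps the buffer from running empty, the delivery side never sees which codec produced the samples.

```python
import queue
import threading

BUFFER_CHUNKS = 8                 # fixed-size playback buffer
buf = queue.Queue(maxsize=BUFFER_CHUNKS)
SENTINEL = None                   # marks end of stream

def decoder(chunks):
    """Producer: any codec would sit here; blocks when the buffer is full."""
    for chunk in chunks:
        buf.put(chunk)            # blocks if the buffer is full
    buf.put(SENTINEL)

def playback():
    """Consumer: drains the buffer at its own pace, codec-agnostic."""
    played = []
    while True:
        chunk = buf.get()         # blocking here would be a buffer underrun
        if chunk is SENTINEL:
            break
        played.append(chunk)
    return played

source = [bytes([i]) * 1024 for i in range(32)]   # fake decoded PCM chunks
t = threading.Thread(target=decoder, args=(source,))
t.start()
out = playback()
t.join()
print("chunks delivered in order and intact:", out == source)
```

Whether the producer is a WAV reader or a FLAC decoder only changes how hard it works to keep the buffer full; the bits coming out the other side are the same either way.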
 
