Quote:
Originally Posted by Kees
Thanks for trying to actually explain.
You say there are tests that prove the digital signal from different lossless codecs to be always identical. Could you please point me to one? I would like to know how they measure (binary compare?) that.
I am not sure how, in PCs, the digital signal (the output of the codec algorithm) is presented to the DAC. How is this buffered? How fast, how big is the buffer, what sort of buffer space is allocated, how consistent is the speed with which the codec delivers the signal? What sort of subroutine is actually reading the buffer and feeding the DAC? How accurate is this routine? How do these functions differ for different codecs or on different platforms? I just don't seem to be able to find much info on this.
I know from experience that computers often don't perform as accurately as we would like to think. (never seen any unintentional video effects in a video game?) And I think that different algorithms/codecs can cause different load/problems for (sub routines of) certain platforms they are using.
This makes me skeptical about the supposed infallibility of the playback of lossless coded audio.
You can actually compare a wav file and a lossless file yourself:
1) Make a wav file
2) Encode it to a lossless file (whichever format)
3) Decode the lossless file into a new wav file
4) Compare the wav file from step 1 with the wav file from step 3 (I think you can use Total Commander to do it on a PC) - they should be bit-for-bit identical.
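The comparison in step 4 can also be done with a few lines of Python instead of Total Commander. This is just a minimal sketch; the file names are hypothetical stand-ins for the two wav files from steps 1 and 3.

```python
# Bit-for-bit comparison of two files (step 4 of the procedure above).
import filecmp

def files_identical(path_a: str, path_b: str) -> bool:
    """Return True only if the two files are byte-for-byte identical.

    shallow=False forces an actual content comparison rather than
    relying on file size and timestamps.
    """
    return filecmp.cmp(path_a, path_b, shallow=False)
```

If the codec is truly lossless, `files_identical("original.wav", "decoded.wav")` returns True.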
Think of a lossless file as essentially a zipped/rared (or otherwise compressed) wav file - that is what it is, except the decompression is performed by the codec on-the-fly when the file is played.
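To make the zip analogy concrete, here is a toy round trip using Python's zlib on some stand-in PCM bytes. Real lossless codecs (FLAC, ALAC, etc.) use audio-specific prediction rather than zlib, but the guarantee being illustrated is the same: what comes out of the decoder is exactly what went in.

```python
# Toy illustration of the "zipped wav" analogy: general-purpose lossless
# compression of raw sample bytes, then a round-trip check.
import zlib

pcm = bytes(range(256)) * 64          # stand-in for raw wav sample data
compressed = zlib.compress(pcm, level=9)
restored = zlib.decompress(compressed)

assert restored == pcm                # bit-for-bit identical after decoding
print(len(pcm), "->", len(compressed), "bytes")
```

The compressed form is smaller, but decoding reconstructs every byte exactly - nothing is thrown away, unlike with lossy formats such as mp3.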
As for your other questions - I am not an engineer, so I can't answer any of them authoritatively. However, given that the time needed to decode a lossless file into a wav file is a small fraction of the time it takes to play the file as music, I don't think the additional decoding step poses a significant challenge during playback. A wav file is also buffered before being sent to the DAC, and I wouldn't be surprised if lossless files were buffered in exactly the same way as wav files (or other audio formats): it would simply take slightly longer to fill the buffer in the initial milliseconds of playback for lossless files compared to wav files. The way files are buffered differs between players - in foobar, for example, you can set the length of the playback buffer (it's a single setting for all file formats).
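The decode-ahead buffering described above can be sketched as a simple FIFO: the decoder fills the buffer faster than real time, and the playback side drains it at a steady rate. The chunk size, buffer length, and function names here are illustrative assumptions, not taken from foobar or any real player.

```python
# Hedged sketch of decode-then-buffer playback. The decoder (producer)
# keeps a FIFO topped up; the playback side (consumer) drains it at a
# fixed rate, so the DAC never sees codec timing directly.
from collections import deque

BUFFER_CHUNKS = 8                      # illustrative buffer length setting

def decode_chunk(n: int) -> bytes:
    """Stand-in for the codec: decoding is much faster than playback."""
    return bytes([n % 256]) * 4096     # 4 KiB of fake PCM per chunk

buffer = deque()

# Pre-fill the buffer ahead of playback (the "initial milliseconds").
for i in range(BUFFER_CHUNKS):
    buffer.append(decode_chunk(i))

# Playback loop: drain one chunk, top the buffer back up.
played = 0
for i in range(BUFFER_CHUNKS, BUFFER_CHUNKS + 4):
    chunk = buffer.popleft()           # DAC-bound data leaves at a steady rate
    played += len(chunk)
    buffer.append(decode_chunk(i))     # decoder refills as playback proceeds
```

The point of the sketch: as long as the decoder can refill the buffer faster than playback drains it (true for any modern CPU decoding FLAC or similar), the codec's exact decode speed is invisible downstream of the buffer.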
The speed, or rather the regularity, with which the digital bits are delivered to the DAC chip is another issue, and as far as I know it is quite independent of codec issues (buffering assures that, I guess). It depends instead on how the DAC chip is implemented in the circuit, the type of transport used (CD, hard drive, etc.), whether the signal is first converted into an optical signal (as in toslink), and so on. Irregularities in the timing of the bits delivered to the DAC are essentially jitter, which can degrade SQ. Again, I have never heard anything about lossless vs wav being a factor in how much jitter is produced.