Noticeable sonic differences between WAV, AIFF, and ALAC -- why?
Nov 4, 2007 at 7:43 AM Thread Starter Post #1 of 168

IPodPJ

MOT: Bellatone Audio
OK, I know it's been discussed before, and I've read through all the arrogant posts on Hydrogen Audio about this. Regardless, I have noticed sonic differences and I want to know what is causing them.

AIFF and WAV, pure CD-DA, sound better than Apple Lossless (ALAC) when played in Foobar or any other media player. I never noticed this before I got my Opera, but now, with its transparency, the difference is obvious.

I am using the Opera's DAC over USB, fed from my computer at 16/48. I tested the following formats using Mike Oldfield's Tubular Bells II, track "Maya Gold," ripped directly from CD, listed in order from best-sounding to worst:

1) WAV (32-bit PCM) ripped with Foobar
2) WAV (16-bit PCM) ripped with Foobar
3) AIFF ripped with iTunes
4) ALAC ripped with iTunes

I have been comparing the above four copies of the same song, ripped from the same CD in the same drive, for the past several hours. They are all being played back through Foobar 0.9.5 beta 2 + ASIO4ALL, over USB at 16/48 into the Opera's DAC. The differences between numbers 2-4 are minimal but noticeable. Number 1 is a significant improvement over the other three.

I want to know why this is. Lossless formats are supposed to be lossless, so what artifacts are being introduced, or what is being left out, if anything? Why would the 32-bit WAV file sound superior to the rest when USB is only transferring 16-bit data to the DAC? Is something happening when the computer reads and interprets the audio?
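For reference, here is a minimal sketch (my own illustration in Python/NumPy, not anything from the actual rip) of why the 32-bit container shouldn't matter in theory: padding 16-bit samples to 32 bits and truncating them back is bit-transparent, so a 32-bit rip of a 16-bit CD carries no extra audio information.

```python
# Minimal sketch: a 16-bit sample padded into a 32-bit word and truncated
# back is bit-identical, so the 32-bit WAV should not contain "more" audio.
import numpy as np

rng = np.random.default_rng(0)
pcm16 = rng.integers(-32768, 32767, size=1000, dtype=np.int16)  # stand-in for 16-bit audio

# Pad: place each 16-bit sample in the top of a 32-bit word (what storing
# 16-bit audio in a 32-bit PCM WAV effectively does).
pcm32 = pcm16.astype(np.int32) << 16

# Truncate back to 16 bits, as a 16-bit USB link to the DAC would.
restored = (pcm32 >> 16).astype(np.int16)

print(np.array_equal(pcm16, restored))  # True -- the round trip is lossless
```

If the 32-bit file really sounds different, the difference would have to come from how the player processes it, not from the stored data itself.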
 
Nov 4, 2007 at 8:02 AM Post #2 of 168
Here we go again... !
 
Nov 4, 2007 at 8:10 AM Post #3 of 168
Quote:

Originally Posted by krmathis
Here we go again... !



Yeah, well, I'm not looking for smart-ass answers like the stuff I see over at Hydrogen Audio (not saying you are one, krmathis). So perhaps I should lay this down now: hopefully some people who have experienced the same thing will chime in and give serious answers. Anyone who wants to say something like "You are just hearing things, it's placebo and you're a moron" can keep their comments to themselves. I know what I'm hearing, and I've spent several hours with this one song already. So if you think it's placebo, you don't have equipment good enough to hear it, plain and simple. When I was using my iPod and Go-Vibe, I was not able to tell the difference at all between ALAC and WAV or AIFF. What's more interesting to me is why the 32-bit WAV (ripped from a 16-bit CD) sounds so much better. Of course, I still want to know why the other ones vary sonically.
 
Nov 4, 2007 at 8:26 AM Post #4 of 168
They all contain the same audio data, so there is no easy logical explanation for why you hear an audible difference. Decoding the Apple Lossless stream takes CPU cycles, which "may" affect the sound (I don't know). But in either case, the same PCM stream 'should' be played back and sent to your headphone gear.

This has, as you also state, been discussed numerous times before, and I don't think anyone has found a good answer, other than the usual "there should not be any difference."
 
Nov 4, 2007 at 8:41 AM Post #5 of 168
Quote:

Originally Posted by krmathis
This has, as you also state, been discussed numerous times before, and I don't think anyone has found a good answer, other than the usual "there should not be any difference."


Right. There shouldn't be a difference if the lossless algorithms are correct and the file is verified after encoding. So that means the sonic differences are occurring on playback.


You know what else I've noticed? Something is adding distortion to music that is close to peaking. I was wondering why a lot of music that I've never noticed distortion on before suddenly acquired it. It could be a conflict between ASIO4ALL and the DAC in the Opera.

Less than 24 hours using the Opera and already I've become such a finicky person... LOL.
 
Nov 4, 2007 at 10:11 AM Post #6 of 168
A few weeks ago I made a similar suggestion and was lampooned by almost everyone.

Even though my initial test was not a scientific one, just my listening impressions through headphones comparing WAV and FLAC, I found the sound to be different, with WAV being better.

To reach a more reliable conclusion, one would have to perform a proper scientific study, with controls, etc.

There you are, just my thoughts...
 
Nov 4, 2007 at 11:05 AM Post #7 of 168
Why is your WAV file a 32-bit one? The original data isn't 32-bit in any way, so it has obviously been altered. And they can't all contain the same audio data if they sound different. Lossless encoding is lossless encoding, so you should compare the decoded files bit by bit with a good audio tool to verify that they really are the same.
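For anyone who wants to do that comparison, here is a rough sketch (assuming ffmpeg is installed; the file names are placeholders, not the poster's actual files) that decodes each rip to raw 16-bit PCM and hashes the result, so container headers and tags can't influence the comparison:

```python
# Rough sketch: decode each rip to headerless 16-bit PCM via ffmpeg and hash
# it. Identical hashes mean identical audio data, whatever the container.
import hashlib
import subprocess

def pcm_md5(path):
    raw = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path,
         "-f", "s16le", "-acodec", "pcm_s16le", "-"],
        check=True, capture_output=True).stdout
    return hashlib.md5(raw).hexdigest()

# Hypothetical file names for the 16-bit WAV, AIFF, and ALAC rips.
for f in ["maya_gold_16.wav", "maya_gold.aiff", "maya_gold.m4a"]:
    print(f, pcm_md5(f))
```

(If you include the 32-bit rip, note that ffmpeg has to reduce it to 16 bits for this comparison, which is itself a conversion step.)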
 
Nov 4, 2007 at 12:18 PM Post #8 of 168
You may be introducing some conversion steps that could affect sound quality. Consider that the information on the CD is 16-bit/44.1 kHz. You're ripping the track, padding it to 32-bit (hopefully losslessly), then performing a lossy sample-rate upconversion from 44.1 kHz to 48 kHz, and finally requantizing back to 16-bit so that the Opera can accept the bitstream.
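To see why the sample-rate step is the suspicious one, here is a small illustration (my own sketch with NumPy/SciPy, not the poster's actual chain): unlike bit-depth padding, a 44.1 kHz to 48 kHz conversion has to interpolate new sample values, so a round trip does not give back the original data exactly.

```python
# Sketch: 44.1 kHz -> 48 kHz -> 44.1 kHz is not bit-transparent, unlike
# simple bit-depth padding. The ratio 48000/44100 reduces to 160/147.
import numpy as np
from scipy.signal import resample_poly

fs = 44100
t = np.arange(fs) / fs
x = 0.5 * np.sin(2 * np.pi * 1000 * t)      # one second of a 1 kHz tone

up = resample_poly(x, 160, 147)             # 44.1 kHz -> 48 kHz
back = resample_poly(up, 147, 160)          # 48 kHz   -> 44.1 kHz

print(np.max(np.abs(back - x)))             # small, but not zero
```

The error is tiny, but it shows that what comes out of a resampler is no longer the data that was on the CD.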

Here's the most accurate way to listen in Foobar with your Opera DAC:

1. Turn up the gain to 0.0 dB in the lower-right corner of Foobar.
2. Disable all resampler plug-ins.
3. Switch to 16-bit or 24-bit output in Foobar (the Opera can accept either bit depth, and the choice won't make much difference because no plug-ins are being used).
4. Turn up the Wave control to 100% in the Windows Mixer.
 
Nov 4, 2007 at 1:44 PM Post #10 of 168
Quote:

Originally Posted by IPodPJ
You know what else I've noticed? Something is adding distortion to music that is close to peaking. I was wondering why a lot of music that I've never noticed distortion on before suddenly acquired it. It could be a conflict between ASIO4ALL and the DAC in the Opera.



I'd double- and triple-check that you aren't clipping anywhere. I vaguely recall a similar issue when I used the kX audio drivers' alternate mixer; it's worth a look.

Since you have foobar, I assume you've been ABX-ing the files? Perhaps the encoder is the culprit.
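A quick way to check the clipping idea (my own sketch; it assumes the soundfile package is available, and the file name is a placeholder) is to count how many decoded samples sit at digital full scale. More than a handful usually means something upstream is clipping:

```python
# Sketch: count samples at digital full scale in a decoded track. Long runs
# at full scale are a strong hint that something in the chain is clipping.
import numpy as np
import soundfile as sf  # assumed available; reads WAV/AIFF

data, rate = sf.read("maya_gold_16.wav", dtype="int16")
at_full_scale = np.abs(data.astype(np.int32)) >= 32767
print(f"{int(at_full_scale.sum())} of {data.size} samples at full scale")
```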
 
Nov 4, 2007 at 2:15 PM Post #11 of 168
Quote:

Originally Posted by IPodPJ
What's more interesting to me is why the 32-bit WAV (ripped from a 16-bit CD) sounds so much better.


Assuming you're not hearing things, my rather uninformed guess is that you're hearing the dither applied when the 32-bit stream is reduced back to 16 bits. Dither adds a small amount of noise before the word length is reduced so that the quantization error isn't correlated with the music; even when there's nothing audible to mask, it still slightly changes the signal.

In any case, you'll get the most fidelity with 16/44 lossless (uncompressed PCM, FLAC, whatever). I wouldn't trust ALAC but that's up to you.
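For what it's worth, here is a hedged sketch (my own, in NumPy; not how Foobar, iTunes, or any DAC driver actually implements it) of that dithering idea: when a 32-bit stream is reduced to 16 bits, a small amount of triangular (TPDF) noise can be added before rounding so the quantization error is decorrelated from the music. That adds a very low noise floor, which is one real, if tiny, difference a 32-bit to 16-bit path can introduce.

```python
# Sketch of TPDF dither applied when reducing to 16-bit. This is a generic
# illustration, not the code path of any particular player or driver.
import numpy as np

rng = np.random.default_rng(1)

def to_int16_with_tpdf(x):
    """Quantize float samples in [-1.0, 1.0) to int16 with triangular dither."""
    dither = rng.random(x.shape) - rng.random(x.shape)   # TPDF, +/- 1 LSB
    scaled = x * 32767.0 + dither
    return np.clip(np.round(scaled), -32768, 32767).astype(np.int16)

tone = 0.25 * np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
print(to_int16_with_tpdf(tone)[:8])
```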
 
Nov 4, 2007 at 7:00 PM Post #12 of 168
You're ripping with Foobar and iTunes. That is bad. Use EAC.

When you rip bit-perfect, the lossless files will be identical to the CD, and any difference you hear would be either placebo or something happening on playback: odd decoding, resampling, or the like.
 
Nov 4, 2007 at 8:39 PM Post #13 of 168
Thanks for all of your advice. There are some really good answers here.

When you run 24/96, your music sounds better because the DAC oversamples, or in some cases, upsamples. It seems like something similar is happening here when the audio is getting played back from the 32-bit file.

With regards to the variables in the Lossless algorithms of the 16-bit files, I'm not sure what to think. All of my Foobar settings are optimal, no resampling, no decrease in gain, etc.

So I think what is happening when you use a lossless file is that Foobar is not handling the reconstructed audio data the way it handles straight CD-DA extraction. Maybe it is not playing the audio at 44.1 kHz when it decodes it. Or maybe my computer just doesn't like lossless files.

Advice to others: Don't upgrade your equipment, and this won't happen to you.
 
Nov 4, 2007 at 8:44 PM Post #14 of 168
Quote:

Originally Posted by ph0rk
I'd double- and triple-check that you aren't clipping anywhere. I vaguely recall a similar issue when I used the kX audio drivers' alternate mixer; it's worth a look.

Since you have foobar, I assume you've been ABX-ing the files? Perhaps the encoder is the culprit.



No, I'm not clipping anywhere, not on my computer at least. Maybe the DAC is clipping??


Actually, I used iTunes to encode most of my CDs to ALAC. The only time I used Foobar was when I had a FLAC file I needed to convert to WAV to import into iTunes. This time, though, I was just experimenting with Foobar, so I used it to make some files from the CD. I was using iTunes all along, up until I read the thread about Foobar + ASIO4ALL.
 
Nov 4, 2007 at 10:20 PM Post #15 of 168
Quote:

Originally Posted by OverlordXenu
You're ripping with Foobar and iTunes. That is bad. Use EAC.


QFT.
 
