Head-Fi.org › Forums › Equipment Forums › Dedicated Source Components › iTunes does not rip accurate audio data

iTunes does not rip accurate audio data

post #1 of 199
Thread Starter 
EDIT: Thread title changed

I am having to re-rip my entire CD collection because after tons of trial and error, I have found that the Apple lossless codec is not lossless at all. It seems to randomly choose which songs it wishes to encode at higher bitrates. So now I downloaded EAC and am using it to rip all my audio CDs to WAV. I refuse to use any lossless codec anymore. The WAV files generated with EAC sound identical to the CD.

I put in a classical CD today that my neighbor loaned me, Antonio Vivaldi, Four Seasons. When I played it straight from my computer's CD player it sounded fantastic. Then I encoded it in ALAC through iTunes (as I have done with most albums in the past, even though I play them through Foobar) and it sounded like garbage. All the high end was missing. And it wasn't semi-obvious. It was so obvious a 90% deaf person could tell the difference. Even the little spectrum analyzer in Foobar was not bouncing up and down at more than half of the way from left to right. But when the CD was playing, the spectrum analyzer was bouncing up and down all the way from left to right (20Hz - 20kHz). So obviously iTunes' ALAC encoding rolls off a ton of audio data. I also did a bit-compare of the files and they are not identical. Even the WAV files extracted from CD with iTunes sound worse than with EAC.

And the FLAC encoder in EAC is no better, just different -- it chops off audio data, too. If lossless is lossless, why are there about 20 different settings in EAC's FLAC encoder, all the way from 32kbps to 1024kbps? I tried encoding a single song with a few of these FLAC settings, from lowest to highest. The sound quality improved with each step up, with 1024 being the best. But even at 1024 it is still short of CD-DA at 1411kbps.

Before I had this Virtual Dynamics Master LE 2.0 digital cable in my system, it was much harder to hear the difference, but it was still noticeable. But now it's so obvious it's not even funny. And like I said, both the spectrum analyzer and bit-compare are confirming the obvious difference.

So....

1) Why does Apple claim its codec is lossless when it is not?
2) If FLAC is supposed to be lossless, why are there so many different bitrate encoding options that yield different audio data for each setting?

Now I have to get a bigger hard drive so I can store all my uncompressed audio on it. I do not have enough space on my computer for it all.

And if you feel like starting an argument over this, please spare both of us the trouble and refrain from posting. Audio quality is subjective but when the actual data is ending up different from the original source, something is wrong. And ALAC seems to randomly pick which songs it chooses to assign a higher bitrate to.

If anyone has any answers to the above questions, please provide them.
Thanks.
post #2 of 199
This should be easy to confirm objectively. Take a WAV, encode in ALAC, and then decode into WAV again. Use foobar's comparator to compare tracks.
post #3 of 199
I just tried the above with FLAC Level 5, and the decoded WAV is identical to the original WAV.
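For anyone who wants to see the round-trip test described above in miniature, here is a sketch with Python's zlib standing in for the lossless audio codec (ALAC and FLAC themselves need external tools, but the principle is identical):

```python
import zlib

# zlib stands in for ALAC/FLAC here: compress, decompress, bit-compare.
original = bytes(range(256)) * 1000   # pretend this is raw PCM audio data

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

print(len(compressed) < len(original))  # True: the file got smaller
print(restored == original)             # True: not one bit was lost
```

If the decoded bytes did not match the original exactly, the codec would not be lossless; that is the whole test.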
post #4 of 199
Quote:
Originally Posted by dkjohnso View Post
This should be easy to confirm objectively. Take a WAV, encode in ALAC, and then decode into WAV again. Use foobar's comparator to compare tracks.
Quote:
Originally Posted by dkjohnso View Post
I just tried the above with FLAC Level 5, and the decoded WAV is identical to the original WAV.
Shwing! Thanks a lot dude for not being lazy (like me ), and confirming for the rest of us.
post #5 of 199
You're doing something wrong if you can encode FLAC at a certain bitrate. FLAC isn't CBR, or even VBR; it's lossless, which means it uses as many bits as it deems necessary to encode the full 1411kbps stream. Due to the lossless compression, the file size might correspond to ~700-1100kbps, but quality-wise it is still 1411kbps.

btw, these are my EAC settings:
Use External program for compression [ticked]
Parameter scheme [User Defined Encoder]
File Extension [.flac]
Program [path to flac.exe]
command line options [-V -8 -T "artist=%a" -T "title=%t" -T "album=%g" -T "date=%y" -T "tracknumber=%n" -T "genre=%m" %s]
bit rate [has no effect on your output at all if all things go smoothly!! This option is completely ignored due to the command line options!]
"High quality" selected
Use CRC Check [ticked]
post #6 of 199
Quote:
Originally Posted by TheMarchingMule View Post
Shwing! Thanks a lot dude for not being lazy (like me ), and confirming for the rest of us.
No problem. I would do the same with ALAC but I don't use iTunes. Perhaps someone else wants to give it a try. I am certain the results will be the same.
post #7 of 199
Thread Starter 
Which one is FLAC level 5? It doesn't give those options. It just has about twenty steps from 32kbps to 1024kbps.

And why would you have multiple FLAC options anyway? If it's lossless, it's supposed to be lossless. Why the need for different options if all of them are supposed to yield the same sound quality? This is what doesn't make sense about the whole lossless propaganda.

I have done bitcompare in the past with ALAC generated from iTunes and WAV generated in iTunes. They always turned up the same. That's not what the problem is. The problem is that iTunes is corrupting the audio data from the CD, so you have to do a bitcompare with a song encoded from iTunes and one encoded from EAC and you will see and hear that they sound very different.
post #8 of 199
Thread Starter 
Quote:
Originally Posted by TMM View Post
You're doing something wrong if you can encode FLAC at a certain bitrate. FLAC isn't CBR, or even VBR; it's lossless, which means it uses as many bits as it deems necessary to encode the full 1411kbps stream. Due to the lossless compression, the file size might correspond to 800-1000kbps, but quality-wise it is still 1411kbps.
Yes, and this is what I thought too (and learned ages ago). So then why would you have multiple FLAC levels (which EAC doesn't seem to have anyway)? If every one of them is supposed to be lossless, why wouldn't everyone want the smallest file size possible?

Quote:
btw, these are my EAC settings:
Use External program for compression [ticked]
Parameter scheme [User Defined Encoder]
File Extension [.flac]
Program [path to flac.exe]
command line options [-V -8 -T "artist=%a" -T "title=%t" -T "album=%g" -T "date=%y" -T "tracknumber=%n" -T "genre=%m" %s]
bit rate [has no effect on your output at all if all things go smoothly!! This option is completely ignored due to the command line options!]
"High quality" selected
Use CRC Check [ticked]
Well I can tell you the bitrate option does have an effect on the sound quality. And there are no options for FLAC Level such and such, as I always hear people talking about.
post #9 of 199
Lossless means no data is discarded and that it can be returned to its original form, not that it isn't compressed.

For instance, when you put something in a ZIP file, it's compressed, but you can get the original contents back.
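The ZIP analogy above can be demonstrated in a few lines; this sketch builds an archive in memory with Python's zipfile and shows the extracted contents are byte-for-byte the original:

```python
import io
import zipfile

# Compressed inside the archive, but extraction returns the exact bytes.
payload = b"the exact original contents " * 200

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as z:
    z.writestr("data.bin", payload)

with zipfile.ZipFile(buf) as z:
    restored = z.read("data.bin")

print(restored == payload)  # True: compression discarded nothing
```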
post #10 of 199
Quote:
Originally Posted by IPodPJ View Post
Which one is FLAC level 5? It doesn't give those options. It just has about twenty steps from 32kbps to 1024kbps.

And why would you have multiple FLAC options anyway? If it's lossless, it's supposed to be lossless. Why the need for different options if all of them are supposed to yield the same sound quality? This is what doesn't make sense about the whole lossless propoganda.
There are 8 levels of compression for FLAC. My understanding is that the higher the compression level the smaller the file size, but the slower the decoding. All levels will yield an identical WAV file upon decompression. You should have the choice of levels rather than the choice of bit rate. Something is definitely wrong, but I don't use EAC or iTunes so I will not be any help there.
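The point about compression levels has an exact parallel in general-purpose compressors. As a hedged illustration (zlib here, not FLAC itself), different levels spend different amounts of encoding effort and produce different file sizes, yet every level decompresses to identical bytes:

```python
import zlib

# Analogy for FLAC -0 .. -8: levels trade encoding effort for output
# size, but every level decodes to the same original data.
data = b"left right left right " * 5000

sizes = {}
for level in (1, 5, 9):
    packed = zlib.compress(data, level)
    assert zlib.decompress(packed) == data   # identical content at every level
    sizes[level] = len(packed)

print(sizes)  # the sizes differ; the decoded audio would not
```

This is why a FLAC "level" is not a bitrate setting: it changes how hard the encoder works, never what comes back out.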

Quote:
Originally Posted by IPodPJ View Post
I have done bitcompare in the past with ALAC generated from iTunes and WAV generated in iTunes. They always turned up the same. That's not what the problem is. The problem is that iTunes is corrupting the audio data from the CD, so you have to do a bitcompare with a song encoded from iTunes and one encoded from EAC and you will see and hear that they sound very different.
Perhaps you could do a bit compare and report the results here?
post #11 of 199
Thread Starter 
Quote:
Originally Posted by dkjohnso View Post
No problem. I would do the same with ALAC but I don't use iTunes. Perhaps someone else wants to give it a try. I am certain the results will be the same.
Here are the results:
Comparing:
"C:\Users\Phil\Music\EAC extracted music\Antonio Vivaldi\The Four Seasons\Antonio Vivaldi - The Four Seasons Op. 8 - 01 - La Primavera (the Spring), Allegro.wav"
"C:\Users\Phil\Desktop\Vivaldi_ Four Seasons\01 Vivaldi_ Four Seasons, Op. 8_1, R.m4a"
differences found: 16987797 sample(s), starting at 2.1626531 second(s), peak: 0.9897766 at 20.1665760 second(s), 1ch


The first file listed above was extracted from the CD as a WAV file with EAC, 1411kbps, 3:15 length.
The second file was extracted from the CD with iTunes as an ALAC, showing as 765kbps, 3:15 length.

Like I said, the differences were obvious, not subtle in ANY way. And if you questioned your hearing, your eyes wouldn't lie to you: the spectrum analyzer readout was totally different between the two.
post #12 of 199
Do you think the CD drive might not be utilizing ASIO? I have seen CD drives in the past that hook directly to the sound card. Maybe I'm wrong that they were playing straight through; I do know that several external drives I have seen have an analog unbalanced output.

Also, do you think EAC might be changing the sound with its buffer/jitter protection? What other factors might be at play here?

I don't understand all of this talk of CD drive jitter: Hydrogenaudio Forums > Removing CD drive jitter via buffering

Don't you think a scratch would change how the laser reads the disc a little on every playback?
post #13 of 199
Quote:
Originally Posted by IPodPJ View Post
Well I can tell you the bitrate option does have an effect on the sound quality. And there are no options for FLAC Level such and such, as I always hear people talking about.
I don't know about ALAC (I'm assuming it is due to not being a secure rip?) but I think the FLAC bitrate thing is totally a placebo effect you've convinced yourself of. The "-8" in the command line options is the level people speak of. In this case it encodes at level 8 (best/smallest/most intensive compression). Can you do a compare in Foobar of two FLACs with different "bitrates"? Or even a WAV and a FLAC (both ripped with EAC, of course)?

edit:
Quote:
Originally Posted by IPodPJ View Post
Yes, and this is what I thought too (and learned ages ago). So then why would you have multiple FLAC levels (which EAC doesn't seem to have anyway)? If every one of them is supposed to be lossless, why wouldn't everyone want the smallest file size possible?
Another highly debated issue, heh. Some people believe that less compression (e.g. FLAC V1) is better than more compression (e.g. FLAC V8) due to less jitter, but I think it's all in their heads as long as whatever is playing it uses appropriate buffering.
post #14 of 199
Quote:
Originally Posted by IPodPJ View Post
As soon as EAC is done trying to reassemble the data from my scratched Pink Floyd CD, I will extract one as a WAV using EAC and one as ALAC from iTunes, do a bit-compare in Foobar and show you that they are turning out differently.

Here are the results:
Comparing:
"C:\Users\Phil\Music\EAC extracted music\Antonio Vivaldi\The Four Seasons\Antonio Vivaldi - The Four Seasons Op. 8 - 01 - La Primavera (the Spring), Allegro.wav"
"C:\Users\Phil\Desktop\Vivaldi_ Four Seasons\01 Vivaldi_ Four Seasons, Op. 8_1, R.m4a"
differences found: 16987797 sample(s), starting at 2.1626531 second(s), peak: 0.9897766 at 20.1665760 second(s), 1ch
Aren't you comparing a WAV to an M4A there? Perhaps I am missing something. But of course those files are going to be very different: one is compressed one isn't. Decode that M4A to a WAV and compare the WAV files.
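The decode-then-compare step suggested above can be sketched with Python's wave module. The helper below (hypothetical names, synthetic silence in place of real rips) compares two WAV streams sample-for-sample; in practice you would first decode the M4A to WAV with an external decoder:

```python
import io
import wave

def wavs_identical(file_a, file_b) -> bool:
    """Compare two WAV streams sample-for-sample."""
    with wave.open(file_a, "rb") as a, wave.open(file_b, "rb") as b:
        if a.getparams() != b.getparams():
            return False
        return a.readframes(a.getnframes()) == b.readframes(b.getnframes())

def make_wav(samples: bytes) -> io.BytesIO:
    """Build an in-memory 44.1kHz/16-bit/stereo WAV around raw PCM bytes."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)
        w.setframerate(44100)
        w.writeframes(samples)
    buf.seek(0)
    return buf

pcm = bytes(4 * 100)  # 100 silent stereo 16-bit frames
print(wavs_identical(make_wav(pcm), make_wav(pcm)))                 # True
print(wavs_identical(make_wav(pcm), make_wav(b"\x01" + pcm[1:])))   # False
```

If the EAC rip and the decoded iTunes rip really contain different audio, a comparison like this would show it without involving ears or spectrum analyzers at all.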
post #15 of 199
More and more people are coming out of the woodwork in support of uncompressed audio over lossless compression. I won't go back to lossless compression as I find it fatiguing over long periods of time. I spoke with a very high-end manufacturer Friday night about this very topic and he said they have done tests with their gear and notice a difference every time. One problem is all the hardcore fanatics who jump all over those of us who prefer uncompressed music. Why would Joe Sixpack voluntarily put himself through the wringer in one of these forums by stating he notices a difference between uncompressed and compressed music? I am used to it by now and I trust my ears over anyone's opinion.