IPodPJ
MOT: Bellatone Audio
Caution: Incomplete customer orders
- Joined: Apr 17, 2006
- Posts: 7,870
- Likes: 64
EDIT: Thread title changed
I am having to re-rip my entire CD collection because, after tons of trial and error, I have found that the Apple Lossless codec is not lossless at all. It seems to randomly choose which songs it encodes at higher bitrates. So I've downloaded EAC and am using it to rip all my audio CDs to WAV. I refuse to use any lossless codec anymore. The WAV files generated with EAC sound identical to the CD.
I put in a classical CD today that my neighbor loaned me: Antonio Vivaldi's The Four Seasons. When I played it straight from my computer's CD player it sounded fantastic. Then I encoded it in ALAC through iTunes (as I have done with most albums in the past, even though I play them through Foobar) and it sounded like garbage. All the high end was missing. And it wasn't just semi-obvious; it was so obvious a 90% deaf person could tell the difference. The little spectrum analyzer in Foobar wasn't even bouncing past the halfway point from left to right, whereas with the CD playing it was bouncing up and down across the whole range (20 Hz - 20 kHz). So obviously iTunes' ALAC encoding rolls off a ton of audio data. I also did a bit-compare of the files and they are not identical. Even the WAV files extracted from CD with iTunes sound worse than with EAC.
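If anyone wants to repeat the bit-compare at home, here is a minimal sketch in Python (standard library only). It assumes you have already decoded the ALAC file back to a WAV with whatever tool you prefer, and the file names are made up -- substitute your own rips. It compares the raw PCM payload of the two WAVs byte for byte:

import wave

def pcm_frames(path):
    # Pull every PCM frame out of a WAV file as one bytes object.
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

# Hypothetical file names -- point these at your own rips.
original = pcm_frames("four_seasons_eac_rip.wav")
decoded = pcm_frames("four_seasons_decoded_from_alac.wav")

print("byte-identical:", original == decoded)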
And the FLAC encoder in EAC is no better, just different -- it chops off audio data, too. If lossless is lossless, why are there about 20 different settings in EAC's FLAC encoder, all the way from 32 kbps to 1024 kbps? I tried encoding a single song with a few of these FLAC settings, from lowest to highest. The sound quality improved with each step up, with 1024 being the best. But even at 1024 kbps it still falls short of CD-DA at 1411 kbps.
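For reference, the 1411 kbps figure I'm comparing against comes straight from the Red Book CD spec; a quick sanity check in Python:

# CD-DA (Red Book) parameters
sample_rate = 44_100  # samples per second
bit_depth = 16        # bits per sample
channels = 2          # stereo

bitrate_kbps = sample_rate * bit_depth * channels / 1000
print(bitrate_kbps)   # 1411.2 -> the familiar 1411 kbps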
Before I had this Virtual Dynamics Master LE 2.0 digital cable in my system it was much harder to hear the difference, but it was still noticeable. Now it's so obvious it's not even funny. And like I said, both the spectrum analyzer and the bit-compare are confirming the obvious difference.
So....
1) Why does Apple claim its codec is lossless when it is not?
2) If FLAC is supposed to be lossless, why are there so many different bitrate encoding options that yield different audio data for each setting?
Now I have to get a bigger hard drive so I can store all my uncompressed audio on it. I do not have enough space on my computer for it all.
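Sizing the new drive is at least easy to work out, since uncompressed CD audio has a fixed data rate. A rough sketch (the album count and average length here are assumptions, not my actual numbers):

# Uncompressed CD audio: 44,100 samples/s * 2 bytes/sample * 2 channels
bytes_per_second = 44_100 * 2 * 2
mb_per_minute = bytes_per_second * 60 / 1_000_000   # about 10.6 MB per minute

albums = 300             # assumed collection size
minutes_per_album = 50   # assumed average album length

total_gb = mb_per_minute * albums * minutes_per_album / 1000
print(f"{mb_per_minute:.1f} MB/min, roughly {total_gb:.0f} GB total")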
And if you feel like starting an argument over this, please spare both of us the trouble and refrain from posting. Audio quality is subjective, but when the actual data ends up different from the original source, something is wrong. And ALAC seems to pick at random which songs it assigns a higher bitrate to.
If anyone has any answers to the above questions, please provide them.
Thanks.