Originally Posted by tmars78
Have you ever really tried to distinguish 96k and 128K from each other? I'm going to say that your 100% is a flat out lie. We have people here that can't always tell lossless from 128k, but you can tell 96k and 128k 100% of the time?
I don't see why it's so hard to believe that there is a threshold below which I can distinguish a compressed track from a CD and above which I cannot. Can you distinguish 64k from 128k nearly all the time? I suspect most, if not all, people on this board can. But by your logic none of us should be able to, since there are people here who "can't always tell lossless from 128k." You're making the mistake of assuming that because far more raw data separates 128k from lossless, it must be harder to distinguish a 96k MP3 from a 128k MP3 than it is to distinguish a 128k MP3 from a 1411k WAV file. But lossy encoders throw away the least audible information first, so the cut from 128k down to 96k digs into far more perceptible content than the step from 128k up to lossless adds back.
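(For anyone wondering where the 1411k figure comes from: it's just the raw data rate of uncompressed CD audio, 44,100 samples/s × 16 bits/sample × 2 channels = 1,411,200 bits/s, or about 1411 kbps.)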
I'm talking, of course, about rich, decently recorded music tracks. I'm sure there are voice recordings and such where I would not be able to distinguish the two bitrates, but for your typical pop song, yes, I feel confident I can. Part of the reason is that I once accidentally ripped a CD at 96k (Led Zeppelin's How the West Was Won) and kept wondering why it sounded so "shallow" and tinny on my iPod. Only some time later did I realize that I had lowered the encoding bitrate in iTunes for some voice recordings and forgotten to bump it back up to 128k before ripping the Zeppelin discs (I think this was before I started ripping at 192k). I've noticed the same thing with downloaded tracks that didn't quite sound right until I realized they were encoded at 96k.
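If anyone wants to double-check a suspect download instead of going by ear, here's a minimal sketch using the third-party mutagen library (pip install mutagen); the filename is just a placeholder:

```python
# Minimal sketch: read the encoded bitrate of an MP3 with mutagen
# (a third-party tagging library, not part of the standard library).
from mutagen.mp3 import MP3

# Hypothetical filename; point this at the track you want to check.
info = MP3("suspect_track.mp3").info
print(f"Encoded at roughly {info.bitrate // 1000} kbps")  # bitrate is in bits/s
```

Keep in mind that for VBR files the number reported is an average, not a fixed rate, so "roughly 96" is the tell, not an exact match.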