Okay, back in the day when DAPs first got started and lossless wasn't on the radar, I recall the difference between 64k, 96k, 128k, 256k, 320k, etc. being as obvious as night and day. Some programs even let you preview samples at each compression rate before ripping, so you could hear the difference before making your decision.

Today I was challenged to tell the difference between a 128k rip and a lossless file, and the differences were nearly indiscernible. Not believing my ears, and questioning the source, I decided to take my own WAV files and make 128k MP3 rips to compare. Same result!!

My question is this... Has something changed in how MP3s are now encoded and decoded compared to the past? It used to be night and day for anything below 256k. Is it my imagination or is something going on here??
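In case anyone wants to repeat the test themselves, here's roughly how I'd sketch a quick blind comparison in Python, with ffmpeg doing the 128k encode and ffplay doing playback. Both tools are assumed to be installed and on your PATH, my_track.wav is just a placeholder for your own file, and this is only a rough sketch, not a proper ABX tool:

```python
#!/usr/bin/env python3
"""Rough blind-test sketch: encode a WAV to 128 kbps MP3 with ffmpeg,
then play randomly chosen clips and see if you can tell which is which.
Assumes ffmpeg and ffplay are installed and on the PATH."""

import random
import subprocess
import tempfile
from pathlib import Path

SOURCE = Path("my_track.wav")  # placeholder: point this at your own WAV

def encode_mp3(wav: Path, bitrate: str, out: Path) -> None:
    # -b:a sets a constant audio bitrate; -y overwrites without prompting
    subprocess.run(
        ["ffmpeg", "-loglevel", "error", "-y",
         "-i", str(wav), "-b:a", bitrate, str(out)],
        check=True,
    )

def play(path: Path) -> None:
    # -autoexit quits when the clip ends; -nodisp skips the video window
    subprocess.run(
        ["ffplay", "-loglevel", "error", "-autoexit", "-nodisp", str(path)],
        check=True,
    )

def main() -> None:
    with tempfile.TemporaryDirectory() as tmp:
        mp3 = Path(tmp) / "test_128k.mp3"
        encode_mp3(SOURCE, "128k", mp3)

        trials, score = 5, 0
        for i in range(trials):
            clip = random.choice([SOURCE, mp3])
            print(f"Trial {i + 1}: playing a clip...")
            play(clip)
            guess = input("Was that (l)ossless or (m)p3? ").strip().lower()
            if guess == ("l" if clip == SOURCE else "m"):
                score += 1
        print(f"You got {score} out of {trials} right.")

if __name__ == "__main__":
    main()
```

A real ABX tester plays a known A, a known B, and an unknown X, but even this simple guess-the-clip loop was enough to convince me how hard a modern 128k encode is to pick out.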