I was in the process of converting some of my old lossy AAC to ALAC, and on quite a few tracks I found myself hard pressed to hear a difference between the old 160kbps AAC and the new ALAC versions. This was listening through my Benchmark DAC1 Pre (USB, Audirvana+), Stax SRM-727A, and SR-009. As a backup, I also listened via the same DAC but through an Ortofon Hd-Q7 amp and Fostex TH-900, and asked my wife to have a listen too. Both of us listened hard and couldn't hear any noticeable differences.
After converting and comparing a few different albums, I did finally manage to find some tracks where I could hear differences, and a trend started to form. Really old rips from iTunes 4.0.1, even at lossy 160kbps, actually sounded very, very good, but somewhere along the way, around iTunes 6.0.3, it sounds like the encoding algorithm changed and actually got worse?
Although conceptually illogical, it makes me curious whether so-called Apple Lossless may have changed over the years too, 'cos after all, it's an algorithm.
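For the lossless question at least, there is a way to check empirically rather than by ear: decode rips from both iTunes versions to PCM and compare the raw samples. A minimal sketch (the filenames are hypothetical, and it assumes you've already decoded each ALAC rip to WAV with a tool like ffmpeg or afconvert):

```python
import hashlib
import wave

def pcm_md5(path):
    """MD5 of the decoded sample data only, skipping the WAV header so
    differing metadata or chunk layout can't cause a false mismatch."""
    with wave.open(path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()

# Decode each ALAC rip of the same track to WAV first, e.g.:
#   ffmpeg -i old_itunes_rip.m4a old.wav
#   ffmpeg -i new_itunes_rip.m4a new.wav
# then compare:
#   pcm_md5("old.wav") == pcm_md5("new.wav")
# True means both encoder versions produced bit-identical audio.
```

If the hashes match, any difference you hear between the two lossless rips can't be coming from the encoder version.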
Can anyone who has rips of the same tracks:
1) lossy from old iTunes & new iTunes — confirm whether they sound different?
2) lossless from old iTunes & new iTunes — comment on whether they sound exactly the same?
Please note that I'm looking for genuine practical experience rather than theoretical or conceptual opinions (i.e. no "lossless is lossless" comments, please).