bangraman
Hello guys
What I've found is that while the 5/4G DACs have similar performance, the headphone / line output stage offers a cleaner sound on the 3/4G... but really only if you subject it to a high-impedance load.
So the measured SNR on the 4G with a line-level load is higher; in fact, practically every spec is better. The oft-mentioned problem with the 4G Photo is that the video circuitry introduces additional distortion into the headphone output path, and on top of that, all pre-5G iPods have a falloff in bass response into a low-impedance load (i.e. with many headphones of 32 ohms and under). Added to that, significantly increased measurable distortion creeps in with the same low load.
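For what it's worth, that bass falloff behaves like the classic series coupling-cap problem: a capacitor in the output path forms a high-pass filter with the headphone load, and the corner frequency climbs as the load impedance drops. Here's a minimal sketch of the arithmetic, assuming a 100 µF coupling cap (the value is my guess for illustration, not a measured 3G/4G figure):

```python
import math

def corner_hz(cap_farads: float, load_ohms: float) -> float:
    """-3 dB corner of the high-pass formed by a series coupling cap and load."""
    return 1.0 / (2.0 * math.pi * load_ohms * cap_farads)

CAP = 100e-6  # assumed coupling-cap value, purely illustrative

for load in (300, 32, 24):  # HD650, CD3000, ER-4P nominal impedances
    print(f"{load:>3} ohm load -> -3 dB at about {corner_hz(CAP, load):5.1f} Hz")
```

With a 300 ohm load the corner sits around 5 Hz, which you'll never hear; at 32 ohms it's pushing 50 Hz, which is exactly the kind of bass shelf people report.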
With low-impedance earphones, the bass rolloff of the 3G/4G does fool even the reasonably practiced ear into believing it's hearing better sound quality, if you consider a distinct lack of bass to be a 'better' sound. Clearly this board is full of 'em, as you guys still have the Etys up on a pedestal
Thing is, though, with the older iPods there's actually a grain of truth in it: the perceived difference in the cleanliness of the sound is greater than the actual difference, and that's mainly down to the tonal differences.
In reality, the 5G iPod has a response closer to other, flatter-sounding players. If you've come to regard the 3/4G as the better source (and once again, there's some truth in this), the 5G comes across as bassy; but the 5G is actually the flatter of the two. Confused? Clearly many are, and until I started listening to a lot more players and measuring them afterwards this wasn't fully clear to me either.
The simplest way to illustrate the difference: stick a 300 ohm HD650 into both a monochrome 4G iPod and a 5G iPod playing WAV files, and the 4G would sound better to the practiced ear, because you wouldn't be pitting tonal differences against each other (both offer an essentially flat frequency response with only minor variations) and you're freer to hear the better amp stage of the 4G. But swap that HD650 for, say, a CD3000 (32 ohm) or an ER-4P (24 ohm), and you'd hear different tonal responses, and then opinions would be far more divided, depending on how well you can separate quality from tonal response.
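To put rough numbers on the tonal side of that: the source's output impedance forms a voltage divider with the headphone, so a headphone whose impedance swings across the band picks up a frequency-dependent level shift, while a flat high-impedance load barely notices. A sketch with assumed figures (the 5 ohm output impedance and the impedance swings are illustrative, not measurements of any iPod or headphone):

```python
import math

def level_db(z_out: float, z_load: float) -> float:
    """Level (dB, relative to an ideal source) into z_load through z_out."""
    return 20.0 * math.log10(z_load / (z_load + z_out))

Z_OUT = 5.0  # assumed source output impedance in ohms, purely illustrative

# (min, max) impedance swing across the audio band, also illustrative
for name, z_min, z_max in (("ER-4P-ish", 24.0, 60.0), ("HD650-ish", 300.0, 500.0)):
    tilt = level_db(Z_OUT, z_max) - level_db(Z_OUT, z_min)
    print(f"{name}: tonal tilt of about {tilt:4.2f} dB across the band")
```

Nearly a full dB of tilt on the low-impedance load versus practically nothing on the 300 ohm one, which is why the HD650 comparison isolates the amp stage while the low-impedance comparison mostly pits tonal balances against each other.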
But... there's more!
The 5G has the best codec implementation of any iPod so far: it's the most bug-fixed and the most stable. Many MP3 and some AAC issues (some of them VERY audible) which cropped up on 3G/4G iPods are fixed on the 5G. So despite the superior audio stage of the older iPods, if you're playing back high-bitrate compressed files on them, you may be getting inferior decoding.
I hope I've confused everyone nicely now