iPod Classic stomps all over 5th gen

Oct 3, 2007 at 1:09 AM Post #76 of 154
Quote:

Originally Posted by IPodPJ
Well it may not make much sense to you now, but spend some more time reading through the forum and then it will.


Hoo Boy! Spend some more time reading this forum and a lot of nonsense will begin to make sense! Then shake your head clear of it all and go listen to some great music.

See ya
Steve
 
Oct 3, 2007 at 1:14 AM Post #77 of 154
Quote:

Originally Posted by IPodPJ
Well it may not make much sense to you now, but spend some more time reading through the forum and then it will.


People here seem to largely divide into those who believe in burn in and those who don't. People get entrenched in these views either based on what they think they know of science or because their ears tell them so. Personally I think there are some cases where burn in has merit and others where it doesn't. Certainly if you have moving parts like headphone drivers then it seems feasible that their characteristics may change with wear, just as many materials do when you flex them. Some electrical components, especially capacitors, are known to change once they start being used regularly. In something like the Classic though, it is hard to see what could really change with use. The only real possibilities are the headphone driver (mostly integrated into the DAC chip) and the power circuitry. Noticeable changes would, in theory, be very unlikely.

Quote:

Originally Posted by IPodPJ
And if you listen to this iPod, the sound signature does change after about 20 - 40 hours. The 5th gen didn't do this, but the Classic does. I'm still not sure why. The Classic has a clean signal path, which the 5th gen did not have.


Speaking from a purely anecdotal point of view, I haven't noticed any change in my Classic in the four weeks since I got it. It must be nearing 200 hours. That's not to say I'm categorically claiming there have been no changes in the sound, but all the characteristics I originally identified when I first started listening to it still ring as true to me now as they did then. While I wouldn't totally rule out changes due to burn in, if they're there they're very small.
 
Oct 3, 2007 at 3:21 AM Post #78 of 154
Mirumu,

It could be due to the fact that the Classic has bugs and defects. Some people have no issue with the soundstage, but on mine, played through the headphone output, the soundstage is cockeyed, with the left channel being pushed towards the rear of my head. Maybe some parts in mine are just wearing out as I use it, I honestly don't know.

But it's not like I use my iPod every hour of the day and would be getting used to its sound signature. I don't use it every day, and when I do I listen for maybe 30 minutes in that day. I only let it "burn in" by itself when I first got it after I listened for awhile, by setting it aside and letting it run while pushing a load through the LOD to the headphone amp.

I never experienced any change in sound signature with my 5g iPod. And I was shocked to find this one changing on me.

Who knows.
 
Oct 3, 2007 at 3:44 AM Post #79 of 154
Experimentally, many products improve after burn in. You just have to try and find out. If it is significant, it will be easily apparent.

Much of the iPod's output stage is analog. Plus, even digital circuitry is in fact analog. "Digital" is an abstraction that is actually implemented by sample-and-hold analog circuits, filters, etc.

Break in of the G6 would fit the fact that opinion is so polarized. I'm open to finding it sounds good after break in, despite my own tests at the store in which I found it unacceptable.
 
Oct 3, 2007 at 3:53 AM Post #80 of 154
Quote:

Originally Posted by IPodPJ
Mirumu,

It could be due to the fact that the Classic has bugs and defects. Some people have no issue with the soundstage, but on mine, played through the headphone output, the soundstage is cockeyed, with the left channel being pushed towards the rear of my head. Maybe some parts in mine are just wearing out as I use it, I honestly don't know.



Yes, I have no issue with the soundstage myself but when I measured the phase response it looked much the same as the iPods people are complaining about. This makes me think the iPods are probably all much the same, and I can only assume we all just hear it differently. Nothing wrong with that of course.

Quote:

Originally Posted by IPodPJ
But it's not like I use my iPod every hour of the day and would be getting used to its sound signature. I don't use it every day, and when I do I listen for maybe 30 minutes in that day. I only let it "burn in" by itself when I first got it after I listened for awhile, by setting it aside and letting it run while pushing a load through the LOD to the headphone amp.

I never experienced any change in sound signature with my 5g iPod. And I was shocked to find this one changing on me.

Who knows.



I know what you mean, it's impossible to know for sure. It just seems burn in would be very unlikely to affect something like the Classic. I suppose it is possible that some units have been part of a random sample that had heavier testing (hence burn-in) prior to leaving the factory.
 
Oct 3, 2007 at 3:57 AM Post #81 of 154
Quote:

Originally Posted by Stoney
Much of the iPod is analog output. Plus, even digital circuitry is in fact analog. "Digital" is an abstraction that is actually implemented by sample-and-hold analog circuits, filters, etc.


That's technically correct, but unless it's enough of a variation to flip bits or throw timing off, it's going to have absolutely no effect on the sound. You'd have to consider a digital device defective if its tolerances were poor enough to allow bit flips under burn-in conditions.
 
Oct 3, 2007 at 5:35 AM Post #82 of 154
Quote:

unless it's enough of a variation to flip bits or throw timing off then it's going to have absolutely no effect on the sound.


Not so. First, the voltage that a set of bits is converted to is done by circuitry that is analog, and is prone to hysteresis and other analog errors. So, sound out is not identical just because bits are identical.

Second, the voltage that a digital "word" converts to has to be outputted at the exactly correct moment. "Timing thrown off" as you put it. A time error is indistinguishable from a voltage error. Digital signals are reclocked with phase-lock loops, but they aren't as stable as we'd like to believe. Just like a rocket is always correcting its course, so is a loop always adjusting its frequency. This spectrum of the "course corrections" is transferred directly into the music.

Think of an x-y plot of a straight line of points. If all the Y's are the correct values but the X's are a bit off, the line is no longer straight.
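The x-y analogy can be sketched numerically. This is a hypothetical illustration with made-up numbers (a 1 kHz tone and 5 ns of jitter), not a measurement of any real player: sample a sine wave at the ideal instant and again with a small timing error, and the timing error shows up as an amplitude error even though the "y values" were computed perfectly.

```python
import math

# Hypothetical illustration: sample a 1 kHz sine wave at the ideal
# instant, then again with a small timing (jitter) error, and compare
# the values. A time error becomes a voltage error.
FREQ = 1000.0    # signal frequency in Hz (assumed for illustration)
JITTER = 5e-9    # 5 ns timing error (assumed for illustration)

def sample(t):
    """Value of a full-scale sine at time t."""
    return math.sin(2 * math.pi * FREQ * t)

t_ideal = 0.000125                 # a point on the steep part of the wave
ideal = sample(t_ideal)
jittered = sample(t_ideal + JITTER)

error = abs(ideal - jittered)
print(f"amplitude error from 5 ns of jitter: {error:.2e}")
```

The error is tiny for a 1 kHz tone; whether errors of this size are audible is exactly what this thread is arguing about.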

Jitter is very real and very audible. So much so that, to my surprise, the same CD burnt at different speeds sounded different, and the frequency band most affected went down with the burn speed. Others report the exact same thing. Bit by bit, the disks read the same, but players can't fully reclock out the jitter also encoded on the discs.

But, with MP3 players, it is much, much worse. The music is approximated by brief wavelets summed together, and has to be decoded. There is much variation and even judgment calls as to what sounds best in the decoding. No two algorithms should be expected to sound the same. Different output chip sets and algorithms may sound noticeably different.

Digital Domain - Jitter (in CDs, never mind compressed formats!)
 
Oct 3, 2007 at 5:27 PM Post #83 of 154
Jitter is completely inaudible in most home electronics; and as a problem, it is insignificant compared to the other issues facing even the most well-thought-out rigs. Burn errors are burn errors. They result in data loss, not jitter.

See ya
Steve
 
Oct 3, 2007 at 8:02 PM Post #84 of 154
Quote:

Originally Posted by Stoney
Not so. First, the voltage that a set of bits is converted to is done by circuitry that is analog, and is prone to hysteresis and other analog errors. So, sound out is not identical just because bits are identical.

But, with MP3 players, it is much, much worse. The music is approximated by brief wavelets summed together, and has to be decoded. There is much variation and even judgment calls as to what sounds best in the decoding. No two algorithms should be expected to sound the same. Different output chip sets and algorithms may sound noticeably different.

Digital Domain - Jitter (in CDs, never mind compressed formats!)



Good explanation. So assuming Apple played around and came up with the best compromise, resulting in a decent chipset and algorithm combo, we can expect the baseline sound in the Classic to be pretty decent. (That's why there are clearly audible differences between different gens of iPods, including the Classic: different chipset/algorithm combos.)

But again I have to ask: even assuming that a change in sound quality is possible (for whatever reason) after "burn in", why would the change necessarily be a good one? That is, starting from Apple's decent sound baseline, why wouldn't burn in cause degradation rather than improvement? Why not cause, for example, decreased bass or overwhelming bass rather than tamed treble? Why not scooped or recessed mids? I'll concede that there are analog components in a digital player, and that the DAC has an analog stage, but to me this would not be enough to cause a positive change in sound quality the way analog speakers benefit from burn in across the audible spectrum.
 
Oct 3, 2007 at 8:59 PM Post #85 of 154
Quote:

Originally Posted by Stoney
Not so. First, the voltage that a set of bits is converted to is done by circuitry that is analog, and is prone to hysteresis and other analog errors. So, sound out is not identical just because bits are identical.


You're not talking here about the same thing you originally described. You explicitly mentioned digital circuitry before. The analog conversion circuitry (i.e. the DAC) is certainly affected by analog errors since that is where the conversion to analog is done; that circuit is in essence analog, not digital. A good power supply is crucial at this point, and it's worth pointing out that the batteries used in a typical DAP are much better than the power socket used by a typical CD player. In the purely digital circuitry higher up the chain, the fact that any single bit has a slightly different voltage is neither here nor there as long as the bit is correctly read as a 1 or a 0, and the tolerances make a misread very unlikely.
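The noise-margin point can be sketched in a few lines. This is a hypothetical illustration with assumed logic levels (3.3 V / 0 V) and noise far larger than any plausible burn-in drift: the noisy voltages all differ, but as long as each one lands on the right side of the threshold, the decoded bits are identical.

```python
import random

random.seed(42)

# Hypothetical sketch: represent 1s and 0s as 3.3 V / 0 V logic levels,
# add +/-0.5 V of analog noise to every level, then recover the bits
# with a simple midpoint threshold. The voltages vary; the bits do not.
bits = [1, 0, 1, 1, 0, 0, 1, 0]
HIGH, LOW, THRESHOLD = 3.3, 0.0, 1.65

noisy = [(HIGH if b else LOW) + random.uniform(-0.5, 0.5) for b in bits]
decoded = [1 if v > THRESHOLD else 0 for v in noisy]

print(decoded == bits)  # prints True: the noise never crosses the threshold
```

With these assumed levels a "high" never drops below 2.8 V and a "low" never rises above 0.5 V, so the 1.65 V threshold always recovers the data exactly; burn-in drift would have to be enormous before it mattered.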

Quote:

Originally Posted by Stoney
Second, the voltage that a digital "word" converts to has to be outputted at the exactly correct moment. "Timing thrown off" as you put it. A time error is indistinguishable from a voltage error. Digital signals are reclocked with phase-lock loops, but they aren't as stable as we'd like to believe. Just like a rocket is always correcting its course, so is a loop always adjusting its frequency. This spectrum of the "course corrections" is transferred directly into the music.

Think of an x-y plot of a straight line of points. If all the Y's are the correct values but the X's are a bit off, the line is no longer straight.

Jitter is very real and very audible. So much so that, to my surprise, the same CD burnt at different speeds sounded different, and the frequency band most affected went down with the burn speed. Others report the exact same thing. Bit by bit, the disks read the same, but players can't fully reclock out the jitter also encoded on the discs.

But, with MP3 players, it is much, much worse. The music is approximated by brief wavelets summed together, and has to be decoded. There is much variation and even judgment calls as to what sounds best in the decoding. No two algorithms should be expected to sound the same. Different output chip sets and algorithms may sound noticeably different.

Digital Domain - Jitter (in CDs, never mind compressed formats!)



Jitter on CDs is in practice a substantially worse problem than it is on a DAP. DAPs are buffered devices: data transfer from the hard drive, to memory, and out to the codec is in essence perfect, done at extremely high speeds between buffers, well above the rate at which the DAC outputs audio. There is no need for the reclocking you mention here, as data transfer will use the same clock on both ends and will end up back in a buffer, so any potential variation is factored out again by the time the word reaches its destination.
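The buffering argument can be sketched as a simple FIFO. This is a hypothetical model, not iPod firmware: samples arrive from storage in irregular bursts, but the output side drains the queue one sample per tick of its own local clock, so the irregular arrival timing never reaches the DAC.

```python
from collections import deque

# Hypothetical sketch of a buffered playback chain: bursty input,
# fixed-rate output. The DAC side only ever sees the local clock.
fifo = deque()

def burst_in(samples):
    """Storage side: dump a burst whenever the disk happens to deliver."""
    fifo.extend(samples)

def clock_out(n):
    """DAC side: pull exactly one sample per local clock tick."""
    return [fifo.popleft() for _ in range(n)]

burst_in([10, 20, 30])   # big early burst
burst_in([40])           # straggler arriving late
out = clock_out(4)
print(out)  # samples emerge in order, paced only by the output clock
```

As long as the buffer never underruns, when and how raggedly the data arrived is invisible downstream; only the quality of the local output clock matters.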

Yes, there can be variation in the implementation of each codec, but any individual codec will be consistent in its implementation. Any imperfections the codec generates will always exist, and the algorithm will not change with burn-in. Additionally, the codec's decoding algorithm is only a factor with lossy audio; with lossless/raw formats the data coming out should always match the original data exactly.

The transfer from the codec to the DAC is the place where jitter can potentially occur; typically this is the weak link of a CD-based system, where you hook your digital output up to an external DAC. Clock information is not transferred over a typical SPDIF or AES/EBU link; instead it is implied by the transfer (a really bad idea). Some high-end CD player/DAC combinations get around this by having a separate cable carry the clock information purely to minimize jitter. In a DAP like the iPod, transfer from the codec to the DAC is done over I2S. The whole purpose of I2S is to transfer not only the audio data stream but also the clock to the DAC, specifically to minimize jitter. In the iPod Classic this I2S link would appear to be an incredibly short trace internal to the codec chip, so the chance of clock skew is lessened even further. Yes, it's theoretically possible to get some jitter at this stage, but the chance is vastly lower than with a CD player connected to an external DAC.

None of what I am saying here contradicts the page you linked to either. There are a lot of factors involved when it comes to jitter, and you will also find many people who flatly deny any effects of jitter; it's a controversial topic. For example, the waveform graphs shown on the page you linked to in effect show a square wave around 22,050 Hz. The reason for using that specific frequency is that the highest frequencies are the worst cases for jitter, since there are so few samples per cycle. In practice though, most people can't hear frequencies that high, and most players will filter them out, so that's never a situation you will see in the real world. At lower frequencies the representation of the waves would look substantially more accurate even with jitter, since there are more samples.
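The "worst case at the highest frequencies" point falls out of a back-of-envelope bound: the maximum slope of a full-scale sine of frequency f and amplitude A is 2·pi·f·A, so a timing error dt can cause at most roughly 2·pi·f·A·dt of amplitude error, growing linearly with frequency. A quick check with an assumed 1 ns of jitter:

```python
import math

# Back-of-envelope bound (hypothetical numbers): the worst-case
# amplitude error a timing error dt can cause in a full-scale sine
# of frequency f is limited by the maximum slope, 2*pi*f*A.
def worst_case_error(freq_hz, jitter_s, amplitude=1.0):
    """Upper bound on amplitude error caused by a timing error."""
    return 2 * math.pi * freq_hz * amplitude * jitter_s

JITTER = 1e-9  # assume 1 ns of jitter for illustration
for f in (100, 1000, 22050):
    print(f"{f:>6} Hz -> max error {worst_case_error(f, JITTER):.2e}")
```

The bound at 22,050 Hz is over 200 times larger than at 100 Hz, which is why jitter demonstrations use signals near the top of the audio band rather than midrange tones.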

There is of course still another way jitter can get into your DAP. If your CD ripper does not try to correct jitter right from the start, then your digital files will contain jitter before they are ever transferred to your DAP, and it will carry all the way through the chain to the DAC used for playback. This is why people should rip with EAC, Max, etc.
 
Oct 3, 2007 at 9:00 PM Post #86 of 154
I'm not sure if this has been discussed already but has anyone addressed the click-wheel?

I was at Best Buy a few days ago, and when I tried out the 80 GB Classic I was disappointed in the click wheel. It's not nearly as responsive as my 4th gen's click wheel. You have to put a lot more pressure on it to register, and it doesn't feel as precise either. Normally I would have just dismissed this as a floor-model malfunction caused by all the Best Buy customers continually playing with it, but I've heard from numerous sources that the wheel is, in fact, less sensitive this time around (not to mention the Classic is so new that I doubt it would have been damaged so soon).

Does anyone have any insight into this 'problem'? I'm dying to purchase a Classic to replace my broken monochrome 20 GB 4G, but a bad click wheel (not to mention the sluggish menus) could really turn me off. What's the verdict on the wheel?
 
Oct 3, 2007 at 9:08 PM Post #87 of 154
mirumu:

Some good points. Keep in mind that noise in a computer is pretty high, and in principle can cross-couple, even intermodulate, and cause jitter. I mention CDs just to illustrate easily, to those that may be new to jitter.

I disagree that you need to minimize jitter before ripping. Encoding is not done in the time domain, so "bits is bits" applies there. Jitter on playback is always removable by an ideal clock. This is why improved clocks and buffering help in high-end players.

Jitter during recording is irrevocable.
 
Oct 3, 2007 at 9:41 PM Post #88 of 154
Quote:

Originally Posted by Stoney
mirumu:

Some good points. Keep in mind that noise in a computer is pretty high, and in principle can cross-couple, even intermodulate, and cause jitter. I mention CDs just to illustrate easily, to those that may be new to jitter.



Certainly, computers are some of the noisiest devices around, with all their different clocks. DAP makers do try to minimize this, and DAPs are a lot better than the average home computer, although how successful they are overall compared to something like a CD player is debatable.

Quote:

Originally Posted by Stoney
I disagree that you need to minimize jitter before ripping. Encoding is not done in the time domain, so "bits is bits" applies there. Jitter on playback is always removable by an ideal clock. This is why improved clocks and buffering help in high-end players.


Some cheap and/or older computer CD drives use different strategies for reading audio CDs and data CDs: they stream the bits without performing any significant error handling, and jitter errors can slip through. But yes, most decent CD drives these days treat all CDs the same (since there's no advantage in the old method) and will detect and correct any jitter problems before the data hits the buffers. Disc damage would be a much more pertinent concern for sure.
 
Oct 3, 2007 at 9:55 PM Post #89 of 154
Quote:

Originally Posted by Stoney

I disagree that you need to minimize jitter before ripping. Encoding is not done in the time domain, so "bits is bits" applies there. Jitter on playback is always removable by an ideal clock. This is why improved clocks and buffering help in high-end players.

Jitter during recording is irrevocable.



I thought that jitter introduced when ripping CDs, while potentially making them sound worse, was NOT actually wedded to the data itself, and as such could in fact be removed by doing a new, clean extraction and rip.
 
Oct 3, 2007 at 10:04 PM Post #90 of 154
Ripping and data transfer to storage has nothing to do with jitter. As long as the data gets from point A to B, jitter is irrelevant.
 