tfarney
1000+ Head-Fier
- Joined
- Feb 8, 2008
- Posts
- 1,257
- Likes
- 16
This elusive distortion seems to be the bane of digital audio. Much is done to minimize it, mitigate it, move it to someplace where we can't hear it. But every description of it I've come across seems to boil down to harsh trebles, and there is so much bad mastering in the present and so much bad recording in the past (just listen to some pop records from the 60s), I'm not sure I have any way of differentiating between "jitter" and just badly recorded or mastered material with nasty trebles. Is there a way? Or, with reasonably good equipment, is jitter really much of an issue these days?
Not to be cynical (well, maybe just a bit...), but I seem to be finding two very distinct points of view on harsh trebles and glare in digital audio. One comes from audiophiles and their high-end suppliers, who have them running for cover and spending thousands of dollars on cables and black boxes to eliminate this uniquely digital distortion, and thousands more on tube DACs, pre-amps and amps to mask it. The other comes from recording and mastering pros, who advise us to turn down the treble a bit...
My experience seems to reinforce the latter. I just lived through an example. iTunes was set to global mix and match - randomly picking tunes from my entire collection. It picked the live recording of "Tin Pan Alley" from SRV's "In The Beginning." Harsh trebles. Harsh upper mids. I have my amp set to hinge the treble at 5 kHz, and the treble knob rolled back about 1/4, so when I run into this issue I can just reach over there and release the tone defeat switch, which I did, but that wasn't enough.
Jitter?
I switched tone defeat back in and played "Tin Pan Alley" off of "Couldn't Stand the Weather" instead. Smooth as a baby's back side. Warm. Sweet. The ride cymbal and the muted pop of the snare drum's rim were still bright and present, but all traces of upper mid and treble harshness were gone.
No, not jitter, not this time -- recording, mastering. And this is with a cheap data-grade USB cable from hard drive to Mac and whatever DAC chip is on my iBook's soundcard.
So what does jitter sound like?
Tim