MojoAudio
Member of the Trade: Mojo Audio
- Joined
- Feb 6, 2012
- Posts
- 126
- Likes
- 86
Erin makes some very good points.
But please understand that my points are intended as oversimplifications for the novice, not as points in a highly technical debate.
Also consider that roughly 80% of the music we hear is in the midrange, above 100Hz and below 3kHz.
And consider that the peak-to-peak voltage of a power supply is only one factor: things like dynamic response, ringing, and odd harmonics play a significant role in what our ears perceive as "right."
For the record, I'm not a big fan of the TDA15xx DAC chips in general. They are relatively noisy, not particularly linear, and have a "grainy" sound relative to the later 18-bit and 20-bit R-2R DAC chips.
The point I was trying to make is that in a "budget" DAC (<$500) you can get better performance from something like a Terra DAC using the TDA1543 chip than from some more advanced designs, because its simplicity lets the builder put the $$$ into a single higher-grade single-voltage external PSU and some fairly high-grade output coupling caps. Once you get into the >$1K price range, I think there are many more favorable options using other R-2R DAC chips.
BTW, I admit I was mistaken about the TDA1543 having an S/PDIF input...remind me not to post at 3am when my cat Rufus wakes me up.
Personally I'm a big fan of the 20-bit R-2R DAC chips such as the PCM63 (my personal favorite) and the AD1862. To my ears they have much less noise and grain than the older R-2R DAC chips and better articulation than the more modern "hybrid" 24-bit R-2R DAC chips.
But that is just my personal preference...some audiophiles and manufacturers prefer the sound of the TDA1541 or TDA1543...look how many highly regarded (and highly priced) DACs on that list use those chips.
As for 16/44.1 being all that is required, I don't agree. At the same time, I think 24/192 is a bit on the ridiculous side. I agree with some renowned recording engineers who claim that somewhere around 20/96 is about the maximum that playback electronics and the human ear can distinguish under the best of conditions. Of course, we never get to hear things under "the best of conditions," so far less than 20/96 is likely optimal for commercial recordings.
Consider that those least significant bits (LSBs) are rarely used on any commercial recording. See if you can actually find a commercial recording with greater than 16 bits of dynamic range, despite it being sold in a so-called HD 24-bit format.
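For anyone who wants the arithmetic behind those numbers, here's a back-of-the-envelope sketch (nothing from a datasheet, just the standard textbook formulas: Nyquist bandwidth is half the sample rate, and an ideal N-bit PCM quantizer has a theoretical dynamic range of about 6.02·N + 1.76 dB):

```python
def nyquist_khz(sample_rate_khz: float) -> float:
    """Usable audio bandwidth of a PCM format: half the sample rate."""
    return sample_rate_khz / 2

def dyn_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit quantizer: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for rate, bits in ((44.1, 16), (96, 20), (96, 24), (192, 24)):
    print(f"{bits}-bit/{rate} kHz: bandwidth {nyquist_khz(rate):.1f} kHz, "
          f"theoretical dynamic range {dyn_range_db(bits):.1f} dB")
```

The point: 16/44.1 already gives ~22 kHz of bandwidth and ~98 dB of theoretical dynamic range, which is more than almost any commercial master actually uses, so the LSBs of a 24-bit release mostly encode noise.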
So when some people argue "I can hear the difference," they are in fact hearing the difference in the modern low-noise remastering, or some other factor, and not the LSBs above 20 bits.
I've been exhibiting at audiophile shows for nearly a decade, and my favorite thing to do is to play a well-recorded 16-bit Red Book recording and ask people, "What resolution do you think that is?"
Most will tell you it's a 24/96 or 24/192.
When I tell them, "You're listening to a 16/44.1 Red Book CD," their jaws drop in disbelief, and they question how it could sound as good as or better than the 24/192 and DSD recordings they've heard in other rooms.
One year at RMAF we even held a contest: we played a series of 16/44.1, 24/96, and 24/192 recordings and asked people to mark down the digital resolution of each on a sheet, with prizes awarded to the most correct entries. Long story short: across all the entries over the three days of the show, the best anyone got was 1 out of 5 correct. Seriously.
What does that tell you about recording, mixing, and mastering quality vs. HD file formats?
And back on topic, what does that tell you about the potential of a vintage R-2R DAC?