Josh83
100+ Head-Fier
The AD5791's datasheet specs are much more convincing re: 20-bit performance than the PCM1704's - even allowing for the 1704's slant toward dynamic specs, it's a bit shy on useful performance numbers at the 20-bit level.
The AD5791, however, gives an awful dynamic distortion number - Mike's secret sauce for dealing with that hasn't been revealed so far.
It would be interesting to see the Yggy's 24-bit sine-fade linearity/THD plot as the digital level drops through -120 dB.
The AP analyzers do this with internal gain switching - you can see this measurement in Stereophile's reviews of high-end audio DACs.
A home soundcard and a 100-1000x amp built with a couple of op amps would do the job, even if averaging were needed to pull the signal out of the noise (yes, I know how to make low-enough-distortion clipping/limiting amps to protect the soundcard input).
Maybe we will eventually get higher-res measurements from an AP analyzer, or even a teardown/internal tour with 'scope pics one day.
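To put a rough number on the averaging idea: coherently averaging N trigger-synchronized captures knocks uncorrelated noise down by about sqrt(N), so a tone well below the soundcard's noise floor becomes measurable. A minimal simulation sketch (all values here - sample rate, tone level, noise RMS - are made-up illustration numbers, not measurements of any DAC):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f, n = 48000, 1000.0, 4800            # hypothetical: 48 kHz rate, 1 kHz tone, 0.1 s capture
t = np.arange(n) / fs
tone = 1e-6 * np.sin(2 * np.pi * f * t)   # 1 uV tone, ~40 dB below the noise floor

def capture():
    # one soundcard grab: the tone plus 100 uV RMS of uncorrelated noise
    return tone + rng.normal(0.0, 1e-4, n)

# average N synchronized captures; residual noise should fall ~sqrt(N)
residual_rms = {}
for N in (1, 100, 10000):
    acc = np.zeros(n)
    for _ in range(N):
        acc += capture()
    avg = acc / N
    residual_rms[N] = np.sqrt(np.mean((avg - tone) ** 2))
    print(f"N={N:>5}: residual noise {residual_rms[N] * 1e6:7.3f} uV RMS")
```

Each 100x increase in capture count buys roughly 20 dB of noise reduction, which is why a patient hobbyist rig can chase signals an AP analyzer reaches with gain switching.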
"First true 20-bit" is just marketing speak (Schiit's or Analog Devices') for the AD5791 DAC chip being super linear all the way down to the 20th bit. R2R DAC chips historically have not been all that linear at the bottom, and the lowest ladder-rung changes fall into the thermal noise of the system as the number of bits increases. That is why the PCM1704UK is so expensive: lasers are used to trim the ladder resistors to incredibly tight tolerances, but any minute change in voltage, temperature, or other real-world factors can throw off linearity at that level. Schiit's position is that there is not much retrievable, useful information down there anyway, so why not use a really good 20-bit DAC chip with guaranteed performance. Maybe someone else can explain better.
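A quick back-of-envelope sketch of the "bottom rungs are in the thermal noise" point (my numbers, not Schiit's or ADI's - the 2 V full scale, resistor value, and bandwidth are all assumed for illustration):

```python
import math

vfs = 2.0                                    # hypothetical 2 V full-scale output
lsb = {b: vfs / 2 ** b for b in (14, 16, 20, 24)}
for b, v in lsb.items():
    print(f"{b}-bit LSB step: {v * 1e6:8.3f} uV")

# Johnson thermal noise of a single 1 kOhm resistor over a 20 kHz audio
# bandwidth at room temperature: v_n = sqrt(4 * k * T * R * BW)
k, T, R, bw = 1.380649e-23, 300.0, 1e3, 20e3
vn = math.sqrt(4 * k * T * R * bw)
print(f"1 kOhm thermal noise: {vn * 1e6:8.3f} uV RMS")
```

With these assumed numbers the 24-bit step (~0.12 uV) is already below the ~0.58 uV thermal noise of one ordinary 1 kOhm resistor, while the 20-bit step (~1.9 uV) sits only a few times above it - which is the gist of the argument above.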
My first Sony CD player (1983) had a single 14-bit DAC, but I could hear a lot of low-level information, reverb tails, etc. It was much better than the 1-bit DAC chips that followed in the late '80s. My hearing is probably 7 bits now anyway. :wink_face:
So, this may be a dumb follow-up, but how does the chip deal with 24-bit audio, if it doesn't throw out any of the original bits?