@Jawed I don't disagree with you that, dollar for dollar on performance, GPUs will lose out to FPGAs, which will in turn lose out to ASICs... that's just the tradeoff between general purpose and special purpose. Nor do I disagree that the current market does not support nVidia making GPUs viable for DACs (because nVidia won't make them available in a package suitable for a DAC -- the DAC market is too small for them to even care to pursue -- which I think was my initial comment).
But... if someone were to strip apart a DAC and put just the pulse array and other analog portions into a "bare bones" DAC, provide a USB driver with an SDK to access it, and maybe an open-source reference implementation of the entire DSP portion -- with all that, I would maintain there is more than enough power in an $800 nVidia card today to handle the DSP and match the capabilities of BluDAVE. (The footnote, perhaps, is that I have not verified the floating-point precision needs, but CUDA does support 64-bit floating point.)
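To make that concrete, here's a minimal sketch (mine, not Chord's or nVidia's) of the kind of building block such an open-source DSP layer would need: a double-precision FIR convolution kernel in CUDA. The tap count and buffer sizes below are placeholders I picked for illustration, not the actual WTA filter parameters.

```
// Hypothetical sketch: a double-precision FIR convolution kernel in CUDA,
// the kind of building block an open-source DSP front end would need.
// Tap count and buffer sizes are placeholders, not Chord's actual filter.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void fir_fp64(const double* __restrict__ in,   // input, padded with n_taps-1 leading zeros
                         const double* __restrict__ taps, // filter coefficients
                         double* __restrict__ out,        // filtered output
                         int n_out, int n_taps)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n_out) return;

    double acc = 0.0;
    for (int k = 0; k < n_taps; ++k)
        acc += taps[k] * in[i + n_taps - 1 - k];   // standard FIR dot product
    out[i] = acc;
}

int main()
{
    const int n_taps = 4096;        // placeholder tap count
    const int n_out  = 1 << 20;     // ~1M output samples per launch
    const int n_in   = n_out + n_taps - 1;

    std::vector<double> h_in(n_in, 0.0), h_taps(n_taps, 1.0 / n_taps);
    h_in[n_taps - 1] = 1.0;         // impulse input, so the output traces the taps

    double *d_in, *d_taps, *d_out;
    cudaMalloc(&d_in,   n_in   * sizeof(double));
    cudaMalloc(&d_taps, n_taps * sizeof(double));
    cudaMalloc(&d_out,  n_out  * sizeof(double));
    cudaMemcpy(d_in,   h_in.data(),   n_in   * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(d_taps, h_taps.data(), n_taps * sizeof(double), cudaMemcpyHostToDevice);

    int block = 256;
    int grid  = (n_out + block - 1) / block;
    fir_fp64<<<grid, block>>>(d_in, d_taps, d_out, n_out, n_taps);
    cudaDeviceSynchronize();

    std::vector<double> h_out(n_out);
    cudaMemcpy(h_out.data(), d_out, n_out * sizeof(double), cudaMemcpyDeviceToHost);
    printf("out[0] = %f (expect %f)\n", h_out[0], 1.0 / n_taps);

    cudaFree(d_in); cudaFree(d_taps); cudaFree(d_out);
    return 0;
}
```

Even a naive kernel like this, with no shared-memory tiling or FFT-based convolution, chews through a lot of FP64 taps per sample on a mid-range card; the only point is that the double-precision path exists and is fully programmable.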
Admittedly, this is only of interest to folks using a PC as a source -- another reason an nVidia-based approach is not on the commercial horizon. This type of solution would appeal more to someone who wants to make exploring audio DSP easy to access, in the hope of fostering some general breakthrough (as opposed to pursuing commercial gain).
Of course, this isn't something I foresee Chord doing, or any other existing audio vendor. Maybe it happens if some enthusiast is willing to finance it personally (get Jeff Bezos interested in audio instead of building the world's baddest clock?)...
I was only reacting to your strong pushback that it's just nVidia marketing hype -- not to the view that we shouldn't expect near-term commercial use, a position with which I agree.
I didn't know whether it was really just hype or not, hence I wanted to run some basic benchmarks and seek clarification.
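For what it's worth, the sort of "basic benchmark" I had in mind is nothing fancy -- something like the sketch below, which times a stream of FP64 fused multiply-adds with CUDA events. The sizes and iteration counts are arbitrary, and the dependent chain in the loop understates peak throughput, so treat the number as a rough lower bound.

```
// Rough sketch of a basic FP64 throughput check with CUDA events.
// Sizes and iteration counts are arbitrary; the serial dependency in the
// loop means this understates peak throughput, so treat it as a lower bound.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void fma_fp64(double* out, int iters)
{
    double a = 1.000001, b = 1e-12, c = threadIdx.x * 1e-9;
    for (int i = 0; i < iters; ++i)
        c = fma(a, c, b);                            // one FP64 fused multiply-add per iteration
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;  // store so the loop isn't optimized away
}

int main()
{
    const int blocks = 1024, threads = 256, iters = 100000;
    double* d_out;
    cudaMalloc(&d_out, (size_t)blocks * threads * sizeof(double));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    fma_fp64<<<blocks, threads>>>(d_out, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);

    // 2 FLOPs per FMA, one FMA per thread per iteration
    double gflops = 2.0 * blocks * threads * iters / (ms * 1e-3) / 1e9;
    printf("elapsed: %.2f ms, ~%.1f FP64 GFLOP/s (lower bound)\n", ms, gflops);

    cudaFree(d_out);
    return 0;
}
```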