FallenAngel
Headphoneus Supremus
EMU1212M looks very nice but not much better than EMU0404PCI. I haven't tried many "super high end" digital sources, but I really do like the EMU0404PCI.
Originally Posted by HiPerFreak: What sound card that you can buy today offers the best possible signal quality on its S/PDIF output? I want to drive a high-end Dolby surround preamp with both stereo and AC3 material, and I want to get the absolute best quality that is possible out of my PC. I would be glad to hear your recommendations!
Originally Posted by gregorio: Any old sound card will do, but one that allows bit-perfect output would be best. If the output is bit perfect then it's bit perfect; a better or more expensive sound card is not going to make it any more perfect. Someone mentioned 192kFS/s, but this is a waste of time for two reasons: firstly, Dolby Digital (AC3) does not support 192k (only 48kFS/s), and secondly, there is just no point to the 192k sample rate anyway, unless you want to see what a full hard disk looks like!
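To put the "full hard disk" point in rough numbers, here is a quick back-of-the-envelope sketch for plain uncompressed stereo PCM (container overhead ignored; the figures are illustrative, not exact file sizes):

```python
# Rough data rate of uncompressed stereo PCM at a few common formats.
# Illustrative only: real files add container/metadata overhead.

def pcm_mb_per_minute(sample_rate_hz, bit_depth, channels=2):
    bytes_per_second = sample_rate_hz * (bit_depth // 8) * channels
    return bytes_per_second * 60 / 1e6  # megabytes per minute

for label, rate, bits in [("16/44.1 (CD)", 44_100, 16),
                          ("24/96",        96_000, 24),
                          ("24/192",      192_000, 24)]:
    print(f"{label:>13}: {pcm_mb_per_minute(rate, bits):5.1f} MB per minute")
```

That works out to roughly 10.6, 34.6 and 69.1 MB per minute respectively, so 24/192 fills a disk at more than six times the CD rate, while (as gregorio notes) Dolby Digital tops out at 48k anyway.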
Originally Posted by linuxworks: The new codecs are a money grab from the music/movie industry. They offer no real benefit, given that most people's playback systems don't even do AC3 justice. AC3 isn't all that bad, and given the context (movies) it's more than good enough. We didn't need new audio standards; we just needed to implement the ones we have WELL. The codecs are a cash grab: force people to re-buy their media and hardware. Brilliant. 'Cept I'm not playing that game again. Been there, done that, bought the White Album too many times to go through THIS again.
Originally Posted by roadtonowhere08: The thing is, there are many concert releases that I hope get re-released so the engineers can have another crack at the mastering with the new codecs. Why have lossy when you can have lossless straight from the soundboard?
Originally Posted by roadtonowhere08: In addition, I am hoping that with the new push for releases with the new codecs and increasing optical disc capacity, studios will realize that having the highest quality sound is a worthwhile cause. Just because most people will never hear the difference, why penalize the people whose systems will reveal it?
Originally Posted by gregorio: Sounds like you may have got the wrong end of the stick or been a victim of marketing. So-called Hi-Rez is a marketing con. It's not a question of having a system that can reveal the difference, because there is no system in the world that can reveal the difference, not even in the million-dollar range, let alone consumer systems. The problem is that these Hi-Rez formats exceed what is possible with electronics and exceed, by orders of magnitude, what the human ear is capable of hearing.

For example, do you have a 24bit DAC? The answer is no, you don't; you just think you do because of marketing. Sure, your DAC can accept a 24bit format file, but it can't output 24bit resolution; in fact it can't even output the full resolution available with 16bit. Recording studios have striven for decades for high sound quality, but they understand the realities of the science behind recording, and that providing so-called hi-rez to the consumer has got nothing to do with better sound quality.

G
Originally Posted by roadtonowhere08: I am going to have to ask you for a source on that claim.
Originally Posted by roadtonowhere08: The point I am trying to make is that since optical storage is much greater now, ditch the lossy codecs and go for a lossless one. It does not have to be 24/192, but the lossy Dolby and DTS codecs are silly to still use. I CAN hear the difference between a lossy Dolby track and an LPCM track. If it is a concert video, there is no reason why there should not be a hi-rez track. It is music, just in a live setting.
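For a sense of the bit-rate gap roadtonowhere08 is describing, here is a rough sketch comparing raw LPCM rates with the lossy rates typically found on DVD. The LPCM figures are simple arithmetic; the lossy figures are typical values quoted from memory, so check the specs of a given disc if it matters:

```python
# Raw (uncompressed) LPCM bit rates for comparison against typical lossy tracks.

def lpcm_kbps(sample_rate_hz, bit_depth, channels):
    return sample_rate_hz * bit_depth * channels / 1000

print(f"stereo 16/48 LPCM: {lpcm_kbps(48_000, 16, 2):.0f} kbps")
print(f"5.1    16/48 LPCM: {lpcm_kbps(48_000, 16, 6):.0f} kbps")

# Typical lossy rates on DVD-Video sit far below the 5.1 figure:
# Dolby Digital is usually 384-448 kbps and DTS roughly 754-1509 kbps
# (typical values from memory, not from this thread).
```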
Originally Posted by gregorio: I don't need to; it's simple logic and basic digital audio theory. Noise-shaped dither (used on all mastered 16bit/44.1k CD releases) provides a dynamic range of up to 139dB in the critical hearing band. Have a look at your DAC: if it does not have a dynamic range of 139dB, it cannot reproduce the full dynamic range which is possible from 16bit. 24bit (with noise shaping) could produce a dynamic range of over 150dB. Most top-class DACs have a dynamic range between 110dB and 125dB. To put this in perspective, the electrons colliding inside a single 1.8k resistor will produce noise at about -138dB. So it's self-evident that the digital audio format exceeds the capabilities of the electronics within any DAC, let alone the capabilities of the human ear. It's also worth noting that no commercially released music ever exceeds a dynamic range of about 60dB, well within the capability of 16bit.

It's not about the capabilities of optical storage, it's about format specifications and 35mm film storage. Dolby Digital is actually recorded on 35mm film between the sprocket holes, so there's no room for higher data rates. The HDTV specification is Dolby Digital, and the DVD-Video specification is Dolby Stereo, Dolby Digital or DTS. As all DVD players adhere to this specification and millions of people have DVD players, the specifications cannot be changed. Hence why Blu-ray was invented, which does support a variety of audio formats.

The differences between, say, a lossy Dolby Digital track and a standard PCM track are quite difficult to hear on consumer equipment, and usually even on professional gear. The chances are that you are hearing differences in mastering rather than the format. Have you noticed a quality difference between DTS and DD (or PCM)? Dolby Digital is likely to be around for a long time yet!

G
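For anyone who wants to sanity-check those numbers, here is a rough sketch of the underlying arithmetic. The assumptions are mine, not gregorio's: the textbook 6.02N + 1.76 dB figure for an ideally dithered quantiser without noise shaping, Johnson-Nyquist thermal noise for the resistor, room temperature, a 20 kHz audio bandwidth and a 2 V RMS full-scale level. Noise-shaped dither pushes the audible noise floor lower still in the most sensitive band, which is roughly where figures like the 139dB come from, and the resistor figure shifts with the assumed full-scale voltage.

```python
import math

# Textbook dynamic range of an ideally dithered N-bit quantiser (no noise shaping).
def quantiser_dr_db(bits):
    return 6.02 * bits + 1.76

# Johnson-Nyquist (thermal) noise voltage of a resistor: v_rms = sqrt(4*k*T*R*B).
def thermal_noise_vrms(r_ohms, bandwidth_hz, temp_k=300.0):
    k_boltzmann = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4 * k_boltzmann * temp_k * r_ohms * bandwidth_hz)

print(f"16-bit: {quantiser_dr_db(16):.0f} dB   24-bit: {quantiser_dr_db(24):.0f} dB")

v_noise = thermal_noise_vrms(1_800, 20_000)   # 1.8 kOhm over a 20 kHz band
full_scale_vrms = 2.0                         # assumed full-scale output level
print(f"1.8k resistor noise: {v_noise * 1e6:.2f} uV rms, "
      f"{20 * math.log10(v_noise / full_scale_vrms):.0f} dB re full scale")
```

With these assumptions you get roughly 98 dB for 16bit, 146 dB for 24bit, and a resistor noise floor in the high -120s dB. The exact figures move with the assumptions, but the broad point stands: the thermal noise of one ordinary resistor already sits well above the theoretical 24bit floor, so no real DAC delivers full 24bit resolution.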