Why isn't a Soundcard powered by the PSU's 12V rail
Aug 5, 2007 at 10:09 AM Thread Starter Post #1 of 5

Sir Nobax

500+ Head-Fier
When I was reading all kinds of things about DIY amps, it became clear that an amp performs best when run near its maximum supply voltage, which would be around ±18V (36V across both rails) for most op-amps in that class (IIRC).

A PCI slot can provide +3.3V and +5V, where both rails can output around 3A (so about 25W per PCI slot).

So without a DC-DC converter the op-amps will get an incredibly low voltage.
And even with a DC-DC converter, costs could be reduced if the 12V rail from a Molex connector were used for each stereo channel.
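
To put rough numbers on this (just a back-of-the-envelope sketch, assuming the ~3A-per-rail figure above and an op-amp whose output can swing to within about 1.5V of each supply rail):

# Rough PCI slot power budget vs. the output swing an op-amp could manage
# on different supplies. Illustrative assumptions only: ~3A per slot rail
# and ~1.5V of headroom lost to each rail.

rails = {"+3.3V": (3.3, 3.0), "+5V": (5.0, 3.0)}  # (volts, amps)
slot_watts = sum(v * a for v, a in rails.values())
print(f"Approximate PCI slot budget: {slot_watts:.1f} W")  # ~24.9 W

def max_swing(total_supply_v, headroom_v=1.5):
    """Peak output swing (one side) when the supply is split around mid-rail."""
    return total_supply_v / 2 - headroom_v

for supply in (5.0, 12.0, 36.0):  # slot rail, Molex 12V, a +/-18V supply
    print(f"{supply:>4.0f} V supply -> roughly +/-{max_swing(supply):.1f} V of swing")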

Did I miss something? Does the DAC somehow pump up the voltage, do soundcard makers not care about the voltage of their op-amps, or is it all magic?
 
Aug 11, 2007 at 12:35 PM Post #2 of 5
I had a chat with someone who had a hand in the development of the PCI standard, and here is what he offered on this subject:

Quote:

I’m honestly not really sure what to say. The trend in PCs is to reduce power consumption, and PCI was an advancement in that regard. Originally PCI was a 5V spec; with PCI 2.0 it moved to a 3.3V spec (but with 5V lines for backwards compatibility). All modern PCI cards use only the 3.3V rail as a result.

So I guess my question is: why would someone expect PCI to be supplied with any arbitrary voltage? I can’t really explain why 12V wasn’t extended to PCI specifically, but if I had to guess it had to do with traces, crosstalk and other noise issues; the more power you run across any bus, the more interference you have to deal with. My own involvement is mostly with the PCI 2.1/2.2 power management specification, and the goal at that point was simply to get things as low as possible in terms of power consumption…


HIHs. Cheers!
 
Aug 11, 2007 at 1:02 PM Post #3 of 5
Ah yes, that does make sense. More voltage causes more crosstalk. To retain quality, the PCB design would have to be a lot more complex than a simple 5V PCB.

Btw, the PCB M-Audio (IIRC) uses on one of their cards looks very smart: it was made of two PCBs attached to each other, with the needed connections from one PCB to the other made by copper pins.
Its specifications weren't that great, but it looked very nice.
 
Aug 11, 2007 at 6:03 PM Post #4 of 5
Quote:

Originally Posted by Sir Nobax
When I was reading all kinds of things about DIY amps, it became clear that an amp performs best when run near its maximum supply voltage, which would be around ±18V (36V across both rails) for most op-amps in that class (IIRC).


IIRC, many ICs produce more distortion into a given impedance the higher the supply voltage they are run from. And frankly, manufacturers don't really run anything but line-level outputs these days. You might be able to power a pair of 32-ohm headphones into the tens of milliwatts, but nothing like the old days when you could output over 1,000mW!
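
For a rough sense of scale (a back-of-the-envelope sketch only, assuming an ideal op-amp that swings to within ~1.5V of each rail, a full-scale sine, and no output resistors or current limits):

# Approximate maximum sine power into a headphone load for a given total
# supply voltage. Illustrative numbers only, not measurements of any card.

def max_power_mw(total_supply_v, load_ohms=32.0, headroom_v=1.5):
    v_peak = total_supply_v / 2 - headroom_v   # peak output swing
    v_rms = v_peak / 2 ** 0.5                  # RMS of a full-scale sine
    return 1000 * v_rms ** 2 / load_ohms       # P = Vrms^2 / R, in mW

for supply in (5.0, 12.0, 24.0):
    print(f"{supply:>4.0f} V supply -> ~{max_power_mw(supply):.0f} mW into 32 ohms")

On those assumptions a 5V rail lands in the tens of milliwatts, while 12V or more gets into the hundreds, which matches the gap described above.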

Personally, I'm so happy with this SB16 project system for headphone listening, I definitely think soundcard manufacturers should just add a Molex connector to the card, like video card manufacturers do, to power a real output op-amp.
 
Aug 11, 2007 at 6:21 PM Post #5 of 5
I think it would be better if the soundcard had an external connector for an adapter, so the power for the soundcard would be as clean as can be, coming from a separate source rather than the same PSU. I think this would be optimal with regard to powering an audio card.
 
