gaboo
100+ Head-Fier
Disclaimer: I never seem to suffer the 'not loud enough' problem with my sources. So...
Question: If the source already has opamps on its output (not a "true" line-out?) that provide sufficient voltage gain, is there any advantage to using an amp that contains a voltage gain stage? The alternative would be to use just an output (i.e. current) stage: buffers etc., which seems to follow the KISS principle.
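To put numbers on "sufficient voltage", here is a back-of-the-envelope sketch. The figures (2 Vrms source, 32-ohm headphones, 100 dB SPL/mW sensitivity) are hypothetical example values I picked, not anything from this thread:

```python
import math

# Hypothetical example values -- substitute your own source and headphones.
v_rms = 2.0   # source output voltage (Vrms), typical of an opamp output stage
z = 32.0      # headphone impedance (ohms)
sens = 100.0  # headphone sensitivity (dB SPL at 1 mW)

p_mw = (v_rms ** 2 / z) * 1000      # power delivered into the load (mW)
spl = sens + 10 * math.log10(p_mw)  # resulting loudness (dB SPL)
i_ma = v_rms / z * 1000             # current the output stage must source (mA rms)

print(f"{p_mw:.0f} mW -> {spl:.0f} dB SPL, needing {i_ma:.1f} mA rms")
# ~125 mW -> ~121 dB SPL at ~62.5 mA rms: the voltage is already ample,
# so what is missing is current drive, i.e. a unity-gain buffer.
```

If the numbers come out like this, the KISS answer holds: a buffer alone covers the load, and any added voltage gain would just be attenuated back down at the volume pot.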
I realize that a multiloop design is better in theory, because you get the extra feedback. So, in an ideal world you'd have no opamps on source outputs. If the source already has amped outputs, what is the best choice? Remove them and use a multiloop amp? Live with them and use an amp with a unity-gain voltage stage? Or simply use some buffers?
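To make the "extra feedback" point concrete, the standard negative-feedback relation applies (a textbook result; A and beta are generic symbols, not values from any particular amp here):

```latex
% Closed-loop gain, and distortion generated inside the loop
% divided down by the loop gain:
\[
  A_{\mathrm{CL}} = \frac{A}{1 + A\beta},
  \qquad
  D_{\mathrm{out}} \approx \frac{D_{\mathrm{buf}}}{1 + A\beta}
\]
% A:      opamp open-loop gain
% beta:   feedback fraction set by the gain resistors
% D_buf:  distortion the buffer contributes when enclosed in the loop
```

In a multiloop amp the buffer sits inside the opamp's feedback loop, so its distortion is divided by the loop gain; a standalone buffer gets no such correction. That is the theoretical case for multiloop, which is exactly what the standalone-buffer option trades away for simplicity.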