UglyJoe
100+ Head-Fier
- Joined
- Dec 11, 2006
- Posts
- 348
- Likes
- 15
I'm having difficulty finding information on how to design a choke-loaded driver stage. I assume that at the operating point the DC conditions will be set by the DC resistance of the choke and the current drawn by the stage. Is there any way to do this with LED biasing of the input valve? Also, for signal, the reactance of the choke would give a loadline that changes with frequency. So the value of the choke should be determined by the lowest frequency needed (say 20Hz) and the resulting reactance that would give a large load... say 10k. I assume this would mean the stage's distortion would be worse for low-frequency signals?
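For what it's worth, the sizing arithmetic described above is just L = X / (2πf). A quick sketch using the figures from the question (10k reactance at 20 Hz; these numbers are only illustrative):

```python
from math import pi

def choke_inductance(reactance_ohms, freq_hz):
    """Inductance needed to present a given reactance at a given
    frequency, from X_L = 2 * pi * f * L rearranged for L."""
    return reactance_ohms / (2 * pi * freq_hz)

# ~10k ohm load at the lowest frequency of interest, 20 Hz
L = choke_inductance(10e3, 20)
print(f"{L:.1f} H")  # roughly 80 H
```

So a 10k load at 20 Hz implies a choke of around 80 H, and the reactance (hence the AC loadline) rises proportionally with frequency above that.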
Assuming I have this right, what's the best way to go about determining the operating point, the correct choke, and the optimal biasing scheme? Is there any way to apply local feedback to the stage to increase the input sensitivity and decrease the output impedance, if that's needed for the stage?