Can an input level be too high?
Jan 14, 2007 at 11:27 AM Thread Starter Post #1 of 6

Zorander

Headphoneus Supremus
Joined: May 14, 2004 · Posts: 5,493 · Likes: 36 · Location: Sydney, Australia
I recently built myself a pair of XLR-to-RCA interconnects so that I can connect my Solo directly to my DAC (previously it went through my integrated's tape loop, an all-RCA connection). The configuration is: 'hot' terminal (pin 2) as signal and 'cold' terminal (pin 3) as ground, with the shield directionally connected to pin 1.

Most of you will be aware that this configuration results in a higher input voltage to the amp, and thus a higher input level. This is evident too: where I used to listen at the 8:30-9:00 knob position, I am now stuck around the 7:30-8:00 mark. I am wondering whether there is such a thing as too 'hot' an input signal, and whether it could damage an amp circuit in the long run. I'm thinking of e-mailing Graham for his comment, but I don't know his e-mail address (the one given on his website never produced any replies to my previous queries).
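
For a rough back-of-envelope (assuming the balanced output simply swings twice the single-ended level, which I gather is the usual arrangement), the difference works out to about 6 dB:

import math

# Rough sketch with assumed values: the XLR output is taken as twice the
# RCA output, since the hot and cold legs each carry half the signal and
# this wiring taps hot-minus-cold.
v_rca = 1.0          # hypothetical single-ended output, Vrms
v_xlr = 2.0 * v_rca  # full balanced swing taken across pins 2 and 3

gain_db = 20 * math.log10(v_xlr / v_rca)
print(f"Level increase: {gain_db:.1f} dB")  # ~6.0 dB

which is consistent with needing a noticeably lower knob position for the same volume.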

Cheers and TIA!
 
Jan 14, 2007 at 2:49 PM Post #2 of 6
Bump. I'd like to know the answer to this question, too.
 
Jan 14, 2007 at 3:38 PM Post #3 of 6
Although I am not a qualified electronics expert, I can say from personal experience that there is such a thing as a too-hot input signal. As to the elements of amp design that determine this threshold, and where it lies for typical Head-Fier equipment, I defer to others.

The symptoms of a hot input are severe distortion and clipping even at low volumes, IME. For example, the outputs of my non-portable CD/DVD players are too hot for my Total Bithead, which otherwise works perfectly with computer and portable-player sources. The Headamp Gilmores, Hornet, Tomahawk, and Headroom Micro work fine with the CD/DVD players.

I would be curious to know what the typical input limit is; probably somewhere near or slightly above 1.5 to 2 volts. I have heard there is an industry standard; perhaps someone can confirm this.
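
For rough context, here is a quick conversion of the commonly quoted reference levels (about 2 Vrms full-scale from CD players, -10 dBV 'consumer' line level, +4 dBu 'pro' line level); these are general figures, not the specs of any gear in this thread:

import math

def vrms_to_dbv(v):
    # dBV is referenced to 1 Vrms
    return 20 * math.log10(v / 1.0)

def vrms_to_dbu(v):
    # dBu is referenced to 0.7746 Vrms (1 mW into 600 ohms)
    return 20 * math.log10(v / 0.7746)

levels = [("consumer nominal (-10 dBV)", 0.316),
          ("pro nominal (+4 dBu)", 1.228),
          ("typical CD full-scale", 2.0)]

for label, v in levels:
    print(f"{label}: {v:.3f} Vrms = {vrms_to_dbv(v):+.1f} dBV = {vrms_to_dbu(v):+.1f} dBu")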
 
Jan 14, 2007 at 4:03 PM Post #4 of 6
I'm not familiar with industry standards in home audio; however, with car audio, 3 volts of line-out (not a standard, just the desired sweet spot) is what you aim for. There are line drivers to increase the line signal to as much as 8 volts.

The problem lies in what you're connecting that to. Some amplifiers list their specs as "up to 8v input" whereas others will list 3v max. Matching these specs is key to both performance and longevity. Unfortunately, it would seem you are missing documentation of these specs?

The gain is what becomes an issue: if the input voltage is too high, it stays too high down the entire signal path. That can cause anything from a tight-sounding amp to excessive heat, electromigration, or a dead output stage.
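
A simplistic headroom check with made-up numbers (not the specs of the Solo or anything else mentioned here); the idea is just that signal times gain has to stay inside the swing the output stage can deliver:

# Made-up numbers for illustration only -- not any real amp's specs.
input_vrms = 2.0   # hypothetical source output, Vrms
gain       = 8.0   # hypothetical amp voltage gain
max_swing  = 12.0  # hypothetical usable output swing, Vrms

needed = input_vrms * gain
if needed > max_swing:
    print(f"Clipping: needs {needed:.1f} Vrms but only {max_swing:.1f} Vrms is available")
else:
    print(f"OK: {needed:.1f} Vrms fits within {max_swing:.1f} Vrms of swing")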

I'm no engineer, far from it, so I don't know how (or even if) this relates to your situation. I would think 1.5 V to 3 V would be the ideal signal level, as that seems to be the case across the board.

I'd be inclined to change the wiring back to a standard configuration, myself. I don't believe the higher voltage is desirable when you had a good result before.
 
Jan 14, 2007 at 4:56 PM Post #5 of 6
Quote:

Originally Posted by Zorander
I recently built myself a pair of XLR-to-RCA interconnects so that I can connect my Solo directly to my DAC (previously it went through my integrated's tape loop, an all-RCA connection). The configuration is: 'hot' terminal (pin 2) as signal and 'cold' terminal (pin 3) as ground, with the shield directionally connected to pin 1.

Most of you will be aware that this configuration results in a higher input voltage to the amp, and thus a higher input level. This is evident too: where I used to listen at the 8:30-9:00 knob position, I am now stuck around the 7:30-8:00 mark. I am wondering whether there is such a thing as too 'hot' an input signal, and whether it could damage an amp circuit in the long run. I'm thinking of e-mailing Graham for his comment, but I don't know his e-mail address (the one given on his website never produced any replies to my previous queries).

Cheers and TIA!



I had this exact same problem when I first got my SACDmods 555ES. The 555 has a very high output and I had about the same volume range you do, maybe slightly less. While this is annoying, the higher level you are experiencing is not going to damage anything. When you further attenuate the signal with the volume control to get the same listening level, the amp is seeing about the same input voltage as before. You can set the level low enough that there is no distortion or clipping, correct? If not, that would be a concern.
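
To put rough numbers on that (assuming, as in most headphone amps, that the volume pot sits at the input ahead of the gain stage; the values are illustrative, not the Solo's actual figures):

# Illustrative numbers only -- assumes the volume pot attenuates the signal
# before it reaches the gain stage.
old_source = 1.0        # hypothetical RCA output, Vrms
new_source = 2.0        # hypothetical XLR-tapped output, Vrms
listening_level = 0.25  # whatever post-pot voltage gives a comfortable volume

old_attenuation = listening_level / old_source  # pot passes 1/4 of the signal
new_attenuation = listening_level / new_source  # pot passes 1/8 of the signal

print(f"old pot factor: {old_attenuation:.3f}, new pot factor: {new_attenuation:.3f}")
# Either way the gain stage sees the same 0.25 Vrms, so nothing downstream
# is driven any harder than before; only the knob position changes.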

On the actual personal-experience side: I have used the 555ES players since March 2003 and have never had an equipment failure of any type related to the input voltage, with many different tube and solid-state amps. My original 555ES has been on the same amp for the last two years or so. I hope this helps.
 
Jan 15, 2007 at 5:15 AM Post #6 of 6
With the volume knob at zero (and music running into the amp), I can actually hear a very faint sound playing through the phones. I don't know whether that is a serious concern, though.

On the DAC side, I remember reading in the manual that the XLR output terminals run at a higher voltage range than the unbalanced outputs. Whether that is an industry standard or not, I have no idea (and neither does anyone who's replied here, it seems). I'll come back with some values later on.

p.s. Cheers for the reassurance, SACD lover!
 
