Transformer hum--high VAC or DC offset???
Oct 18, 2007 at 1:30 PM Thread Starter Post #1 of 8

sejarzo

Headphoneus Supremus
Joined
Jan 1, 2006
Posts
1,964
Likes
21
Location
Indiana
I've got a couple of components around that have a hum problem, not through the outputs, but from the case.

One manufacturer suggested that I measure our home's AC voltage, which turned out to be 124.5 to 125 VAC (confirmed by our local utility, whose residential spec is 120 VAC +/- 5%). The manufacturer's response was that the unit was designed for 115 VAC, so the high supply voltage was the problem. (A friend at the utility suggested that one way to push more power over an aging infrastructure is to raise the voltage spec, providing the power increase at the existing current level--not that his firm would do that, of course, and you didn't hear any of that from him.)

Someone suggested buying a Variac to see whether decreasing the supply voltage fixed the problem, but I never wanted to spend the money on one. Now, with another component showing the same problem, maybe I should... but that would only diagnose the problem, not solve it!

OTOH, the problem could be due to DC offset in the AC supply, correct?

Is there any easy/safe way to measure that in a home with not much more than a typical DMM? Or is the likelihood of DC offset on our AC circuits in a home not very high?

Thanks!
 
Oct 18, 2007 at 4:35 PM Post #2 of 8
The likelihood of DC offset from the utility is next to nothing. Even if there were some, it wouldn't matter: the transformer will block it anyway. Transformers operate on changing voltage only and will never pass DC.

The hum you hear may be the transformer being pushed to or beyond its limits. When you increase the voltage to a transformer, you can start saturating the core, and it will draw excess current in spikes that create vibration. A simple way to check this is to use the ammeter function of your DMM to measure the current with the transformer plugged in but with no load on it. If it is drawing more than about 5% of its full-load current, it may be starting to enter saturation. Get a transformer rated for 120 V... or a better one... it shouldn't be saturating at 8% overvoltage.
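A back-of-the-envelope way to see why overvoltage pushes a core toward saturation: peak flux density scales linearly with applied voltage via the standard transformer EMF equation, B_peak = V_rms / (4.44 * f * N * A). A quick Python sketch -- the turns count and core area here are made-up numbers, not specs for any real transformer:

```python
def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
    """Peak core flux density in tesla from the transformer EMF equation:
    B_peak = V_rms / (4.44 * f * N * A)."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

# Hypothetical 60 Hz primary: 500 turns on a 6 cm^2 core, chosen so the
# core runs near typical silicon-steel limits at the 115 V nameplate.
N, A, F = 500, 6e-4, 60
b_115 = peak_flux_density(115, F, N, A)
b_125 = peak_flux_density(125, F, N, A)
print(f"B at 115 V: {b_115:.2f} T")
print(f"B at 125 V: {b_125:.2f} T ({100 * (b_125 / b_115 - 1):.0f}% higher)")
```

So a design that already sits close to the knee of the B-H curve at 115 V gets pushed roughly 9% further up it at 125 V, which is where the excess magnetizing current (and the buzz) comes from.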

To check for DC offset at home with a DMM: set the meter to DC volts and probe a receptacle. Pretty simple.
 
Oct 18, 2007 at 7:07 PM Post #3 of 8
Quote:

Originally Posted by Bjornboy81
The likelihood of DC offset from the utility is next to nothing. Even if there were some, it wouldn't matter: the transformer will block it anyway. Transformers operate on changing voltage only and will never pass DC.

The hum you hear may be the transformer being pushed to or beyond its limits. When you increase the voltage to a transformer, you can start saturating the core, and it will draw excess current in spikes that create vibration. A simple way to check this is to use the ammeter function of your DMM to measure the current with the transformer plugged in but with no load on it. If it is drawing more than about 5% of its full-load current, it may be starting to enter saturation. Get a transformer rated for 120 V... or a better one... it shouldn't be saturating at 8% overvoltage.

To check for DC offset at home with a DMM: set the meter to DC volts and probe a receptacle. Pretty simple.



This is only partly correct. A transformer can start to hum when the core is pushed into saturation. An iron core is very efficient and the magnetic flux is often very near saturation of the core. While a transformer will block DC from getting through to the secondary, even a small DC offset can increase the magnetic flux enough to push the core into saturation.
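To put rough numbers on that (all values here are hypothetical, not measurements of any real unit): the DC component of the line voltage sees only the primary winding's DC resistance, so even a fraction of a volt of offset can drive a DC magnetizing current comparable to the transformer's normal AC magnetizing current:

```python
# Rough illustration with made-up numbers: the AC impedance of the
# primary limits the normal magnetizing current, but a DC offset sees
# only the winding's copper resistance.
r_primary = 8.0   # ohms, hypothetical primary DC resistance
i_mag_ac = 0.05   # amps, hypothetical no-load AC magnetizing current

for v_dc in (0.1, 0.5, 1.0):  # volts of DC offset on the line
    i_dc = v_dc / r_primary
    print(f"{v_dc:.1f} V offset -> {i_dc * 1000:.1f} mA DC "
          f"({i_dc / i_mag_ac:.2f}x the AC magnetizing current)")
```

That DC bias shifts the whole flux swing off-center, so one half of each cycle runs closer to (or into) saturation even though the secondary never sees the DC.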
 
Oct 18, 2007 at 7:17 PM Post #4 of 8
Quote:

Originally Posted by Cauhtemoc
An iron core is very efficient and the magnetic flux is often very near saturation of the core.


1. Efficiency depends on the grade of steel used in the core.
2. Who says it is often very near saturation of the core? That is solely up to the design of the transformer. Most 115 V transformers should handle 125 V with no problem. If the designer is pushing the limits of the steel, then it's a very ****** design.

Quote:

Originally Posted by Cauhtemoc
While a transformer will block DC from getting through to the secondary, even a small DC offset can increase the magnetic flux enough to push the core into saturation.


I agree, but it takes more than a "small DC offset". It would need to be a decent offset (5-10% of Vac) to make a significant difference to the core permeability.
 
Oct 18, 2007 at 11:33 PM Post #5 of 8
I wasn't worried about DC offset making it through the transformer, I was concerned about the loud physical hum--some references indicated that offset on the primary side might be the culprit. There was no typical 60 Hz hum heard via the signal outputs, but there might as well have been because the thing could be heard vibrating above soft passages in the music.

I also wanted to confirm that I could measure DC offset directly... let's just say I had a somewhat surprising incident in my past when attempting to measure high VAC while not entirely sure about some unusual symbols on the connections of a new DMM. RTFM, I know, I know...

Anyway, I just confirmed no DC offset is present and the voltage is running 123.7 VAC here.

Some additional Googling indicated another user of this particular component went through a string of them at a dealer before he found one that didn't hum, so it's likely just a cheap xfo that is the cause.

I am curious, though, as to what the spec is for various utilities. When I was growing up in the '60s, we always called it "110", not "120". I presume that supports my friend's story that utilities have tended to raise the voltage over the past 40 years or so.
 
Oct 19, 2007 at 12:58 PM Post #6 of 8
I don't know about this whole "increase the voltage over time" thing. Since most household/commercial/industrial loads are passive, if you increase the voltage, you increase the current as well, thus increasing the power consumption too. That defeats the purpose of "more power at the existing current level".

However, in the '60s I wasn't even a twinkle in my daddy's eye... because he was only like 10.


As far as 110 or 120 goes: it's 120. It can fluctuate, but the standard is 120 V.

Back to the original problem/question. Sorry to hear about the shoddy transformer. It happens (bad bracketing, inadequate varnishing or bake times, etc.), but it shouldn't with decent quality control. Good luck finding another... check out Hammond transformers.
 
Oct 19, 2007 at 1:32 PM Post #7 of 8
Quote:

Originally Posted by Bjornboy81
I don't know about this whole "increase the voltage over time" thing. Since most household/commercial/industrial loads are passive, if you increase the voltage, you increase the current as well, thus increasing the power consumption too. That defeats the purpose of "more power at the existing current level".

However, in the '60s I wasn't even a twinkle in my daddy's eye... because he was only like 10.


As far as 110 or 120 goes: it's 120. It can fluctuate, but the standard is 120 V.

Back to the original problem/question. Sorry to hear about the shoddy transformer. It happens (bad bracketing, inadequate varnishing or bake times, etc.), but it shouldn't with decent quality control. Good luck finding another... check out Hammond transformers.



I'm a bit unclear as to your first point... the power consumed by a motor, for instance, is primarily a function of the load applied to the shaft. Conservation of energy: if the motor drew the same current at a higher voltage without an increase in shaft load, the increased power consumption would have to be dissipated somewhere else as heat.

Electrical resistance heating should follow the same principle--heating units would simply run for a shorter length of time if their duties remained constant. With a lot of users on the grid, it all balances out.
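That duty-cycle argument can be sketched in a few lines of Python -- the element resistance and energy demand below are made-up illustration values, not specs for any real heater:

```python
# A fixed-resistance heating element delivering a fixed amount of heat:
# power scales as V^2 / R, so the on-time needed for the same energy
# scales as 1 / V^2. The thermostat just cycles it less at higher line.
r = 26.5           # ohms, hypothetical element (~500 W at 115 V)
energy_wh = 1000.0 # heat the thermostat calls for, in watt-hours

for v in (115, 125):
    p = v**2 / r                # instantaneous power at this line voltage
    hours = energy_wh / p       # on-time to deliver the same energy
    print(f"{v} V: {p:.0f} W, runs {hours:.2f} h for {energy_wh:.0f} Wh")
```

Same energy drawn from the grid either way; only the duty cycle changes.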

Edison's original plan was to use 110 VDC to get adequate light intensity out of his bulbs at the time, and most references say that value just carried over when AC became the standard. Seriously, appliances and electronics in the 1960s were labeled for 110 VAC, not 120 VAC. And I have found a few more references indicating that the desire of utilities to provide more power over existing copper has been the main driving force behind the increased standard.

Worldwide, it seems that most systems are on 220/230/240 VAC, and that must be why the unit in question has a switch for 230 or 115 V operation. The manufacturer of the first component that had a hum problem in our home is out in California, where I presume voltage dropping below 115 VAC is not uncommon, based on news reports. Here, I have never measured our voltage at anything less than 123 VAC. Maybe those are reasons for designing a unit for 115 VAC rather than 120 VAC?

The unit I'm most concerned about is a purchased unit that I am donating to a school music program, not a DIY build. (That environment pretty much demands a UL listed component.) I posted my question here on DIY because there is a lot of pseudo-voodoo-science misunderstanding on some of the other forums, you know?
 
Oct 19, 2007 at 4:29 PM Post #8 of 8
Quote:

Originally Posted by sejarzo
I'm a bit unclear as to your first point... the power consumed by a motor, for instance, is primarily a function of the load applied to the shaft. Conservation of energy: if the motor drew the same current at a higher voltage without an increase in shaft load, the increased power consumption would have to be dissipated somewhere else as heat.


I kind of see what you're saying. I guess it would depend on the application. If you have a ventilation fan running 24/7 and you increase the voltage on that motor, you're going to increase the current as well. Where is this extra power going? Increased speed: same torque + higher speed = more power.

Quote:

Originally Posted by sejarzo
Electrical resistance heating should follow the same principle--heating units would simply run for a shorter length of time if their duties remained constant. With a lot of users on the grid, it all balances out.


Again... application based. Heating units, sure, they would follow this principle. Incandescent lights, no. P = V^2/R: resistance stays the same, so power goes up with the square of the voltage. And the difference in lumens between 115 V and 125 V is pretty minuscule.
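A quick check of that V^2/R scaling for the voltages in this thread (note this assumes fixed resistance; a real incandescent filament's resistance rises with temperature, so the actual power increase is somewhat smaller):

```python
# Fixed-resistance approximation: power ratio between 125 V and 115 V
# line is (125/115)^2, i.e. quadratic in voltage, not exponential.
ratio = (125 / 115) ** 2
print(f"125 V vs 115 V power ratio at fixed R: {ratio:.3f}")  # ~1.18
```

So roughly 18% more power for an 8.7% voltage increase.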

Quote:

Originally Posted by sejarzo
Edison's original plan was to use 110 VDC to get adequate light intensity out of his bulbs at the time, and most references say that value just carried over when AC became the standard. Seriously, appliances and electronics in the 1960s were labeled for 110 VAC, not 120 VAC. And I have found a few more references indicating that the desire of utilities to provide more power over existing copper has been the main driving force behind the increased standard.


That's interesting. Do you have links to them?

Quote:

Originally Posted by sejarzo
Worldwide, it seems that most systems are on 220/230/240 VAC, and that must be why the unit in question has a switch for 230 or 115 V operation. The manufacturer of the first component that had a hum problem in our home is out in California, where I presume voltage dropping below 115 VAC is not uncommon, based on news reports. Here, I have never measured our voltage at anything less than 123 VAC. Maybe those are reasons for designing a unit for 115 VAC rather than 120 VAC?


Could very well be. I've never measured less than 123 or so either. I think pretty much any unit rated at 110, 115, or 120 V (for US use) will work at any of those voltages +-5%. It's one of those "people are used to it, so we won't change it" things.

Quote:

Originally Posted by sejarzo
The unit I'm most concerned about is a purchased unit that I am donating to a school music program, not a DIY build. (That environment pretty much demands a UL listed component.) I posted my question here on DIY because there is a lot of pseudo-voodoo-science misunderstanding on some of the other forums, you know?


I hate UL. One of my jobs here at work is to help in listing products. It's fun to test, but I hate the paperwork and politics.

You get pretty good info around here. I know what you mean about other sites... people think they know what they're talking about but don't have a clue. It's easy to be a genius behind a keyboard.
 
