How do I measure the power output of my amp with just a multimeter and test tones?

Jan 31, 2005 at 10:16 PM Thread Starter Post #1 of 3

tennisets

1000+ Head-Fier
Joined
Jan 6, 2004
Posts
1,337
Likes
23
Location
Medford, MA
I got a multimeter the other day from Radio Shack, since I'll need one for the tube preamp I'm building in a couple of weeks. I figured I'd test how much power my speakers are actually drawing, so I know how much power I'll need when I build a power amp in a couple of months (I'm currently using a Jolida integrated that claims 20 watts/channel).

I plugged the positive and negative leads into their appropriate jacks on the front of the multimeter. I then played a 1 kHz sine-wave test tone through the speakers and turned them up to a reasonably loud volume (a little higher than I normally listen). I disconnected one of the speakers and connected its speaker leads to the corresponding positive and negative multimeter leads, planning to measure the AC voltage and the current to get how many watts the amp was putting out (W = VI; wattage = voltage times current).
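The arithmetic described above can be sketched as follows. This is a minimal illustration, assuming the multimeter reports RMS values for a sine tone; the second function uses the V²/R form, which needs a known (nominal) load impedance rather than a current reading:

```python
# Sketch of the power arithmetic described in the post.
# Assumes v_rms and i_rms are RMS readings for a sine test tone.

def power_from_v_and_i(v_rms, i_rms):
    """Power into a (mostly resistive) load from both readings: P = V * I."""
    return v_rms * i_rms

def power_from_v_and_load(v_rms, load_ohms):
    """Same power from voltage alone and a known load impedance: P = V^2 / R."""
    return v_rms ** 2 / load_ohms
```

Note that a single meter can't read voltage and current at the same time, which is one practical argument for the V²/R form; a speaker's actual impedance varies with frequency, so treat the nominal figure (e.g. 8 ohms) as an approximation.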

I must have done something wrong, because I only measured about 0.12 A and about 0.15 V. I turned up the sound a bit, but even at very loud levels neither the voltage nor the current reading went above 1. This can't be right. It's not as though I'm using 100 dB/W/m horns; I'm using 88 dB/W/m monitors.
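As a sanity check on those numbers (a worked example, assuming a nominal 8-ohm speaker, which is an assumption not stated in the post):

```python
# Worked check of the readings above (0.15 V, 0.12 A) against the
# amp's rated 20 W/channel, assuming a nominal 8-ohm load.
import math

v_rms, i_rms = 0.15, 0.12
measured_power = v_rms * i_rms                    # = 0.018 W
rated_power, load_ohms = 20.0, 8.0
v_at_rated = math.sqrt(rated_power * load_ohms)   # ~12.6 V RMS for 20 W into 8 ohms
```

So the readings correspond to roughly 18 milliwatts, while full rated power would show about 12.6 V RMS across an 8-ohm load. Readings that low at loud listening levels suggest either that the measurement was taken without the speaker in circuit as a load, or that the meter's AC range is inaccurate at 1 kHz; many inexpensive meters are only calibrated for 50/60 Hz mains frequencies.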

So, how should I be going about this? Your help is much appreciated since I really need to know how much power I need!
 
Feb 1, 2005 at 4:06 AM Post #3 of 3
When measuring current, DO NOT measure directly between the power terminals of the amp. This will SHORT the amp (a 0-ohm load). Read: bad.
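A safer procedure is to leave the speaker connected as the load, set the meter to AC volts in parallel with the speaker terminals, and compute the power from V²/Z. A minimal sketch, where the 8-ohm default is an assumed nominal impedance, not a measured one:

```python
# Safe measurement sketch: the speaker stays connected as the load,
# the meter reads AC volts in PARALLEL across the speaker terminals,
# and power is computed rather than measured directly.

def output_power(v_rms_across_speaker, nominal_impedance_ohms=8.0):
    """Approximate output power; real speaker impedance varies with frequency."""
    return v_rms_across_speaker ** 2 / nominal_impedance_ohms
```

For example, a reading of about 4 V RMS across a nominal 8-ohm speaker corresponds to roughly 2 watts.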

And quite frankly, any decent amplifier of 5 watts or more should have more than enough power to drive speakers of reasonable sensitivity (85 dB/W/m or higher) for nearfield listening.
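The rough loudness arithmetic behind that claim: SPL at 1 metre is approximately the speaker's sensitivity plus 10·log10 of the power in watts (ignoring room gain and listening distance, so this is an idealized sketch):

```python
# Rough SPL estimate: sensitivity (dB/W/m) plus 10*log10(power in watts).
# Ignores room gain and listening distance.
import math

def spl_at_1m(sensitivity_db_w_m, power_watts):
    """Approximate SPL at 1 m for a given input power."""
    return sensitivity_db_w_m + 10 * math.log10(power_watts)
```

With the 88 dB/W/m monitors from the first post, 5 watts works out to roughly 95 dB at 1 metre, which is already quite loud for nearfield use.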
