The meter does not have to be accurate in absolute terms, since only the ratio of voltages matters here. True RMS capability is not important either, as you will only be testing with sine waves anyway. For example, I have a cheap, old multimeter without true RMS support, and it works decently (maybe ~1% variation) with pure sine waves from about 30 Hz to 2000 Hz; even from 20 Hz to 20 kHz its frequency response error stays within 1 dB. Your mileage may vary, though, and I think cheap true-RMS multimeters are the ones that tend to have a limited usable frequency range. In any case, as long as you can measure ratios of AC voltages accurately enough, it should be fine.
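To make the ratio point concrete, here is a minimal Python sketch of the usual two-reading method (the function name and example values are just illustrative): read the output voltage unloaded, then with a known load resistor, and only the ratio of the two readings enters the result, so the meter's absolute calibration cancels out.

```python
# Minimal sketch of the two-reading output impedance calculation
# (function name and example values are just illustrative).
# v_open:   AC reading with no load attached
# v_loaded: AC reading with a known resistor r_load across the output
# Only the ratio v_open / v_loaded enters the result, so the meter's
# absolute calibration error cancels out.

def output_impedance(v_open: float, v_loaded: float, r_load: float) -> float:
    """Z_out = R_load * (V_open / V_loaded - 1)."""
    return r_load * (v_open / v_loaded - 1.0)

# Example: 1.000 V unloaded, 0.970 V into a 33 ohm load
# -> 33 * (1.000 / 0.970 - 1) ~ 1.02 ohms
print(output_impedance(1.000, 0.970, 33.0))
```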
You can also use a sound card or external A/D converter; these usually have good linearity nowadays, they are just not calibrated for accurate absolute levels and have a limited input voltage range. They are more likely to interfere with the measured amplifier's performance, though, for example by presenting a low input impedance of only a few kiloohms (which should be taken into account when the amplifier output impedance is very high, like 100 ohms or more), or by creating ground loops.
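If the sound card stays connected for both readings, its input impedance loads both of them, and the naive two-reading formula then returns the true output impedance in parallel with the card's input impedance. A sketch of the correction, with an assumed 10 kiloohm input impedance and illustrative readings:

```python
# Sketch of correcting for the sound card's input impedance
# (r_in and the readings below are assumed values). With the card
# connected for both readings, its input impedance loads both of them,
# so the naive formula returns Z_out in parallel with r_in; the
# correction "un-parallels" it to recover the true Z_out.

def apparent_impedance(v_open: float, v_loaded: float, r_load: float) -> float:
    # naive two-reading result; equals Z_out || r_in here
    return r_load * (v_open / v_loaded - 1.0)

def unparallel(z_apparent: float, r_in: float) -> float:
    # solve z_apparent = Z * r_in / (Z + r_in) for Z
    return z_apparent * r_in / (r_in - z_apparent)

r_in = 10_000.0  # assumed sound card input impedance, ohms
z_app = apparent_impedance(1.000, 0.769, 330.0)  # ~ 99.1 ohms apparent
print(unparallel(z_app, r_in))                   # ~ 100.1 ohms actual
```

As the example shows, with a 100 ohm output impedance the error from ignoring the card's loading is around 1%, and it grows as the output impedance rises toward the card's input impedance.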
For headphone amplifiers, the simpler method suggested by SilverEars can actually be OK, since there is no need to measure sub-1-ohm output impedances very accurately. For speaker amplifiers, accuracy matters more, because a damping factor in the hundreds or higher with 4-8 ohm loads is a marketable feature. But if you do care about accuracy in the milliohm range, you also need to be careful about cable and connector resistance, as the sketch below illustrates.
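A rough illustration with hypothetical numbers: taking damping factor as the nominal load impedance divided by the output impedance, even 20 milliohms of unaccounted cable and connector resistance can cut an apparent damping factor of 800 down to under 300.

```python
# Rough illustration with hypothetical numbers: residual cable and
# connector resistance adds directly to the apparent output impedance
# and deflates the computed damping factor.

z_out = 0.010    # true amplifier output impedance, ohms (10 milliohms)
r_cable = 0.020  # unaccounted cable + connector resistance, ohms
z_load = 8.0     # nominal speaker load, ohms

print(z_load / z_out)              # true damping factor: 800
print(z_load / (z_out + r_cable))  # apparent: ~267
```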