The general consensus on using cheap/average receivers as headphone amps is that they're poor, because the headphone output is usually made by stepping the speaker output down through large (~200-800 ohm) series resistors, which gives an unfavorably high output impedance. There are also products sold that take the speaker output and give you a headphone jack directly. That approach comes with two problems:
1. The high power can blow your phones (or your ears) if you're not careful with the volume. Solution: turn the source down, or just be careful!
2. Pulling the headphone plug out of the jack can SHORT the speaker output (a partially inserted TRS plug bridges the contacts as it slides), possibly damaging the amp. Bad bad bad, and there's no way to avoid it apart from taping the plug in, and even then it's still a risk if the cord gets pinched/cut/shorted.
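To make the output-impedance complaint concrete, here's a quick voltage-divider sketch. The headphone impedance (32 ohms) and the two series-resistor values are my own illustrative assumptions, not measurements from any particular receiver:

```python
# Rough voltage-divider math for a series-resistor headphone output.
# The series resistor and the headphone impedance form a divider;
# the headphones only see the fraction Z / (R_series + Z) of the voltage,
# and the source impedance they see is roughly R_series itself.

def divider_gain(r_series, z_phones):
    """Fraction of the speaker-out voltage that reaches the headphones."""
    return z_phones / (r_series + z_phones)

Z = 32.0  # ohms; a common low-impedance headphone (assumed value)
for r in (680.0, 2.0):
    g = divider_gain(r, Z)
    print(f"{r:>5.0f} ohm series: {g:.3f}x voltage, ~{r:.0f} ohm output impedance")
```

With 680 ohms in series, 32 ohm phones get under 5% of the voltage and a source impedance twenty times their own; with 2 ohms they get ~94% and a comfortably low source impedance.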
So the real issue is #2. Now, a 2 ohm output impedance is still favorable, right? By the common "1/8th rule" (keep the output impedance below one eighth of the headphone impedance) it would perform well with anything from 16 ohms up, far better than, say, a 680 ohm output. Replacing those series resistors with 20 W, 2 ohm resistors should protect the amplifier if the jack is pulled or shorted, right? Would a single large wirewound resistor preserve audio quality here, or would, for example, five 10 ohm 5 W metal film resistors in parallel be okay? At any reasonable listening volume the resistors would see nowhere near 20 W, and even a short would only load them for a second or two.
Does it make sense to do this? I can't find anything on it with a search; give me some input and I'll edit the first post into more of a guide.
Edited by k00zk0 - 9/21/13 at 9:14pm