Yeah, the resistance of the adapter adds to the load impedance, so for the amp it's actually an easier load: it has to deliver less current. Of course the headphone then only receives a fraction of the voltage the amp outputs, and that's the crux. With a 30 ohm resistor and a 30 ohm headphone, 1 V splits evenly: 0.5 V across each. But if the headphone's impedance rises at 100 Hz, which it usually does with dynamic headphones, say to 60 ohm, the split changes: the resistor now drops only 0.333 V and the headphone gets 0.667 V instead of 0.5 V, causing the peak in the FR mentioned above.
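The split above is just the standard voltage-divider formula. Here's a quick sketch with the example numbers from this thread (30 ohm series resistance, headphone impedance 30 ohm nominal rising to 60 ohm at the bass peak):

```python
def headphone_voltage(v_out, r_series, z_headphone):
    """Voltage divider: fraction of the amp's output that reaches the headphone."""
    return v_out * z_headphone / (r_series + z_headphone)

v = 1.0   # amp output, volts
r = 30.0  # series resistance (adapter + amp output impedance), ohms

print(headphone_voltage(v, r, 30.0))  # 0.5 V at the nominal impedance
print(headphone_voltage(v, r, 60.0))  # ~0.667 V at the impedance peak
```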
From the driver's point of view: when the diaphragm moves, the driver generates a voltage (back-EMF). If you short the driver (by driving it from a near-zero output impedance), that voltage pushes a current through the circuit which brakes the diaphragm, so it stops vibrating more quickly. For the headphone it doesn't matter whether the output impedance comes from resistance inside the amp, the cable, or resistors in an adapter.
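The braking effect can be put in numbers: the electrical damping contributed by the circuit scales as (Bl)^2 divided by the total series resistance, so every ohm you add in front of the driver weakens the brake. A small sketch with purely illustrative values (the Bl force factor and coil resistance below are not from this thread):

```python
def electrical_damping(bl, r_coil, r_source):
    """Damping coefficient (N*s/m) contributed by the electrical circuit.

    bl:       force factor of the driver in T*m (illustrative value)
    r_coil:   voice-coil resistance, ohms
    r_source: source/output impedance in series with the coil, ohms
    """
    return bl ** 2 / (r_coil + r_source)

bl = 3.0       # illustrative force factor
r_coil = 30.0  # illustrative voice-coil resistance

print(electrical_damping(bl, r_coil, 0.1))   # near-short: strong braking
print(electrical_damping(bl, r_coil, 75.0))  # extra series R: much weaker braking
```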
Yes, impedance matching is usually a bad idea with headphones and speakers, though there are exceptions. Normally you want the output impedance to be much smaller than the load impedance.
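To see why "much smaller" matters, you can compute the worst-case frequency-response deviation for a headphone whose impedance swings between two values. Using the hypothetical 30 to 60 ohm swing from the example above:

```python
import math

def deviation_db(r_out, z_min, z_max):
    """Worst-case level difference (dB) across the headphone's impedance swing
    for a given source/output impedance r_out."""
    v_min = z_min / (r_out + z_min)
    v_max = z_max / (r_out + z_max)
    return 20 * math.log10(v_max / v_min)

# Hypothetical 30-60 ohm impedance swing:
for r_out in (0.1, 1.0, 30.0):
    print(r_out, round(deviation_db(r_out, 30.0, 60.0), 2))
# 0.1 ohm gives ~0.01 dB, 1 ohm ~0.14 dB, 30 ohm ~2.5 dB of response error
```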
I guess the attenuator would ideally be 1/8" to 1/8". I'm not sure such in-line attenuators are available; it would be a nice beginner DIY project though.