Causes of Soundstage, Imaging, etc.

Discussion in 'Sound Science' started by tzjin, Jul 27, 2012.
  1. Deathwish238
    Oh interesting, that is indeed very different and also very useful. So then it would be interesting to see the FR of headphones that appear to have a good soundstage and compare them to the FR of sound coming from left/right.
    Well, your enclosure affects your sound a great deal. With an open headphone, the enclosure becomes largely your surroundings. I can't say I know exactly why, but it would seem to me that if HRTF holds, then perhaps open headphones have an easier time achieving the FR of a sound coming from your left/right.
    If you replicate the sound signature of audio coming from afar, your brain wouldn't know the difference even if the source was actually quite close.
    So if soundstage is affected by FR, it would follow that an EQ should affect soundstage...
  2. ProtegeManiac Contributor
    Aside from being open-back or sealed, there is also the location of the headphone drivers relative to your ears, which is why enclosure/chassis design, including the earpads, can affect the sound as much as the drivers. 
  3. m2man
    Your brain figures out where a sound comes from by using timing. A sound from your left reaches your left ear a fraction of a millisecond before the right ear. Using that hint, I think that jitter (timing error) makes the most difference in how big your soundstage is. Certainly changing sources (DACs) makes the most difference to me. I can't say I understand how an amp can have a good soundstage, but clearly being accurate will make a difference. Tubes can apparently add a little echo, which will give you some spatial cues.

    The big difference with a closed headphone is that there are sound reflections, so you hear the sound wave twice, which presumably throws off the timing cues your brain uses to pinpoint the exact location.
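    The timing cue described above is the interaural time difference (ITD). A minimal sketch of its size, under a simple straight-line-path model (the 0.18 m ear spacing and 343 m/s speed of sound are assumed round numbers, not measured data):

    ```python
    import math

    SPEED_OF_SOUND = 343.0  # m/s, assumed value at ~20 degrees C
    HEAD_WIDTH = 0.18       # m, assumed ear-to-ear spacing

    def interaural_time_difference(angle_deg):
        """Straight-line ITD model: extra path length = d * sin(angle)."""
        angle = math.radians(angle_deg)
        return HEAD_WIDTH * math.sin(angle) / SPEED_OF_SOUND  # seconds

    itd = interaural_time_difference(90)  # sound directly to one side
    print(f"max ITD: {itd * 1e6:.0f} microseconds")      # ~525 us
    print(f"in samples at 44.1 kHz: {itd * 44100:.1f}")  # ~23 samples
    ```

    So the cue is on the order of hundreds of microseconds, roughly 23 samples at CD rates, rather than "a zillionth of a second".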
  4. bigshot
    Most of the imaging comes from miking techniques. If the mike is a few feet back, you can hear a bit of room acoustic, and things seem further away. If the instrument is miked close and dry it sounds close up. Take those acoustic cues and spread them left/right between the two channels and you've got soundstage. You can also synthesize room acoustics with digital reverbs.

    Jitter is inaudible in the amount it occurs in even the cheapest electronics. Sound doesn't bounce around in closed headphones causing reflections. Neither of those things affect soundstage.
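    The jitter claim above can be put into rough numbers with a worst-case bound: for a full-scale sine, a sampling-instant error of t seconds produces an amplitude error of at most 2*pi*f*t (the maximum slope of the sine). A sketch, assuming a 1 ns jitter figure, which is already generous compared with typical modern DAC clocks:

    ```python
    import math

    def worst_case_jitter_error_db(freq_hz, jitter_s):
        """Peak error of a full-scale sine sampled with a timing error of
        jitter_s, relative to full scale: error <= 2*pi*f*t (max slope)."""
        return 20 * math.log10(2 * math.pi * freq_hz * jitter_s)

    # Assumed figures: 1 ns of jitter on a 20 kHz (worst-case audible) tone.
    print(f"{worst_case_jitter_error_db(20_000, 1e-9):.1f} dB")  # about -78 dB
    ```

    Even in this deliberately pessimistic case the error sits nearly 80 dB below the signal, which is consistent with the point that jitter at real-world levels is inaudible.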
  5. MrGreen
    Treble, delay and reverberations cause soundstage. It's that simple.
    If you don't understand it exactly, look up binocular vs monocular vision.

    It's a pretty similar concept.
  6. tzjin
    I understand the delay, but could you please elaborate on the other two?
  7. MrGreen

    Reverberation usually tells us the size of the room.

    Our ears accentuate treble, and we use this to stereo-position more accurately, which creates soundstage. Treble has a shorter wavelength, meaning it rarely reaches both ears in the same manner (differing in colouration, volume, or whether it arrives at all), whereas bass, with its longer wavelength, pretty much doesn't change between ears.
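    Two of the cues named in this exchange, delay and level, are exactly what a crude pan manipulates. A toy sketch (the delay and attenuation values here are arbitrary illustrations, not calibrated HRTF data):

    ```python
    import math

    RATE = 44100  # assumed sample rate

    def pan_left(mono, delay_samples=20, level_db=-6.0):
        """Crude left-pan: delay and attenuate the right channel,
        mimicking interaural time and level differences."""
        gain = 10 ** (level_db / 20)
        left = list(mono) + [0.0] * delay_samples  # pad to equal length
        right = [0.0] * delay_samples + [s * gain for s in mono]
        return left, right

    # 1 kHz test tone, 100 ms
    tone = [math.sin(2 * math.pi * 1000 * n / RATE) for n in range(RATE // 10)]
    left, right = pan_left(tone)
    ```

    Played back, the earlier and louder left channel pulls the image left; a real binaural renderer would also apply the frequency-dependent (treble-heavy) filtering discussed above.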
  8. Lorspeaker
    Check out the Meier StageDAC... it's got some switches that tweak the soundstage. 
  9. bigshot
    Ask the right questions...

    How far does sound have to travel before it is audibly degraded?

    Is the worst-case scenario of an attenuation of 0.33 dB per meter audible?

    How much does this apply to a typical living room with a relatively small space and fairly consistent humidity levels?

    Wouldn't normal room equalization correct for this completely?
