castleofargh
Sound Science Forum Moderator
- Joined
- Jul 2, 2011
- Posts
- 11,029
- Likes
- 6,913
You're talking to a guy who's been manipulating sounds to match video scenes for a living. He doesn't need simplifications or to have you explain M/S (seriously, how badly did you misread his posts?). Every simplification you made is exactly what got you in trouble from the get-go. Because this has been a discussion about very specific variables and techniques between people who, at the very least, learned the basics about them (counting myself in), things got elevated compared to the usual "let's talk an ignorant guy out of something false by simplifying reality until we hopefully get down to his level" kind of discussion. Because of that, more accuracy is expected, not less.
This new road you're taking about M/S looks like self-harm to me. I cannot imagine this having any potential for a happy ending.
No matter what, crossfeed is not an HRTF or speaker simulator. Some variables are even more wrong/added/false/missing than with a simulation attempt. It's a fact, and everybody here accepts it. When we discuss facts in this forum, we can never tell people that they don't feel or prefer what they feel and prefer. We argue that they misunderstood the cause of it. This applies here too.
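To make the contrast concrete, here is a minimal sketch of the kind of processing a basic crossfeed does: each channel is fed into the other, attenuated, delayed by a rough interaural time difference, and low-passed to mimic head shadowing. All parameter values are illustrative assumptions, not taken from any specific product, and a real HRTF simulator would instead convolve with measured, direction-dependent impulse responses carrying pinna and torso cues that this sketch entirely omits.

```python
import math

def simple_crossfeed(left, right, sr=44100, gain_db=-6.0, delay_us=300, cutoff_hz=700):
    """Minimal Bauer-style crossfeed sketch (illustrative parameters only)."""
    gain = 10 ** (gain_db / 20)
    delay = max(1, round(sr * delay_us / 1e6))   # crossfeed delay in samples
    a = math.exp(-2 * math.pi * cutoff_hz / sr)  # one-pole low-pass coefficient

    def shadow(x):
        # crude head-shadow model: one-pole low-pass, then gain and delay
        y, acc = [], 0.0
        for s in x:
            acc = (1 - a) * s + a * acc
            y.append(gain * acc)
        return [0.0] * delay + y[:-delay]

    # direct signal plus the shadowed copy of the opposite channel
    return ([l + c for l, c in zip(left, shadow(right))],
            [r + c for r, c in zip(right, shadow(left))])
```

That's the whole trick: three fixed, direction-independent operations. It narrows the stage and removes the "hard-panned into the ear" effect, but it adds comb-filter-like coloration and never supplies the spectral cues a brain would need to externalize the sound, which is why it's fair to say some variables end up more wrong than in a simulation attempt.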
I got some interesting results using head tracking, be it the A16 or the Waves 3D crap with the tracker I got on Kickstarter years ago (using some generic HRTF based on the size of the head, so most HRTF cues were still chaos for me, but not as much as fixed dummy-head chaos). I kept a stable panning over time because I moved enough to re-calibrate into the effect (like turning crossfeed on and off often will immediately make me feel 60° when it's ON). But if I rest my head on something and pay attention not to move, I still end up progressively spreading instruments to the sides and killing all distance for the center. Even with my custom impulses on the A16. Change is what keeps me in the dream.
The other extreme I talked about was spending months using a laptop placed on the side, with another keyboard and a bigger screen in front of me. I would often listen to YouTube videos directly through the integrated tweeters of the computer on the side. After months of doing that occasionally, I started feeling like the sound was coming from the screen. It never unbalanced anything else: not headphones, not daily life, and not my actual speakers in the other room. Another sign that my eyes do most of the listening and that my brain knows what it's doing when it's messing with my senses.
I don't claim to be the common average guy; I usually am not, and generic 3D solutions have always been very bad for me (that includes binaural recordings made with a dummy head).
From the Realiser thread I learned that I clearly prioritize vision to make sense of sounds even more than the average guy does. That's probably why I noticed early on how much I could fool myself with eye candy, and why I got interested in controlled tests.
I do get 60° initially; I imagine my brain is already convinced it's a headphone and that a given song must have that instrument all the way to the side, and over time it progressively compensates until I get there. It's the only explanation I can think of, but maybe it's something else? For me the panning or placement of the instruments doesn't change, so your experience of 180° panning on crossfeed after a while is interesting.