accuracy is subjective
Apr 2, 2017 at 10:09 AM Post #61 of 69
  [1] If you want to insist that the recording is gospel, and play it back ruler flat, no EQ, then sure, that is a measurable form of accuracy compared to the recording.
[2] I just don't believe that is the true input signal.   
[3] A musician goes into a studio, lays down a track (the TRUE input signal).  
[3a] The engineer applies a host of EQ, compression, etc. (based on what factors, I don't know) and cuts a recording.   I play it back on my headphones.   What do I want to hear?  I want to hear what the musician played.
[4] I cannot objectively know what I am hearing is accurate unless either (i) I took independent measurements in the studio to compare to, or (ii) I know the entire chain of modifications that were made in the process of mixing.  
[5] As for what is on the recording, Jude and Jerry Harvey (and assign as much weight to this as you want, I'm not saying their word is gospel) both seem to agree that a bit of a bass hump sounds closer to being in the recording studio.
[6] All I can do is go to concerts.  I agree with you 100% that this is a terrible reference point, but this is all I got man.
[7] If you want to insist that the recording is gospel, then I disagree but respect your right to an opinion.


1. Of course the recording is gospel, what else are you trying to playback? What else do you expect an audio reproduction system to reproduce if not the recording?
2. The input signal is the signal you input into the piece/s of equipment you are measuring! Are you now trying to redefine what "input signal" means because I've refuted your redefinition of "accuracy"?
3. NO! ... That's just one of many "true input signals" to the studio's DAW/multi-track recorder, it is NOT the input signal to your DAC, Amp, headphones, etc.!
3a. Again, NO! That really is NOT what you want to hear, or if it is, then you're just about the only consumer on the planet who wants to hear that! That's like going to a restaurant, ordering a piece of cake and then saying: that doesn't taste anything like a completely different cake, I want the pile of ingredients that cake was made from, not a cake! Virtually never does a musician go into a studio and lay down a single track. What typically happens is that a musician goes into the studio and lays down several takes, then the next day another musician goes into the studio and lays down another bunch of takes, and this continues until all the musicians are recorded. All this is then edited together and then mixed; those edits and processing done by the engineer are ENTIRELY "based on the factors" of what the artist/s and producer want the recording to sound like! Furthermore, the artist/s typically do not want you to hear the original takes because they have usually been recorded in such a way as to facilitate the mixing process; they have not been recorded to sound real or perfect in isolation (without the rest of the mix).
4. You're still doing exactly the same thing you were before; comparing two completely different things, apples and oranges. This case is even more bizarre than the previous one because one of the things you are using for comparison you don't even know, in fact no one does! Therefore, there CANNOT be any determination of "accuracy"! Now, you seem to be agreeing with this statement but then contradict yourself by effectively saying: OK then, there is accuracy but it's subjective. I'm saying; there CANNOT be any determination of accuracy period! If you want to discuss subjective opinions about your perception of similarities or differences that's fine but that is a discussion about your perception of similarities and differences, not a discussion about accuracy.
5. I suspect you are taking that quote out of context. If you're not, then they are incompetent, because if their recordings all need a bit of "bass hump" then they should have added that bass hump to the recording during mastering, that's the whole point of the mastering process!!
6. No, the reference point for a determination of accuracy is the input signal. Your reference point can only serve as a basis for a subjective opinion of the differences between your perception of a live gig and a recording, not a reference point for accuracy.
7. The recording is your input signal, you have no other input signal. Therefore as a reference point for the determination of accuracy the recording MUST be gospel. For a subjective determination of whether you prefer the recording or the live gig, the recording might not be the gospel but that's a subjective opinion which has nothing to do with accuracy!
 
  [1] Practical measurement of accuracy is technically impossible. The important question here is: how do we get the original reference? The capturing system also has its own accuracy limitations.
[2] And what is the "true" sound in a studio/concert hall? Each seat has its own "true" sound.

 
1. Measuring accuracy is not technically impossible, although measuring it with infinite precision is.
2. Correct but then most commercial recordings are not designed to be a perfect documentary of the sound at a single point in space, they are designed to create a perception.
 
G
 
Apr 2, 2017 at 10:26 AM Post #62 of 69
 
1. Measuring accuracy is not technically impossible, although measuring it with infinite precision is.

 
1. Describe, please, how you will measure accuracy in detail, step by step.
 
2. Sorry, I don't understand what you mean by "measuring it with infinite precision is". Could you give more details?
 
Apr 2, 2017 at 12:02 PM Post #63 of 69
P.S.: Perhaps I should have asked them not only to make available an optional function to not add crosstalk, but also options to change (a) the crosstalk level and (b) the time at which the signal from the left virtual speakers arrives at the right headphone driver (and vice versa). Such options could then be used with recordings in which the mixing relied only on level differences between the L and R channels, so we could tweak the artificial pan-potted soundstage. But that wouldn't be accurate, and the mixing engineer might get angry with us. :)

 
  Using the first PRIR, central sounds seem to be in front of you, and they move properly as you turn your head. However, far-left and far-right sounds stay about where they were. That is, they sound about the same as they did without a PRIR, and they don't move as you turn your head. In other words, far-left sounds stay stuck to your left ear, and far-right sounds stay stuck to your right ear. It's possible to shift the far-left and far-right sounds towards the front by using the Realiser's mix block, which can add a bit of the left signal to the front speaker for the right ear, and a bit of the right signal to the front speaker for the left ear.
 
(...)
 

 
Erik, thank you very much for that mix block tip.
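For the curious, the mix-block trick described above (feeding a delayed, attenuated copy of each channel into the opposite ear) can be sketched in a few lines. This is only an illustration; the function name, the sample-shift delay, and the parameter choices are my own assumptions, not the Realiser's actual DSP:

```python
import numpy as np

def add_crosstalk(left, right, level_db, delay_samples):
    """Mix a delayed, attenuated copy of each channel into the other ear,
    roughly imitating inter-speaker leakage in a room. level_db and
    delay_samples are the two adjustable knobs suggested above
    (the names are mine, not Smyth's)."""
    g = 10 ** (level_db / 20)          # crosstalk level as linear gain

    def delayed(sig):
        # Shift the signal later by delay_samples, padding with silence
        return np.concatenate([np.zeros(delay_samples),
                               sig[:len(sig) - delay_samples]])

    return left + g * delayed(right), right + g * delayed(left)
```

With `level_db` set very low and `delay_samples` of a few tens of samples at 48 kHz, this approximates the inter-aural level and time differences of real speakers.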
 
Apr 3, 2017 at 5:52 AM Post #64 of 69
1. Describe, please, how you will measure accuracy in detail, step by step.
2. Sorry, I don't understand what you mean by "measuring it with infinite precision is". Could you give more details?

 
I'm not sure I understand what you mean?
 
1. Accuracy is the deviation from linear response, which can be measured (for example) by a null/difference test.
2. There's a difference between accuracy and precision, and infinite precision is unattainable. This may explain it better:

Not sure if this answers your question because I'm not sure I've understood your question.
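As a rough illustration of what a null/difference test does, here is a minimal sketch in Python. The simulated device output (a slight gain error plus a noise floor) is purely hypothetical; with real hardware you would capture the output through an ADC and time-align it first:

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
reference = np.sin(2 * np.pi * 1000 * t)   # the input signal is the reference

# Hypothetical device output: slight gain error plus a small noise floor
rng = np.random.default_rng(0)
output = 0.999 * reference + 1e-4 * rng.standard_normal(fs)

# Best-fit the gain, subtract, and see how deep the null is:
# whatever remains is the device's deviation from the input.
gain = np.dot(output, reference) / np.dot(reference, reference)
residual = output - gain * reference
null_depth_db = 10 * np.log10(np.mean(residual ** 2) / np.mean(reference ** 2))
```

The deeper the null (the more negative `null_depth_db`), the more accurately the device reproduced its input.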
 
G
 
Apr 3, 2017 at 6:12 AM Post #65 of 69
1. Your phrase "Measuring accuracy is not technically impossible, although measuring it with infinite precision is." I understood as "measurement with infinite precision is possible, so we can measure accuracy" (sorry for my English if I understood it wrong).
 
But "measurement with infinite precision" is technically impossible.
 
 
 
2. What did you mean by "linear response"? Is it the input/output amplitude response?
 
But that is only a local case of accuracy. We can look wider: the response across the whole frequency range.
 
 
3. You wrote "by a null/difference test".
 
 
3. a) What will you take as the reference value? An ideal response?
 
3. b) If you measure the response (or anything else), the results will contain the errors of the measurement tool.
 
Those errors cannot be separated from the measurement results.
 
So again, "measurement with infinite precision" is technically impossible in real life.
 
Apr 3, 2017 at 6:37 AM Post #66 of 69
  1. Your phrase "Measuring accuracy is not technically impossible, although measuring it with infinite precision is." I understood as "measurement with infinite precision is possible, so we can measure accuracy" (sorry for my English if I understood it wrong).
2. What did you mean by "linear response"? Is it the input/output amplitude response?
3. a) What will you take as the reference value? An ideal response? 3. b) If you measure the response (or anything else), the results will contain the errors of the measurement tool.

 
1. No, that's a language thing, I meant: Measuring with infinite precision is [impossible].
2. The input/output amplitudes at all frequencies.
3a. The input is the reference value. In the case of my response to iron-buddha, that would ultimately be the recording.
3b. This was answered by the graph I posted previously was it not? Accuracy and precision are not the same thing, we don't need infinite precision to measure accuracy.
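A quick numeric sketch of that distinction, using a hypothetical meter with a systematic bias (an accuracy error) and a random spread (finite precision). The bias and noise figures below are made up for illustration:

```python
import numpy as np

true_value = 1.000   # the reference ("input") value, say in volts
rng = np.random.default_rng(1)
# Hypothetical meter: +2 mV systematic bias, 0.5 mV random spread
readings = true_value + 0.002 + 0.0005 * rng.standard_normal(10_000)

accuracy_error = abs(readings.mean() - true_value)  # systematic deviation from the reference
precision = readings.std()                          # repeatability; never infinite in practice
```

The meter's finite precision (the 0.5 mV spread) doesn't stop us from measuring its accuracy error (the 2 mV bias) to a useful degree; it only limits how finely we can resolve it.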
 
G
 
Apr 3, 2017 at 8:01 AM Post #67 of 69
Ok, Gregorio. I understand you.
 
An array of "input/output amplitudes at all frequencies" measurements is enough for a first-approximation estimate of accuracy.
 
For pure digital processing the picture will be ideal within the noise floor and below the clipping level :)
 
I suppose we need to add a noise-floor estimate to the pool of measurements, and maybe something more.
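As a sketch of such a pool of measurements: feed bin-centred test tones through the device and read the input/output amplitude at each frequency from an FFT. The `device()` stand-in below is hypothetical; in practice it would be a loopback through the real hardware:

```python
import numpy as np

fs = 48000
n = 1 << 15                      # FFT length; bin spacing = fs / n
t = np.arange(n) / fs
bins = [64, 640, 6400]           # bin-centred tones avoid spectral leakage
freqs = [b * fs / n for b in bins]
x = sum(np.sin(2 * np.pi * f * t) for f in freqs)

def device(sig):
    # Hypothetical device under test: flat gain of 0.5 plus a small noise floor
    rng = np.random.default_rng(2)
    return 0.5 * sig + 1e-5 * rng.standard_normal(len(sig))

X = np.abs(np.fft.rfft(x))
Y = np.abs(np.fft.rfft(device(x)))
gain_db = [20 * np.log10(Y[b] / X[b]) for b in bins]
```

For the fake device above, every tone comes back about 6 dB down, so the response is flat; a real device would show frequency-dependent deviations, and the bins between the tones reveal the noise floor.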
 
 
After that, the questions are:
 
1. How do we interpret the results to improve the apparatus?
 
2. Where is the edge beyond which improvement is useless?
 
For me, as a developer, only the practical aspect of accuracy matters.
 
Apr 4, 2017 at 3:04 AM Post #68 of 69
  After that, the questions are:
1. How do we interpret the results to improve the apparatus?
2. Where is the edge beyond which improvement is useless?

 
Yes, for a developer they are ultimately the important questions.
 
1. If I knew the answer to that I'd be a wealthy developer myself!
If we're talking about improvement in terms of accuracy, then in some equipment areas there's nowhere to go. With DACs for example, consumers already have DACs which are as near perfect as makes no difference and are effectively free (built into a good smartphone), so how as a developer do you compete with that? Maybe more flexible functionality, rather than accuracy, is a potential solution, but that's only useful to relatively few and difficult to justify a price hike (relative to effectively free). Another way is marketing: to simply lie about accuracy, either/both in terms of what accuracy means and how much is needed. We're now effectively talking about improvement in terms of consumer perception, an improvement which requires the manipulation of consumer perception instead of an improvement in accuracy. This approach is, as far as I'm aware, one which pretty much all of today's audiophile DAC makers have to employ to some degree or other. Audiophile cable makers have it even worse: cables were to all intents and purposes perfected many decades ago and cost peanuts, but still we see many audiophile cable makers with expensive products and forums full of consumers who've had their perception "successfully" manipulated.
 
2. Theoretically that should be the limit of audibility plus a reasonable margin for error. In practice though this is not the case in the audiophile world, due to the manipulation of perception. For example, the typical audibility of jitter has been shown to be about 500 nano-secs, and a few listeners have demonstrated an ability to detect 200 nano-secs. Using specifically constructed signals, there's evidence to suggest 20 nano-secs could be audible. In terms of added noise, something around 1 nano-sec gets us well below audibility. Even fairly cheap units achieve half this level, which is a reasonable margin for error, but some in the audiophile world report "night and day" differences with femto clocks which output several hundred thousand times less, which of course is utter nonsense as far as accuracy is concerned. In terms of (manipulated or manipulatable) perception, though, there is no answer to your question; it's whatever marketing you can come up with which audiophiles will accept/believe!
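For a rough sense of those jitter numbers: for a full-scale sinusoid, the RMS error produced by random sample-timing jitter is approximately 2*pi*f*tj relative to the carrier. This back-of-envelope formula is my own simplification (it ignores the error's spectral distribution), not a figure from the posts above:

```python
import math

def jitter_error_dbc(f_signal_hz, jitter_rms_s):
    # Rough model: a timing error of tj seconds on a sinusoid of frequency f
    # produces a phase error of ~2*pi*f*tj radians, so the error level
    # relative to the carrier (dBc) is 20*log10(2*pi*f*tj).
    return 20 * math.log10(2 * math.pi * f_signal_hz * jitter_rms_s)
```

For a 10 kHz tone this puts 500 ns of jitter at roughly -30 dBc and 1 ns at roughly -84 dBc, broadly in line with the audibility figures quoted above.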
 
G
 
Apr 4, 2017 at 7:50 AM Post #69 of 69
   
Yes, for a developer they are ultimately the important questions.
1. If I knew the answer to that I'd be a wealthy developer myself! (...)
2. Theoretically that should be the limit of audibility plus a reasonable margin for error. (...)
 
G

 
1. You can see what you and your competitors do. When you develop, you know what can be achieved in practice. You collect opinions and research results. Synthesizing this information lets you feel out the edges of the figures, but it does not give exact numbers.
 
2. A limit should exist, but it is different for different people. Only a proper blind test can find an approximate border.
 
