Testing audiophile claims and myths
Oct 18, 2018 at 3:43 PM Post #9,736 of 17,589
Then consider myself an exception.

I am glad there are at least some exceptions to this rule. :)

As for Keith's theory of what might be causing the apparent treble boost... that's not even getting into dynamic / nonlinear effects, which won't really show up in an impulse response anyway, and might not show up in a typical THD result either.

For example, if I were mixing a track, and I wanted to add this effect to a sound on purpose, I would add a volume-gated distortion plugin and filter the output, and maybe the input. Or I might throw an expander on the input. Anyway, you could definitely get this subjective brightness and hide it from all normal measurements. So, as far as that goes, I totally agree with Keith that it is conceivable to have audible differences that are difficult or impossible to detect with standard measurements.
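To make the idea concrete, here is a toy sketch of that kind of level-gated distortion (my own construction, not any real plugin): soft clipping is applied only while the short-term envelope exceeds a threshold, so a low-level steady test tone passes through untouched and measures clean.

```python
import numpy as np

def gated_exciter(x, fs, threshold=0.5, drive=5.0):
    """Apply soft-clip distortion only while the envelope exceeds threshold."""
    win = max(1, int(0.005 * fs))                       # ~5 ms envelope follower
    env = np.convolve(np.abs(x), np.ones(win) / win, mode="same")
    soft = np.tanh(drive * x) / np.tanh(drive)          # odd harmonics = "edge"
    return np.where(env > threshold, soft, x)

fs = 48000
t = np.arange(fs) / fs
quiet = 0.1 * np.sin(2 * np.pi * 1000 * t)  # low-level THD test tone: passes clean
loud = 0.9 * np.sin(2 * np.pi * 1000 * t)   # musical peaks: get "brightened"
```

A standard low-level THD sweep through this effect would read essentially perfect, while program material at realistic peaks would pick up added harmonic edge.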
 
Oct 18, 2018 at 7:50 PM Post #9,737 of 17,589
We have no strong incentive to "prove" that our product is audibly better than one costing twice as much; we're quite content if you believe they sound the same (ours is half the price of theirs).
It doesn't benefit us in the least if you buy our DC-1 "because we proved it's better" instead of "because you believe they're both the same and ours is a much better deal".
We have no incentive to pay a lot of money to fund a study showing that ours is better; we've already "won by default".
And, for the folks who are already totally convinced that the other product is better, "sound unheard".... odds are they won't read the study, or won't believe it, anyway - so we gain nothing there.

Now, from their side of the fence, it would only benefit THEM if they could prove that a lot of people find their product clearly and significantly better.
If the study were to prove that nobody heard a difference at all - they "lose on price".
If the study were to prove that people heard a difference, but didn't express a clear preference - they lose on price.
If the study were to prove that people heard a difference, and a few people liked their product a little bit better - they still lose on price (at double the price, a few people finding it "a little bit better" would count as a "loss").
And, finally, if the study were to find that people preferred the sound of our DC-1, yet again, they lose.
Therefore, unless they honestly believe that ENOUGH people will find their product CLEARLY BETTER, they also have no incentive to fund a study.
There is simply no way that funding such a study is likely to make them enough profit to justify the cost.
(And they most certainly have no incentive to run a study, and disclose the results, unless they clearly "win".)
It is far more profitable for them to have at least some people assume that their product is better "because it's more expensive" or because they find the logic they use in their advertisements credible.

Therefore, there's very little chance that the study would actually benefit EITHER of our companies in terms of sales.

Nonsense.

If an(y) audio company's product can do what they claim it can do, then scientifically valid verification of that fact can only help that company and lend credibility to their product, increasing its perceived value, and thus its demand.

On the other hand, if the product is high priced snake oil, you're absolutely right about it not being in their interest to reveal that to potential customers.
 
Oct 18, 2018 at 8:31 PM Post #9,738 of 17,589
And, based on my somewhat extensive experience, while not everyone claims to hear a difference between Sabre DACs and other brands...
The majority of people who do claim to notice a difference tend to describe it the same way.
Those who dislike them say that "the Sabre DAC sounds grainy or etched compared to the other one".
Those who like them say that "the Sabre DAC sounds more detailed".
To me, these seem likely to be descriptions of the same phenomenon, differing in whether they are viewed as being positive or negative.

To me, based on my somewhat extensive experience, that sounds exactly like cognitive biases at play. What you've seen is justification to develop a research hypothesis to begin testing, not to reach any sort of conclusion yet unjustified by systematic, repeatable research.
 
Oct 18, 2018 at 8:59 PM Post #9,739 of 17,589
To me, based on my somewhat extensive experience, that sounds exactly like cognitive biases at play. What you've seen is justification to develop a research hypothesis to begin testing, not to reach any sort of conclusion yet unjustified by systematic, repeatable research.

This is how it works, apparently.

DACs can use filters that measure differently. People can listen to different DACs and make sighted, unverifiable decisions that one sounds different from another. Since DACs can measure differently, and subjective listening tests with no rudimentary validations implemented find preferences, we should correlate the different DAC measurements with these sighted evaluations. If cognitive biases are mentioned, you can be sure that a post will soon be made regarding how hearing perception is not fully understood and we can't possibly attempt to verify that a difference is actually being detected.
 
Oct 19, 2018 at 8:24 AM Post #9,740 of 17,589
First of all, as I stated earlier in the text, both DACs actually have very flat frequency response - according to spec.
I can vouch for the fact that the Emotiva DC-1 actually is flat within a small fraction of a dB (we build them, and we test a lot of them, they all pass that spec just fine).
Also, as far as I know, Wyred4Sound is a very "reputable" company... so I suspect that the DAC2 is also measurably as flat as they claim it is.
(I've also seen several reviews on various Wyred4Sound products, and they generally do meet their published specs.)
I wish I'd had an opportunity to confirm that on both - but I suspect that it would be confirmed that both really were quite flat.

You'll note that I specifically said that what I heard was "a difference that SUBJECTIVELY SOUNDED AS IF THERE WAS A BOOST OVER THAT RANGE OF FREQUENCIES".
I strongly suspect that, if you were to confirm the measurements, you would find that the frequency response on both is virtually identical, and the difference lies elsewhere.

And, in terms of grouping, I have found that many DACs that use various Sabre DAC chips sound this way to me...
While other brands of DAC chips sound more natural to me, and sound more similar to each other.

My theory, which is that it is due to differences in the impulse response of the filters, is based on two things.

First, there is plenty of precedent for similar effects in other areas that also involve perception.
For example, if you have used Photoshop, or any similar image editor... and, more specifically, used the very popular "Unsharp Mask" sharpening feature.
This feature is used to make images APPEAR PERCEPTUALLY TO BE SHARPER... and virtually everyone agrees that, when images are "processed" with it, they APPEAR to be much sharper.
This is quite similar to the way the "sharpness" feature on many TVs works... although some modern ones involve more complex processing.
In actual fact, what it does is to detect borders between different colors or brightness levels, and create artificial "halos" at the boundary between them.
So, for example, if a light object borders on a dark one, a light halo is added at the edge of the bright object, and a dark halo is added at the edge of the dark object.
This artificially boosts the local contrast between the edges where they touch.
The result is a sort of optical illusion that, to humans, makes it APPEAR as if the edges and details are sharper.
In fact, the actual sharpness or focus of the image remains unaffected; it only "appears" to have become sharper.
It makes sense that a similar effect might work for the "edges" of audible details as well... making them "stand out more sharply".
(This is similar to what I perceive when I turn up the range of frequencies between 5 and 7 kHz - details seem to be exaggerated.)
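The halo mechanism described above is easy to show in one dimension (the numbers here are made up for illustration): the classic unsharp-mask formula is result = original + amount × (original − blurred), which overshoots on each side of an edge, producing the dark and light "halos" that read as extra sharpness.

```python
import numpy as np

signal = np.array([0, 0, 0, 0, 0, 0.2, 0.8, 1, 1, 1, 1, 1], dtype=float)  # soft edge
kernel = np.array([0.25, 0.5, 0.25])          # small blur for the "unsharp" mask
blurred = np.convolve(signal, kernel, mode="same")
sharpened = signal + 1.5 * (signal - blurred)  # amount = 1.5

# dark halo just before the edge (dips below 0), light halo just after (overshoots 1)
print(sharpened[4], sharpened[7])
```

The flat regions away from the edge are left untouched; only the local contrast at the boundary is exaggerated, which is exactly the illusion being described.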

While you claim to be very experienced in signal processing, the way in which you characterize the examples you give makes me feel less than confident about your credentials.

The Unsharp Mask emphasizes high spatial frequencies compared to low spatial frequencies much like a treble knob for audio. This "only creates sharpening halos" only if the transfer function of the lens/camera/eye combination already recorded all spatial frequencies in equal proportion, in which case the USM filter was extraneous in the first place (e.g. note how the sharpness control rapidly fell out of favor on monitors with the transition from CRT to LCD...). It is also extraneous if the frequencies it attempts to boost were totally wiped out by the combination transfer function, in which case there wouldn't be halos but rather pretty much no perceived change. When applied correctly USM compensates for a drop in high frequency response of the lens/camera/eye combination much like an equalizer.

Second, I have noticed this difference very often specifically with Sabre DACs (ESS)... and many other people characterize many products using Sabre DACs as sounding "grainy or etched".
To me, this sounds very much like a description of the audible equivalent of how images sharpened using unsharp mask appear visually.

That does not give credence to the theory that sound can appear high-boosted without actually being high-boosted... because as noted above USM is actually analogous to totally un-mysterious equalization of audio frequencies...
 
Oct 19, 2018 at 2:13 PM Post #9,741 of 17,589
First of all, as I stated earlier in the text, both DACs actually have very flat frequency response - according to spec.

I'm not looking for the "why" of it yet. I'm skipping over that part. We'll get to the "why" of it later. I'm also not going to consider the reputation of the company. That is irrelevant to what we are doing here.

I'm looking for verification of the difference you heard. If there is a boost of at least 1.5dB between 5 and 7kHz, which was your estimation of the difference, that would be reflected in the response. (If this isn't an accurate description of the difference, perhaps you can elaborate on what you heard so we can understand better.) Distortion can be tested for as well. If the difference isn't reflected in the specs, either 1) the specs on one or both of the DACs are fudged/incomplete or 2) the particular DAC you compared was performing out of spec. I suppose there could be some magical thing that can be heard but not measured, but that would be something to consider much further down the road, after we've eliminated all the other possibilities. I have been swamped at the studio this week. I'll google specs and bring them back to this thread when I get a chance. Clearly audible should be easily measured.

I'm not keen to go out and buy two DACs to test this myself. I'm not made of money unfortunately! Do you have access to a way of transparently capturing the outputs of the DACs to a digital file? That would be the easiest way to distribute the files to all of us so we can see and hear exactly what we are talking about. Maybe even set up a null test to isolate the difference clearly. Or perhaps someone here with testing equipment will be able to verify specs on one of your DACs that is loaned for testing.

There's an order to what I'm doing here...

1) observation
2) verification
3) quantification / measurement
4) hypothesis
5) testing hypothesis

We are on step 2. We shouldn't jump ahead of ourselves. Let's think of a way to verify what you heard. It's clearly audible, so that shouldn't be difficult. Does that make sense?

By the way, I didn't see an answer to this question.... Which one of these DACs do you suspect is transparent, and which one do you think is colored? Have you compared either of them to other DACs and found them to sound the same? Obviously one has to be colored, because if both were transparent, they would both sound the same.
 
Oct 19, 2018 at 4:59 PM Post #9,742 of 17,589
You're right... to a point.

First off, if a company sells a comparatively expensive product, it only benefits them to prove that it is SIGNIFICANTLY better than products costing less.
(Proving that a product that costs a lot more is a tiny bit better, or that only a small percentage of customers find it better, can actually count against it.)

Second of all, you might be surprised how many people don't look at performance as an important feature at all.
(When you purchased your last car did you buy the one that got the best mileage?)
And, for that matter, if you REALLY believe that "all DACs sound the same", then you probably don't bother to look at the specs at all.
(If you REALLY believe that THD below 0.1% is inaudible, then who cares if your next DAC has a THD of 0.003% or 0.1%?)

The reality is that the majority of people who purchase audio equipment won't care what the outcome of any study is.
SOME already "know" that THD below 0.5% is inaudible, so they're not going to believe any results our study might find.
Others are "subjectivists", so they're equally certain that the specs aren't important at all.
Others will purchase the DAC that their favorite reviewer says sounds the best.
Others will buy the one that looks the fanciest, or that fits in their rack the best, or that matches their other equipment.
Many will audition it with the rest of their system... and, in essence, try to find one that does the best job of cancelling out other flaws.
(if their speakers are bright they'll look for a dull sounding DAC, and, if their speakers are a bit muddy, they'll look for a bright DAC "which has the best synergy with their system".)
And, of course, most non-audiophiles will simply end up buying the one that shows up first on their search on Amazon or Google that fits their general price range.

In fact, the marketing departments in most companies will all tell you that a nice full page ad, or a good review, are FAR more important to sales than actual performance.

I'll bet you spend a LOT more on canned food every year than you spend on audio equipment.
So, when did you perform a properly conducted ABX double blind test on canned peas?
And, have you actually READ a lot of reviews to determine the best-tasting TV dinners?
And, for that matter, have you ever actually measured your gas mileage to see if your car gets better mileage with Premium than with Regular?
(If not, then you may be buying Premium that's "just high-priced snake oil", or you may be seriously handicapping your car's performance by using inferior Regular.)

Almost all selling, especially when it comes to nonessentials like cars and audio equipment, is based almost entirely on emotion and bias.
Read ANY good book on "how to sell cars".
They will tell you that specs and features DO NOT sell cars.
The way to sell a car is to convince the customer to want it; then steer them away from anything that might prevent them from buying it.
That's why so many car commercials show a pretty girl sitting in that sports car... they're vaguely promising that, if you buy their car, pretty girls like that will want to ride with you.
And, notice how the obnoxious kids all suddenly behave like little angels when they jump into that new SUV they're pushing in that commercial.
It's all a not-very-veiled message that "if you buy this car, that could be YOU".
Then, ONCE THEY WANT THE CAR, you provide the information they may need to justify the decision they've already made.
And, as long as none of the specs are so bad that they "disqualify" the choice, they're going to buy it.
The specs aren't used to MAKE the choice; they're used to RATIONALIZE the choice after it's already been made.
Those impressive specs are there to convince you that you've already made a great choice... and to impress your friends.

Nonsense.

If an(y) audio company's product can do what they claim it can do, then scientifically valid verification of that fact can only help that company and lend credibility to their product, increasing its perceived value, and thus its demand.

On the other hand, if the product is high priced snake oil, you're absolutely right about it not being in their interest to reveal that to potential customers.
 
Oct 19, 2018 at 5:47 PM Post #9,743 of 17,589
First of all, as far as I can tell, you still haven't actually read what I wrote.
I said that, since the manufacturer claims to have a flat frequency response, and reviews seem to confirm this, I doubt that there will be an anomaly in the frequency response.
(It couldn't hurt to run a test to confirm this and, if we find it, then the test is effectively over.)

So, as is the case with anything that affects PERCEPTION, but you don't know the exact cause...
The first thing to do would be to confirm that it is there... using human perception as your test.
So, in this case, a double-blind test using a significant number of test subjects would be appropriate.
I'm also going to assume that we're going to SPECIFICALLY test my assertion that this difference is between products using Sabre DAC chips and other brands.

Therefore, a good starting point would be to secure several products that use: Sabre DACs, Analog Devices DACs, Burr Brown DACs, and Wolfson DACs.
Five samples of each, in a variety of price ranges, seems like a reasonable number.

We will then allow a group of fifty or so listeners to listen to all of them (fully blind).
We could use specially chosen musical content, or allow each listener to bring their own, or some combination of both... each methodology has advantages and disadvantages.

We can then:
1) Determine how many people can actually hear a difference between the products.
2) Ask them to CHARACTERIZE the difference they hear (if any).

If it turns out that the descriptions have no significant correlation, then I must be mistaken, and the study is over.
However, if it turns out that the majority are able to differentiate between the different brands, we will have proven that they sound different.
At that point we can move on to analyzing the specific differences they hear.

My hypothesis, based on my experience and observations, is that a statistically significant majority will describe the Sabre DACs as "highly detailed", or "bright", or "etched"...
And that a statistically significant majority will describe the other brands as sounding more similar to each other, and "more neutral" or "less detailed" than the Sabre DACs.

Obviously, if we FAIL to confirm my hypothesis, then we've finished.
However, if we confirm it, then we move on to Part 2 - attempting to correlate the perceived difference with something specific.

At this point we can either simply test my secondary hypothesis - which is that filter impulse response will correlate with the audible differences.
Or we can move directly to measuring that and a dozen other parameters (perhaps frequency ripple, phase ripple, THD, IMD, impulse response, and a few others).
(Obviously it's going to be easier to test for individual correlations one at a time.)

For a reasonably "representative study" I would suggest five each of DACs using four different brands of DAC chips (ESS Sabre, Burr Brown, Analog Devices, and Wolfson or AKM).
(I would suggest that the five of each brand span five different price ranges - just for inclusivity.)
I would say forty or fifty test subjects would be sufficient.
I would suggest a good selection of well-recorded music samples, at both CD and higher resolution, be provided...
I would ALSO encourage listeners to bring their own samples...
And we need some sort of mechanism to allow all DACs to be assigned a number and switched "blindly".
I would suggest that, for each listener, the numbers be randomly assigned to the DACs, but the listener then allowed to listen in any order.
I see no reason to limit the time listeners have, or to prevent them from repeating selections if they like.
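The blinding mechanism proposed above can be sketched in a few lines (a hypothetical illustration; the brand names and counts follow the protocol described, but the labels themselves are placeholders): each listener gets the 20 units (4 chip brands × 5 units each) behind a freshly shuffled set of presentation numbers.

```python
import random

DACS = [f"{brand}-{unit}"
        for brand in ("Sabre", "BurrBrown", "AnalogDevices", "Wolfson")
        for unit in range(1, 6)]

def blind_assignment(dacs, rng):
    """Map presentation number -> DAC, re-randomized for each listener."""
    order = list(dacs)
    rng.shuffle(order)
    return dict(enumerate(order, start=1))

# each listener sees the same 20 DACs behind different numbers
listener_a = blind_assignment(DACS, random.Random(1))
listener_b = blind_assignment(DACS, random.Random(2))
```

Because listeners only ever see the numbers, they can listen in any order, for any length of time, without the assignment being compromised.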

There is no way to do this sort of test accurately using recordings or captured samples.
The reason is that you have no way to capture the samples accurately enough to ensure that your recorded samples contain the information you need.
When measuring THD, if you want reasonable accuracy, it is commonly accepted that your test gear must have 5x to 10x LOWER THD than what you're measuring.
A similar standard is typically applied to other forms of measurements.
If you're comparing DAC filters, then the ADC you use to capture samples would have to be KNOWN to be far more accurate than any of the devices under test.
Since the filters used in commercial ADCs are of similar quality to those used in commercial DACs, they clearly do NOT meet this requirement.
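The 5x-10x headroom rule quoted above is easy to put in numbers. Assuming (a common simplification) that the analyzer's residual distortion adds to the DUT's in power, uncorrelated:

```python
import math

def measured_thd(dut_thd, analyzer_thd):
    """Combined reading when residuals add in power (uncorrelated)."""
    return math.sqrt(dut_thd ** 2 + analyzer_thd ** 2)

dut = 0.0010                                # a DUT with 0.0010% THD
equal_gear = measured_thd(dut, dut)         # analyzer no better than DUT
better_gear = measured_thd(dut, dut / 10)   # analyzer 10x cleaner
print(equal_gear, better_gear)
```

With equal gear the reading comes out roughly 41% high; with a 10x cleaner analyzer the error shrinks to about half a percent, which is why the rule of thumb exists.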

It would be easy to null or subtract the outputs, and so "extract" the differences.....
But that won't tell us how audible they are or are not when combined with actual musical content.

Likewise, you can analyze the samples you gather using a computer, and there is no "accuracy" issue when you do so.
However, if you hope to have humans listen to those samples, they must be presented on gear that is FAR more accurate than the gear you're trying to test.
(This is the ONLY way to ensure that any differences in the samples are presented accurately, and no additional differences are introduced, and no existing ones obscured.)

Note that all of these issues can be avoided quite nicely by simply doing the test in person.

Incidentally, just for the record, I would be very surprised if there's anything "magical" involved, or anything that cannot be measured.
However, I suspect that it will be something OTHER THAN frequency response, or steady state THD or IMD, that accounts for it.
(As I've said, I suspect it will turn out to be differences in filter impulse response, but there are plenty of other possibilities.)

And, since you asked, I no longer own the Wyred4Sound DAC (I sold it soon after the comparison I mentioned).
And, no, I have NEVER owned an ADC of sufficient accuracy to capture samples at the necessary level of accuracy.

Of course, if you just want to test the INDIVIDUAL case, and simply want to establish whether a perceived difference really exists, then a double-blind test with a sufficient number of subjects should be sufficient.

I'm not looking for the "why" of it yet. I'm skipping over that part. We'll get to the "why" of it later. I'm also not going to consider the reputation of the company. That is irrelevant to what we are doing here.

I'm looking for verification of the difference you heard. If there is a boost of at least 1.5dB between 5 and 7kHz, which was your estimation of the difference, that would be reflected in the response. (If this isn't an accurate description of the difference, perhaps you can elaborate on what you heard so we can understand better.) Distortion can be tested for as well. If the difference isn't reflected in the specs, either 1) the specs on one or both of the DACs are fudged/incomplete or 2) the particular DAC you compared was performing out of spec. I suppose there could be some magical thing that can be heard but not measured, but that would be something to consider much further down the road, after we've eliminated all the other possibilities. I have been swamped at the studio this week. I'll google specs and bring them back to this thread when I get a chance. Clearly audible should be easily measured.

I'm not keen to go out and buy two DACs to test this myself. I'm not made of money unfortunately! Do you have access to a way of transparently capturing the outputs of the DACs to a digital file? That would be the easiest way to distribute the files to all of us so we can see and hear exactly what we are talking about. Maybe even set up a null test to isolate the difference clearly. Or perhaps someone here with testing equipment will be able to verify specs on one of your DACs that is loaned for testing.

There's an order to what I'm doing here...

1) observation
2) verification
3) quantification / measurement
4) hypothesis
5) testing hypothesis

We are on step 2. We shouldn't jump ahead of ourselves. Let's think of a way to verify what you heard. It's clearly audible, so that shouldn't be difficult. Does that make sense?

By the way, I didn't see an answer to this question.... Which one of these DACs do you suspect is transparent, and which one do you think is colored? Have you compared either of them to other DACs and found them to sound the same? Obviously one has to be colored, because if both were transparent, they would both sound the same.
 
Oct 19, 2018 at 6:06 PM Post #9,744 of 17,589
The reason I don't read all of your posts is because you keep drifting into proving your preconceived point instead of answering my specific questions.

We're all busy people here. Let's focus and be concise and to the point.

We are trying to verify what you heard. How can we do that? Here are some suggested ways. Feel free to suggest more ways to verify it if you can think of more.

1) Checking the specs
2) Capturing samples of both for comparison using measurements and null test.
3) Sending two physical DACs out to a bunch of people and overseeing to make sure all the comparison tests are properly conducted.

I don't think any of us have the ability to do 3. We can do 1 and 2. I'll google up 1. You say it won't show the difference. OK. Then we will have to move on to 2. Can you help us capture samples that exhibit the clear difference? You have access to the two DACs. Do you have access to a good capture device? It's not a subtle difference and it's clearly audible, so it should be able to be captured. If it isn't able to be captured by a device capable of capturing transparently, how can it be clearly audible? (Remember when answering that question not to assume the conclusion by trying to explain why it might not be able to be captured. We haven't verified that there is a difference yet.)

I suppose we could try to get just one verification from someone in the group who is trusted to know how to conduct a proper ABX test and perhaps do measurements. Would you be willing to send the two DACs to someone to do their own listening test/measurements?

If this is purely a subjective perceptual response, then we don't need to go any further because a subjective response that you have probably doesn't mean that other people will have the same subjective perceptual response. The reason you applied the controls to your test (level matching, blind comparison, direct A/B switching, etc.) was to eliminate the possibility of subjective perceptual bias affecting the test. So that brings us back to the effectiveness of your controls, which I gave you the benefit of the doubt with in the beginning.

One last question... My Oppo HA-1 has a Sabre Reference ES9018 DAC in it. My Mac and iPods don't. I've done a direct controlled comparison of those myself and I didn't hear any difference. Should I have?
 
Oct 19, 2018 at 6:24 PM Post #9,745 of 17,589
Strictly speaking that is incorrect...

The original version of USM was done optically, by superimposing a negative image that was slightly blurred over the original, and optically summing them, so it literally applied a halo only at edges of high contrast... (At low contrast edges, while the falloff profile of the edge may be affected, it won't be boosted or cut enough to appear as an actual halo. Apparently, while changing the dropoff profile may be slightly visible, only an actual halo produces a distinct illusion of artificially added sharpness.) The blurring applied to the "mask image" is where the term "unsharp masking" comes from.

Modern digital versions produce a similar effect by signal processing, but virtually all of them apply the effect ONLY TO EDGES, after using some method to avoid applying it to "non-edges". Therefore, if you want a more correct analogy, they act more like a dynamic processing equalizer, which is applied unequally and whose effect varies with time and the specific characteristics of the signal it is applied to.

On a typical DAC...
- If you pass a continuous sine wave of fixed frequency through its filter, there will be no measurable effect.
- If you pass a variety of frequencies, like a frequency sweep, the filter will have a minor effect on the overall frequency response (outside of its intended range).
- But, if you apply a TRANSIENT signal, then the filter will introduce ringing, which will vary greatly depending on the specific filter parameters used.
(For example, if you play a five second burst of a sine wave, the central portion will be virtually unaffected, but ringing will be introduced that extends PAST THE ENDS of the signal.)

Therefore, if you look at the effect in relation to time, it will have no effect on steady state sine waves.
And will have an effect that is greatest at and near the edges of CHANGES in the signal.
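The transient behavior described above can be demonstrated directly (a toy model with assumed parameters, not a measurement of any real DAC): a linear-phase windowed-sinc lowpass leaves a steady tone essentially untouched, but rings before and after the edges of a tone burst.

```python
import numpy as np

fs = 48000
taps = 255
fc = 2000 / (fs / 2)                       # 2 kHz cutoff, relative to Nyquist
n = np.arange(taps) - (taps - 1) / 2
h = fc * np.sinc(fc * n) * np.blackman(taps)   # windowed-sinc lowpass
h /= h.sum()                                   # unity passband gain

t = np.arange(2048) / fs
tone = np.sin(2 * np.pi * 1000 * t)
burst = np.where((t > 0.01) & (t < 0.03), tone, 0.0)  # gated 1 kHz burst

filtered = np.convolve(burst, h, mode="same")

start = int(0.01 * fs)                     # input is exactly zero before here
pre_ringing = np.max(np.abs(filtered[start - 100:start]))
steady_error = np.max(np.abs(filtered[700:1000] - burst[700:1000]))
print(f"pre-ringing peak: {pre_ringing:.3f}, steady-state error: {steady_error:.4f}")
```

The output is clearly nonzero before the burst begins, even though the input there is exactly zero, while the steady central portion of the burst passes through nearly unchanged: the filter's effect is concentrated at the edges of CHANGES in the signal.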

Incidentally, in modern high quality cameras, there is rarely a significant "drop in high frequency response". Typically, in most high quality modern cameras I've used, the effect you're compensating for is "pixel uncertainty". (If a sharp boundary between black and white falls in the center of a camera pixel, that pixel will be recorded as grey. Applying a USM to that image does NOT restore the original resolution, which was never recorded. There's no way, after the fact, to determine whether that grey pixel "should have been half black and half white" rather than being a grey pixel. Rather, the "sharpening effect" creates an optical illusion of "enhanced sharpness" which "perceptually compensates" for the lack of real resolution. The picture is "made to appear sharper" - whether it was originally that way or not.)

While you claim to be very experienced in signal processing, the way in which you characterize the examples you give makes me feel less than confident about your credentials.

The Unsharp Mask emphasizes high spatial frequencies compared to low spatial frequencies much like a treble knob for audio. This "only creates sharpening halos" only if the transfer function of the lens/camera/eye combination already recorded all spatial frequencies in equal proportion, in which case the USM filter was extraneous in the first place (e.g. note how the sharpness control rapidly fell out of favor on monitors with the transition from CRT to LCD...). It is also extraneous if the frequencies it attempts to boost were totally wiped out by the combination transfer function, in which case there wouldn't be halos but rather pretty much no perceived change. When applied correctly USM compensates for a drop in high frequency response of the lens/camera/eye combination much like an equalizer.



That does not give credence to the theory that sound can appear high-boosted without actually being high-boosted... because as noted above USM is actually analogous to totally un-mysterious equalization of audio frequencies...
 
Oct 19, 2018 at 6:40 PM Post #9,746 of 17,589
In fact, the marketing departments in most companies will all tell you that a nice full page ad, or a good review, are FAR more important to sales than actual performance.

A fact often exploited by snake-oil salesmen. Audiophiles are, collectively, a pretty gullible bunch.

When it comes to high-end audio, snake-oil is the null hypothesis.

And no, if I can plug some high sensitivity, low impedance IEMs into a DAP, pause the music, set the gain to high, crank up the volume, and hear no noise, I don't care that the THD is 0.1% instead of 0.0003%. Sure, the difference is significant on paper, but the effect isn't of a meaningful magnitude. Research suggests it isn't audible. Why would I pay extra? If a company wants me to pay for improved performance, they must first convince me that the increased performance adds value. And if their claims conflict with prevalent scientific research, you'd better believe they need to have sound scientific research supporting those claims.
 
Last edited:
Oct 19, 2018 at 6:40 PM Post #9,747 of 17,589
OK, if you're not actually going to read the full answer, then here's a short one....

NO, #1 is unlikely to show anything interesting.
NO, I do NOT have access to an ADC of high enough quality that it won't potentially alter things like the amount of ringing usually observed in DAC filters.
(And, no, I no longer own the Wyred4Sound DAC.... I sold it some time ago.)

Therefore, I would say that #3 is the only way to actually perform the test in a valid fashion.
And, until and unless we do so, we're all just applying our preconceived notions about what we BELIEVE the results would turn out to be.
(Or, to spin it more positively, "discussing theories and performing thought experiments".)

I should also point out that Emotiva offers a 30 day return policy (the DC-1 has been discontinued, but its replacement will be along in a few months).
Likewise, I believe Wyred4Sound has a return policy (the last time I looked they had a small restocking fee, however they also now have a new model).
Therefore, anyone considering purchasing a DAC from either company, and willing to go to the effort, actually CAN test whether they hear a difference for themselves.

The reason I don't read all of your posts is because you keep drifting into proving your preconceived point instead of answering my specific questions.

We're all busy people here. Let's focus and be concise and to the point.

We are trying to verify what you heard. How can we do that? Here are some suggested ways. Feel free to suggest more ways to verify it if you can think of more.

1) Checking the specs
2) Capturing samples of both for comparison using measurements and null test.
3) Sending two physical DACs out to a bunch of people and overseeing to make sure all the comparison tests are properly conducted.

I don't think any of us have the ability to do 3. We can do 1 and 2. I'll google up 1. You say it won't show the difference. OK. Then we will have to move on to 2. Can you help us capture samples that exhibit the clear difference? You have access to the two DACs. Do you have access to a good capture device? It's not a subtle difference and it's clearly audible, so it should be able to be captured. If it isn't able to be captured by a device capable of capturing transparently, how can it be clearly audible? (Remember when answering that question not to assume the conclusion by trying to explain why it might not be able to be captured. We haven't verified that there is a difference yet.)
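For readers unfamiliar with option 2, a null test in its simplest form is: level-match the two captures, subtract one from the other, and measure what's left relative to the original signal. Here is a bare-bones sketch of that idea (the test tones and the simple RMS-based level matching are my illustrative choices, not a prescribed procedure):

```python
import math

def rms(x):
    """Root-mean-square level of a sample list."""
    return math.sqrt(sum(v * v for v in x) / len(x))

def null_test_db(capture_a, capture_b):
    """Level-match B to A, subtract, and report the residual in dB
    relative to A. A deep null (a very negative number) means the two
    captures are close to identical."""
    gain = rms(capture_a) / rms(capture_b)      # crude level matching
    residual = [a - gain * b for a, b in zip(capture_a, capture_b)]
    return 20 * math.log10(rms(residual) / rms(capture_a))

# Two 1 kHz test "captures" at 8 kHz sample rate: B is quieter and has
# a small added 3 kHz harmonic (simulated distortion).
a = [math.sin(2 * math.pi * 1000 * t / 8000) for t in range(8000)]
b = [0.9 * math.sin(2 * math.pi * 1000 * t / 8000)
     + 0.001 * math.sin(2 * math.pi * 3000 * t / 8000) for t in range(8000)]

print(round(null_test_db(a, b)))  # the level difference nulls out;
                                  # only the tiny harmonic remains
```

A real null test would also need sample-accurate time alignment (and possibly clock-drift correction) before subtracting, which is the hard part with two separate hardware captures.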

I suppose we could try to get just one verification from someone in the group who is trusted to know how to conduct a proper ABX test and perhaps do measurements. Would you be willing to send the two DACs to someone to do their own listening test/measurements?
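As a side note on what "a proper ABX test" buys you: the listener's score can be checked against chance with a one-sided binomial test. A small sketch of that calculation (the 14-of-16 example is mine, purely for illustration):

```python
from math import comb

def abx_p_value(correct, trials):
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers right out of `trials` ABX trials by pure guessing
    (chance = 50% per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A clearly audible difference should survive this easily:
print(abx_p_value(14, 16))   # 14/16 correct: p ~ 0.002, far below chance
```

A "clearly audible, night-and-day" difference should produce near-perfect scores, so even a modest number of trials yields a decisive p-value.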

If this is purely a subjective perceptual response, then we don't need to go any further, because a subjective response that you have probably doesn't mean that other people will have the same subjective perceptual response. The reason you applied the controls to your test (level matching, blind comparison, direct A/B switching, etc.) was to eliminate the possibility of subjective perceptual bias affecting the test. So that brings us back to the effectiveness of your controls, on which I gave you the benefit of the doubt in the beginning.
 
Oct 19, 2018 at 7:01 PM Post #9,748 of 17,589
NO, #1 is unlikely to show anything interesting.
NO, I do NOT have access to an ADC of high enough quality that it won't potentially alter things like the amount of ringing usually observed in DAC filters.
(And, no, I no longer own the Wyred4Sound DAC.... I sold it some time ago.)
Therefore, I would say that #3 is the only way to actually perform the test in a valid fashion.
And, until and unless we do so, we're all just applying our preconceived notions about what we BELIEVE the results would turn out to be.

Oh! I'm more than open to not assuming anything about an anecdotal unverified claim until it's verified first. I am just looking for an example of a difference between DACs that I can verify!

But if we're going to talk about preconceived notions... How did you determine that ringing is the cause of the clearly audible difference? Is there some way to measure that, or to prove that it is the cause and not something else? Couldn't it just be a difference in response or distortion, or a defective unit? How did you check to eliminate that possibility? We don't want to confuse association with causation.

I think you missed my question about the Sabre DAC in my Oppo HA-1... Assuming that ringing is audible, should that sound different than playing the same digital file through an iPod with an Apple branded or Wolfson DAC in it?
 
Last edited:
Oct 19, 2018 at 7:14 PM Post #9,749 of 17,589
The reason I don't read all of your posts is because you keep drifting into proving your preconceived point instead of answering my specific questions.

We're all busy people here. Let's focus and be concise and to the point.

We are trying to verify what you heard. How can we do that? Here are some suggested ways. Feel free to suggest more ways to verify it if you can think of more.

1) Checking the specs
2) Capturing samples of both for comparison using measurements and null test.
3) Sending two physical DACs out to a bunch of people and overseeing to make sure all the comparison tests are properly conducted.

I don't think any of us have the ability to do 3. We can do 1 and 2. I'll google up 1. You say it won't show the difference. OK. Then we will have to move on to 2. Can you help us capture samples that exhibit the clear difference? You have access to the two DACs. Do you have access to a good capture device? It's not a subtle difference and it's clearly audible, so it should be able to be captured. If it isn't able to be captured by a device capable of capturing transparently, how can it be clearly audible? (Remember when answering that question not to assume the conclusion by trying to explain why it might not be able to be captured. We haven't verified that there is a difference yet.)

I suppose we could try to get just one verification from someone in the group who is trusted to know how to conduct a proper ABX test and perhaps do measurements. Would you be willing to send the two DACs to someone to do their own listening test/measurements?

If this is purely a subjective perceptual response, then we don't need to go any further, because a subjective response that you have probably doesn't mean that other people will have the same subjective perceptual response. The reason you applied the controls to your test (level matching, blind comparison, direct A/B switching, etc.) was to eliminate the possibility of subjective perceptual bias affecting the test. So that brings us back to the effectiveness of your controls, on which I gave you the benefit of the doubt in the beginning.

One last question... My Oppo HA-1 has a Sabre Reference ES9018 DAC in it. My Mac and iPods don't. I've done a direct controlled comparison of those myself and I didn't hear any difference. Should I have?

You have clearly missed that @KeithEmo said that capturing the performance of the various DACs would require at least 5 times - and preferably 10 times - better specs in the ADC. Something similar was/is possible in the analogue world - but DACs and ADCs go largely hand in hand performance-wise; there is no such thing as an ADC 10 times better than the best DAC. Judging from the number of DACs (innumerable) versus ADCs for audio (comparatively VERY few), both home and pro, it is no surprise that some DACs may well exceed the performance of even the best ADCs.

Merging Horus/Hapi, one of the top ADCs/DACs today, is on its FOURTH revision - each time, an even better ADC and/or DAC chip (and the required ancillaries) is incorporated onto a new module board. The specs from the very first to the very latest version are so good that, according to your comments, the differences should not matter in the slightest.

One relatively inexpensive way to test two VERY similar DACs (in fact, two generations - the second being an improved version of the first) is the iFi Micro iDSD and iFi Micro iDSD Black Label. Same functionality, same size, except that the second has a plethora of improvements across the board - again, some in areas you dismiss as having long since surpassed anything a human could possibly hear. Around a 20% increase in price over the predecessor at the time of introduction. It would take one hell of a measuring setup to actually tell the two apart (for the reasons given above) - yet in listening tests, the two are clearly, discernibly different. It does not really matter whether the listening test is sighted, ABX, DBT, or whatever - the difference is big enough. The only thing the testing methodology changes is the amount of time required before the difference is understood. Among other reviews, there is also my take on the Black Label: [URL]https://www.head-fi.org/showcase/ifi-audio-micro-idsd.20201/reviews[/URL]
 
Last edited:
Oct 19, 2018 at 7:28 PM Post #9,750 of 17,589
You have clearly missed that @KeithEmo said that capturing the performance of the various DACs would require at least 5 times - and preferably 10 times - better specs in the ADC.

Why? I'm not trying to determine all of the differences, just the audible ones. If a capture device is capable of audible transparency, then it should be fine for reproducing audible differences.

There's no point measuring inaudible differences, and I'm sure they exist. I am trying to find a way to verify that two DACs sound different to human ears. If we're assuming there must be a difference because one is a Sabre chip and one isn't, I have a headphone amp with a high end Sabre chip and it sounds identical to an iPod, so it isn't that.
 
Last edited:
