When trust disappears, reviewers are simply “noise”
Nov 3, 2017 at 5:56 PM Thread Starter Post #1 of 198
Disclaimer

This is an opinion piece – purely my opinion. Feel free to discuss, to agree, to disagree. But please keep it polite, and keep the emotion to a reasonable level. I am calling out the system here – not necessarily the people.

Intro

Warning – this is a “rant”. It's been building for a while, and I can't hold it back any more. It is my opinion that the signal-to-noise ratio (SNR) among a small, regular bunch of reviewers has got so bad lately that something has to be said, or we may as well just admit that Head-Fi is purely an advertising site for the poorly informed.

Brooko – calm down. Surely you're one of these people too. And are you saying all reviewers are like this?

No, I'm not saying that, although I will state up front that I am not perfect (none of us are) – reviewing, after all, is at its heart subjective.

So let's go back a bit to the heart of the matter.

A little history of my journey

When I first joined Head-Fi almost 7 years ago, I did it for the audio discussion and because I wanted to learn about what was out there, so I could make informed decisions about my audio chain. I was very green, very naive, and pretty much used people's post counts as a measure of who I could trust. I would also look at multiple reviews and look for consensus.

Back then, there were few reviewers getting review samples, and few tours. Companies like FiiO were just getting started. The explosion of gear out of China into the west hadn't really started. I started reviewing my own gear and becoming comfortable with the community I was in. Soon after that I was approached by a company to see if I wanted to review some of their cheaper IEMs. Great – a good chance to hear more gear, and it meant I didn't have to pay (a growing family meant I couldn't afford to).

First reality check – I reviewed something and was talking about the mid-range and where vocals sat, and I got pulled up by someone who basically told me I was spouting a bunch of nonsense. We PM'd – he told me where I was going wrong and where I had knowledge gaps, and asked how I could review gear if I didn't understand audio and what I was writing. He was right. I look back on some of my earlier reviews and I cringe. So I started changing my style, asking questions and learning. Along the way I started recognising many of my own personal biases and noting them in my reviews. I stopped trusting solely what I thought I was hearing, and started using measurements to check myself and to help me explain.

I also recognised that what I was writing was potentially influencing people's buying decisions. This was slammed home to me when someone made a purchase based on one of my early reviews, and then PM'd me to state that I shouldn't be reviewing, that my review was inaccurate, and that it was their worst buy ever. That PM ended with the accusation that I was shilling. I'll never forget that. From that time my attitude to reviewing changed. I started writing what I thought a buyer would want to know – or at least what would be important to me if I was buying. Strangely enough, from that point, I started getting people following me (or following my reviews).

A matter of trust

As a reviewer, the greatest words you can hear (usually via PM) are pretty much the following:
  • “I trust your opinion”
  • “You tell it like it is”
  • “I trust you – what do you recommend?”
It's the single biggest compliment a reviewer can get. And for the regular reviewers out there: if you aren't getting that kind of feedback regularly in your PMs, with people asking for your opinion, then it's time to take a look at yourself (IMO, of course).

Trust is huge. I read a lot of reviews, but I gloss over (or miss entirely) the reviewers I no longer have any trust in. And why don't I trust them? Because they are inaccurate. If I can't trust what they are saying, then there is no point reading what they say.

So the system should weed out those who are inaccurate, right? Those who don't earn trust should be read less, and the ones who do should rise above them. But that is where the system is broken. What do you do when a group of reviewers are all saying the same thing, and they are all getting a lot of exposure? They must be right if they all concur, and if they all keep making the front page, right? So let's explore that thought a little further. Let's talk about SNR (the signal-to-noise ratio).

The problem

This is a generalisation – not how I do it – but a summary of how a reviewer can feel, and where the pitfalls are.

I am a reviewer. I want to be able to write about my experiences with gear, but I can't afford it. So I start writing about the gear I can, I start joining tours, and I start soliciting samples. However, if I write anything negative in the reviews, the manufacturers may drop me, and the likelihood is that Head-Fi won't feature the reviews. If I can't get the samples and I can't get the exposure, then my budding “career” is over.

So how do I solve it? When I review, I mainly talk about the good points, I “gloss over” the bad ones, and I rate a lot more leniently than I should. I also chuck in a lot of subjective, “glowy” descriptions, and inadvertently I make a lot of subjective claims about the gear regardless of whether they are true, because I have to fill the review with something, right? In short, I have written the review for the manufacturer, and to meet Head-Fi's standards for making the front page (good pictures, very positive, high score).

But what have I sacrificed? The review is now not accurate (the noise has got to a point where it masks the true signal), and my description is partly a work of fiction.

But hang on, Brooko – that's BS. No-one could get away with this. X is also a reviewer and I trust his/her opinion; surely you're talking about a small subset? No – I'm talking about a mindset, and it's getting worse. If we don't stop it now, it will continue to grow. Let's look at some examples.

Some examples

This is not meant to name and shame – just to bring it to people's attention.

The ZhiYin QT5 saga
What happened: a group of regular reviewers had the IEM, it was being hyped up massively, and my interest was piqued when someone claimed it was better than the Fidue A91 Sirius (both 5-driver hybrids). Luckily a fellow Kiwi had a pair, so I arranged to swap it for my U6 for a week, and I reviewed it. It is easily the worst IEM I have ever heard. At the time, all the reviews and comments were describing a 4-star IEM. Afterwards I got comments from a variety of people about how bad it was – but no-one was willing to speak up.

The Kinera H3

This isn't a really bad IEM. It has a lot of potential, in fact. I'm pretty sure a couple of reviews have already made the front page. Mine hasn't so far – and likely won't. Why? Because although I believe the review is accurate, it is not hugely positive. The IEM sounds really good if you EQ it extensively, but it has a hard drop in the fundamental and mid-range, an early and sharp rise at 1-2 kHz, and far too much lower treble. There have been 13 reviews so far: 5 are 5-star, 6 are 4-star, 1 is 3.5-star, and mine is 2.5-star. So mine is the outlier, right? Almost all of the other reviews were of free samples (mine was too), and it is my sincere belief that most of the reviews and the scores were written with a natural bias toward the manufacturer. Why does this one really resonate with me? Penon were the ones who approached me about the Kinera H3. They gave me my choice of IEMs to review and I chose the H3 based on the reviews and feedback. If I were an actual buyer I'd be thoroughly peeved by now. Interestingly enough, I've already had feedback from a couple of buyers confirming some of the issues I listed in the review.
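
Just to put that score spread in perspective, here's a quick bit of arithmetic on the ratings listed above (nothing beyond the counts I quoted):

```python
# Back-of-the-envelope maths on the 13 Kinera H3 review scores quoted above:
# five 5-star, six 4-star, one 3.5-star, and my 2.5-star.
scores = [5.0] * 5 + [4.0] * 6 + [3.5, 2.5]

average_all = sum(scores) / len(scores)       # ~4.23 stars
average_without_mine = sum(scores[:-1]) / 12  # ~4.38 stars

print(f"Average of all 13 reviews: {average_all:.2f}")
print(f"Average excluding my 2.5:  {average_without_mine:.2f}")
```

One dissenting score barely moves the headline average, which is exactly how a lopsided consensus keeps looking like a safe buy.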

Other instances

Just recently I got into a debate when someone on an external review site claimed that the balanced output of a particular DAP with a particular IEM changed the frequency response. It doesn't (and I can prove it – I have the same gear). The response was that he relies on his ears and doesn't believe in measurements. ***. It's not a competition – it's about accuracy.

And the latest one was a debate about an X7ii review. The claim: with the AM3A amp, it is U-shaped. I refuted it, pointed to the measurements, and suggested the need for accuracy. Once again I got back the usual “well, that's what I heard”. The issue: he's listening with a U-shaped IEM. I pointed out that that shouldn't make any difference when describing the DAP. He next brought up Fletcher-Munson (rolls eyes) and then claimed that the output impedance of the DAP was influencing things. I pointed out that his IEM has a flat impedance curve and is a dynamic driver (DD), so that would have no influence either. The point I'm trying to make is that many of the “newer crop” of reviewers are either making things up, or are so invested in their own opinions that they simply don't know how to review with a shred of objectivity.
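
For anyone wondering why the flat impedance curve matters: the source's output impedance and the IEM's impedance simply form a voltage divider. Here's a minimal sketch of that interaction (the impedance figures are made-up, illustrative values, not measurements of the actual DAP or IEM in question):

```python
import math

def divider_loss_db(z_load_ohms: float, z_out_ohms: float) -> float:
    """Voltage-divider loss at one frequency: 20*log10(Z_load / (Z_load + Z_out))."""
    return 20 * math.log10(z_load_ohms / (z_load_ohms + z_out_ohms))

SOURCE_Z = 1.2  # assumed DAP output impedance in ohms (hypothetical value)

# A single dynamic driver with a flat impedance curve (same load at every frequency)
flat_dd = {"100 Hz": 16.0, "1 kHz": 16.0, "5 kHz": 16.0}

# A multi-BA design whose impedance swings with frequency (illustrative values)
multi_ba = {"100 Hz": 10.0, "1 kHz": 22.0, "5 kHz": 45.0}

for name, curve in (("flat-impedance DD", flat_dd), ("swingy multi-BA", multi_ba)):
    losses = {freq: divider_loss_db(z, SOURCE_Z) for freq, z in curve.items()}
    spread = max(losses.values()) - min(losses.values())
    print(f"{name}: " + ", ".join(f"{f} {db:.2f} dB" for f, db in losses.items())
          + f"  -> FR tilt = {spread:.2f} dB")
```

With a flat load, the divider shaves the same fraction off at every frequency, so it changes level, not tonality; only when the load impedance swings with frequency can the source's output impedance tilt the frequency response.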

The dilemma

I don't know if you remember, but a little while ago Lachlan (Lachlan likes a thing) raised many of the same points about manufacturer bias and what we are all doing wrong as reviewers. I debated him at the time – and I still would. His idea was to ask people to crowd-fund him. That way he would be writing for them (more accurate), and we could get away from reviewers being “paid” only in review samples. Of course his idea (to me anyway) is patently ridiculous. You'd end up with only one or two reviewers, and really all he's doing is feathering his own nest.

The dilemma really is that people want reviews that are accurate, but people like me can't afford to buy everything. Not only that – people want comparisons, which means you virtually have to keep a lot of the gear so you can do side-by-side volume matched tests. Unless you're rich – you have to rely on manufacturers.

Tours are good, but I'm amazed how many of the bigger reviewers don't do them. I know at least a couple who will not do a tour unless they get to keep the sample (I won't name names, but what I am stating is fact). That is their prerogative, but I think it is telling as to their reasons for reviewing. I do tours because otherwise I won't get to hear the product. Campfire is a good example, and I genuinely thank them for the opportunity. The issue with tours, of course, is that when I later get asked to compare products, I can't. The tour item is gone.

Some regard keeping the sample as a deserved reward for the time spent on the review. What? If you're reviewing for the gear, you shouldn't be doing it. It's the same as shilling: you are writing a review for a reward (keeping the gear).

So how do you get accurate reviews of the gear without radically changing the rules?

The “catch-22” is that you're expected to be overly positive if you want to review but can't afford the gear, and you are penalised if you write a more accurate review. In both cases it's ultimately the manufacturer who is complicit in deciding how things are handled, and who should learn that accurate feedback will ultimately help them improve. But Head-Fi's admins could (and IMO should) also change the landscape, and start recognising that both positive and not-so-positive reviews are important to highlight. I doubt that will happen – and it'll be the subject of another blog post at some stage soon (to do with my future here).

So what do I do (personally)?

Here is simply an example of how I do things. It's not perfect, but some of the ideas may help.
  1. I don't approach manufacturers. I let them come to me. This will not be acceptable to a lot of reviewers – especially those starting out, and I can understand that. But to me, your rep should speak for itself. I can influence buying decisions with what I write – but that depends on how truthful and transparent I am. So the way I do it is to create distance from the manufacturer. I've written less-than-positive reviews of FiiO, Dunu and a number of other manufacturers. The good ones come back – simply because they recognise that you are actually helping them develop better products, and that you're not afraid to tell them your subjective truth.

  2. I recognise (as much as possible) and state my biases. I then work to eliminate them as far as I can, so that I am as neutral as possible when I review. If you ignore your natural biases, you'll never be accurate.

  3. I treat the review samples as exactly that – samples. They are not mine, and I don't do this for the gear. To this day, the gear I use mostly for my own pleasure is the gear I have purchased (or won in competitions) – the stuff that is legally mine: HD800S, HD600, K553, MS Pro, iDSD, Curve, q-Jays, Dipper, LSR305s. If I like something I've reviewed so much that I want it for pleasure, I buy it. The only exception to this (that I am aware of) is the FiiO DAPs. FiiO won't allow me to buy anything from them any more (they won't accept payment from me). They use me for software testing, reviewing, and critical feedback. I've spent hundreds of dollars on their gear in the past, and they expect me to be completely honest – and I can live with this.

    I tell every company that I will return the gear at their cost whenever they desire. But they also know that leaving it with me means more mileage (good or bad) through further product comparisons in the threads and subsequent reviews.

  4. I always volume match when reviewing. If you don't (with accuracy), then (to me anyway) your review is a technically flawed opinion piece. I can't trust anything you say about relative comparisons, and I'll simply read the review for the features. (There's a rough sketch of what volume matching involves just after this list.)

  5. I use my ears to describe what I hear, but I use measurements for consistency, to check accuracy, and to understand what I am hearing. It is very cheap to get a reliable set-up, and I'm prepared to help anyone who wants to calibrate for a bit more accuracy. I personally think that if you want to be a good reviewer you simply must use all the tools at your disposal. If you don't have them – get them.

  6. I state where and how I got the product.

  7. I take the time to get to know the product, and I re-calibrate with more neutral gear while I'm reviewing. I know I can get used to anything over time (that marvellous “brain filter”), so it's important to recognise this and correct for it.

  8. I take critique and try to improve. I also recognise that my subjective opinion is unique to me. So a certain measure of objectivity is essential.

  9. I try to state the good with the bad, and generally be gentle but accurate with the critique. You shouldn't take pleasure in pointing out the errors - merely be factual. Everyone makes mistakes (manufacturers included). Help them improve, but also make sure your readers are your target. They are the ones you have to be truthful to.
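
On the volume-matching point in item 4, here is a rough sketch of what that involves in practice. The numbers are hypothetical, and it assumes you can measure each source's output with a 1 kHz test tone and a multimeter or a measurement rig:

```python
import math

def level_difference_db(rms_a_volts: float, rms_b_volts: float) -> float:
    """Difference in level between two measured outputs, in dB (positive = A louder)."""
    return 20 * math.log10(rms_a_volts / rms_b_volts)

# Hypothetical 1 kHz test-tone readings at my normal listening volume
source_a_rms = 0.125  # volts RMS from source/IEM combination A
source_b_rms = 0.112  # volts RMS from source/IEM combination B

diff_db = level_difference_db(source_a_rms, source_b_rms)
print(f"A is {diff_db:+.2f} dB louder than B")

# Louder almost always sounds "better", so match closely before comparing.
# 0.5 dB here is my assumed tolerance, not a universal standard.
if abs(diff_db) > 0.5:
    print("Adjust one source and re-measure before doing any A/B comparison.")
else:
    print("Matched closely enough to compare.")
```

The reason it matters is that even a fraction of a dB of extra level tends to be heard as “more detail” or “better dynamics”, which quietly skews any comparison.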
Perspective

I don't want to get people's backs up with this. I'd rather have a dialogue and explore the issues and how we solve them. I'm analytical – that's simply how I work. There will be people out there who regard this as an attack, so I will simply say this: if you're a reviewer, have a look at your profile and at how many people are following you. That number should be an indicator. The more people following you, the more who are interested in what you say. Yes, the following will be built over time – but mostly it will be built on trust. It should be one of the most important statistics on Head-Fi if you are a reviewer. I'm not going to quote numbers – look for yourself, and compare your own to other reviewers'. Make a personal choice. What you do with your reviewing will influence that number.

So how do we fix this?
I guess that’s the whole point of this blog post.

How do we do it? Let's get some dialogue going. I take this stuff very seriously. It's one of the reasons I gave up my Moderator status recently (I was accused of using my position to put others down when discussing gear – there are other reasons too, but that was one of them). To this day I think the critique was unfair and inaccurate, but I had to eliminate it as a possibility, and my reputation as a reviewer is too important to me.

Every critique I've had (including accusations of shilling from one or two people) has caused me to look at what I'm doing and how I'm approaching things.

The two biggest things that would help: first, manufacturers should look at what they want from a review. If they regard Head-Fi as merely advertising (and this may be the truth of it), then the current system will continue.

And second, Head-Fi's administration needs to decide whether the current review system is to continue. If they reward a low SNR with front-page exposure, then they are setting the rules of engagement (ROE), and the website will continue to get the types of reviews which are becoming more and more prominent, and less and less useful to buyers.

Ultimately, if things continue in their current vein, I will follow through on my current decision to fade out early next year. I've already talked to a couple of manufacturers about this – interestingly, they've asked me where my new home will be, as they'd like to continue the relationship. Again, that little word: trust. When you have it with both readers and manufacturers, you know you are in a good space.

So let's discuss. How do we improve things (reviewing) before it goes so far that it's irrecoverable?
 
Nov 3, 2017 at 6:11 PM Post #4 of 198
I think it should be OK to talk badly about a product. If I wanted fluff pieces I'd keep reading Stereophile et al.


...but lucky for me I've been in this hobby long enough not to need reviews; I make the effort to get out there, try gear, and make my own informed buying decisions.

To be honest, my advice to anyone newer to the hobby would be to ignore reviews and seek out those short snippets of impressions the average Joe types in a thread for "flavor of the month X". They're mostly caught up in the hype, but if you read enough of them and read between the lines a little, you can get a good feel for the truth of how something sounds.

Sadly, that little nugget of truth you can glean from a few dozen impressions is more useful than most puff-piece reviews, especially if you spend your time listening to actual real music and not test tones or chanting monks.
 
Nov 3, 2017 at 6:14 PM Post #5 of 198
It is interesting to me: how can you get a 2000-dollar in-ear in exchange for your "honest" review?
How honest can it really be?

I reviewed it on the understanding that I was returning it. So far HiFiMan have elected to leave it with me, and next week I'm sending it to someone else to review. Let's not throw the baby out with the bathwater here. If HiFiMan had not approached me to review it, I couldn't have. And I know they can ask for it back at any time - if I wanted to keep it, I would buy it.

The question is: do you believe my review is biased by the fact that I still have it and didn't pay for it? I certainly wrote it on the understanding they would be taking it back. Because if you are stating that all reviews are dishonest because the reviewer still has the IEM, then you have to include mine too.

This is part of the problem - how do we fix things? Simply saying "no review samples" or "tours only" creates its own issues. How do you compare when the inevitable requests for comparisons are made?
 
Nov 3, 2017 at 6:18 PM Post #6 of 198
I reviewed it on the understanding that I was returning it. So far HiFiMan have elected to leave it with me, and next week I'm sending it to someone else to review. Let's not throw the baby out with the bathwater here. If HiFiMan had not approached me to review it, I couldn't have. And I know they can ask for it back at any time - if I wanted to keep it, I would buy it.

The question is: do you believe my review is biased by the fact that I still have it and didn't pay for it? I certainly wrote it on the understanding they would be taking it back. Because if you are stating that all reviews are dishonest because the reviewer still has the IEM, then you have to include mine too.

This is part of the problem - how do we fix things? Simply saying "no review samples" or "tours only" creates its own issues. How do you compare when the inevitable requests for comparisons are made?
I am not talking about you in particular; I mean in general.
 
Nov 3, 2017 at 6:51 PM Post #7 of 198
OK - so how do we fix that, though? It's a genuine dilemma. I certainly couldn't afford to review it. How do we at least stop the pandering to manufacturers - either for the gear itself, or for the promise of new gear?
 
Nov 3, 2017 at 6:58 PM Post #9 of 198
OK - so how do we fix that, though? It's a genuine dilemma. I certainly couldn't afford to review it. How do we at least stop the pandering to manufacturers - either for the gear itself, or for the promise of new gear?

This is the same issue much bigger organizations and magazines have had for ages, whether it be video game reviewers, tech reviews, car reviews, etc. I really think your only answer is to fake it until you make it, like that YouTube guy, then turn coat and purchase everything you review, because each review gets you 5 million YT views and you make more per video than most gear costs. Then you can tear down the walls, slam poor products that cost thousands, and praise the ones that deserve actual praise.
 
Nov 3, 2017 at 7:01 PM Post #10 of 198
Very informative and honest post, Brooko. I think people should read more into the "Disclaimer" section of your posts with regard to who owns what. Keep it up - I always look forward to your opinions and personal views.
 
Nov 3, 2017 at 7:02 PM Post #11 of 198
It can't be fixed; it's an industry, and the goal is to make as much money as possible.

OK - so why are some reviewers considered more trustworthy than others? Why would you state earlier that you were not including my review, but were speaking generally? If there are exceptions which are recognised, can we use those to identify where the problems are, and how we eliminate them, so that we can have reviews that are more trustworthy and more accurate?

I guess the first question is - does anyone even care?

Because if they don't, and Head-Fi really is being regarded as purely an advertising site with a forum, then obviously I need to move on. The last thing I want to do is contribute to a site which is heading into a downhill spiral (in SNR).
 
Nov 3, 2017 at 7:04 PM Post #12 of 198
This is the same issue much bigger organizations and magazines have had for ages, whether it be video game reviewers, tech reviews, car reviews, etc. I really think your only answer is to fake it until you make it, like that YouTube guy, then turn coat and purchase everything you review, because each review gets you 5 million YT views and you make more per video than most gear costs. Then you can tear down the walls, slam poor products that cost thousands, and praise the ones that deserve actual praise.

That's actually a good point - but ethically I can't do that. And my point is that I don't want to make anything out of this. I do it because I enjoy listening to new stuff, and I enjoy writing. And it gives me genuine pleasure to contribute and be part of the wider discussion.
 
Nov 3, 2017 at 7:08 PM Post #13 of 198
But what is accuracy? Everyone has different hearing and music preferences; audio "reviews" are just impressions to me. The best you can do is find a "reviewer" with similar tastes to yours and roll with it. What's the point of perfect graphs and measurements if it just sounds like arse to you? The best thing you can do as an audio enthusiast when looking for gear is to test it yourself. I know some people are limited in that respect, and impressions are the next best thing.
Comparisons are also essential, if not compulsory, in a review. It's hard for reviewers, but auditory memory is weak, and your points and descriptions don't matter if you have no reference point.
There's also the case of gear and source pairing. I personally believe it does make a difference, so it's another cog in the machine.

What I'm trying to get at is that people, reviewers and readers alike, should take "reviews" less seriously. Our hearing just isn't as developed as, say, our vision; you can't benchmark music like a game or run colourimetry tests like you would on a monitor. There are just too many variances between listeners and impressions to make claims of objectivity or accuracy.
 
Nov 3, 2017 at 7:20 PM Post #14 of 198
But what is accuracy? Everyone has different hearing and music preferences; audio "reviews" are just impressions to me. The best you can do is find a "reviewer" with similar tastes to yours and roll with it. What's the point of perfect graphs and measurements if it just sounds like arse to you? The best thing you can do as an audio enthusiast when looking for gear is to test it yourself. I know some people are limited in that respect, and impressions are the next best thing.
Comparisons are also essential, if not compulsory, in a review. It's hard for reviewers, but auditory memory is weak, and your points and descriptions don't matter if you have no reference point.
There's also the case of gear and source pairing. I personally believe it does make a difference, so it's another cog in the machine.

What I'm trying to get at is that people, reviewers and readers alike, should take "reviews" less seriously. Our hearing just isn't as developed as, say, our vision; you can't benchmark music like a game or run colourimetry tests like you would on a monitor. There are just too many variances between listeners and impressions to make claims of objectivity or accuracy.

Great points. And I would fully expect reviews to differ; we are, after all, unique. I think, though, that you may have missed the point about the SNR: my concern is not that some current reviews are merely different, but that something with real issues (like genuine problems with frequency response) can be glossed over and rated super high. The RHA CL1 would be another polarising IEM which I think comes down to more than mere preference. When there is a 20 dB lift from the mid-range to the lower-treble peaks, there are some genuine issues there.

And the Kinera H3 is one I really puzzled over. My choosing it as a review sample is exactly what I would have done as a buyer: I based the choice on the relentless positivity of almost every review. How can something with such obvious issues be given 4s and 5s, and make front pages, when the reviews do not state those issues clearly?

Maybe I'm expecting too much?
 
