Chapter 1: Reviewing – How Did I Start & How Did It Grow
Jan 25, 2016 at 10:46 PM Post #18 of 46
Appreciate the thoughts 
 
Definitely catch up again in 2016 as long as the timing is OK!  Next time I'd love to make it an entire weekend though - the Meet on the Saturday, and then a catch-up somewhere small so that I could spend some more quality time with the gear I missed out on!
 
Jan 25, 2016 at 10:52 PM Post #19 of 46
  WE ALL suffer from new toy syndrome from time to time, and we all have different ears. There are instances where I've gotten something in and thought it was the greatest thing ever. A year later I sit down and listen to it and think it's not very good at all. There are things I've reviewed and love, while others whose opinion I highly respect have felt the opposite. The star ranking is a crap shoot, and should be seen as nothing more than an indicator of whether the reviewer likes it based on their preference.
 

 
The point of reviewing is to help customers maximize their dollars, without degrading manufacturers.

 
Great post Vince - and totally agree with your thoughts.  The good thing is that more and more people are including an "about me" section nowadays - so at least they are starting to state their known preferences/bias.  That makes it so much easier to understand the rankings they've applied.
 
And I agree with you on the target audience being our fellow community members. One thing I'm always cautious about is that people will be making buying decisions based on what we write. That confers responsibility, and should never be taken lightly.
 
Jan 26, 2016 at 6:45 AM Post #20 of 46
  I noticed you guys were discussing star ratings earlier. I'd like to add my own two cents on it.
 
I personally feel the star rating is crap. Let me explain...
 
WE ALL suffer from new toy syndrome from time to time, and we all have different ears. There are instances where I've gotten something in and thought it was the greatest thing ever. A year later I sit down and listen to it and think it's not very good at all. There are things I've reviewed and love, while others whose opinion I highly respect have felt the opposite. The star ranking is a crap shoot, and should be seen as nothing more than an indicator of whether the reviewer likes it based on their preference. That goes for every review that is on Head-Fi, for the most part.
 
If I really like something, and feel that it performs beyond its asking price, it should get five stars. If we think it's good, why not make sure we tell everyone it is? I think all too often people make their purchase based just on how many stars it gets. Truth be told, purchases should be made by first understanding your own preferences, then looking for a product that matches them.
 
Star ratings open up a can of worms: many variables that can't all be factored into a single star rating. If we are going to do stars, there should be a star ranking for several separate criteria, and the final star ranking should be a mathematical average of all of them. Still, if it's done this way it's weighting every criterion equally. That isn't fair to people who hold particular criteria in higher regard than others.
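(Just to make the maths concrete - a minimal, purely hypothetical sketch of per-criterion scores combined with personal weights; the category names and numbers below are invented for illustration and aren't any actual Head-Fi rating system:)

```python
# Hypothetical sketch: per-criterion star scores combined into one overall rating.
# Category names and weights are invented for this example.

def overall_stars(scores, weights):
    """Weighted average of 0-5 star scores, snapped to half-star steps."""
    total = sum(weights[c] for c in scores)
    weighted = sum(scores[c] * weights[c] for c in scores) / total
    return round(weighted * 2) / 2

scores = {"sound": 4.5, "build": 4.0, "comfort": 3.5, "value": 5.0}

equal_weighting = overall_stars(scores, {c: 1 for c in scores})
my_weighting = overall_stars(scores, {"sound": 4, "value": 2, "build": 1, "comfort": 1})

print(equal_weighting, my_weighting)  # the same review, two different overall ratings
```

The same set of scores lands on a different final number depending on whose weights you use - which is exactly the "not fair to people who weight criteria differently" problem.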
 
Another thing - I don't get how some reviewers feel so inclined to be super critical with their star ranking, like they're holding out for the mothership of earphones to beam them off into an alternative dimension of music we have yet to experience in order to give it five stars. Let's not be arrogant in how we rank things, but be more informative to the community. Three stars is good, four stars is better, and five stars is best, knowhatamsayin?
 
The way I see it, for me and what I write, I want to keep things this simple:
 
5 stars = An awesome product that I highly recommend, arguably the best in its price range. Very few if any customers will be disappointed if they read my review, feel that the product matches their preference, and purchase it.
 
4.5 stars = A really good product that performs very well, but I can think of a couple of products that might be a better option at that price range.
 
4 stars = A solid piece of gear that has some minor flaws that prevent it from being elite. 
 
3.5 stars = A middle-of-the-pack performer that doesn't match what the better products in its price range can do. Not the worst but definitely not the best.
 
3 stars = A middle-of-the-pack performer for its asking price. Not a total waste of money, but the buyer could probably have spent their money on something else and had better results.
 
2.5 stars = This is the lowest rating I will probably give a product before I tell a manufacturer that I can't review it. This is a product that is on the lower half of what I would consider satisfactory for its price. There are many things to improve on, yet it still has some positives to point out.
 
Anything below 2.5 stars, I am offering to ship it back at my own expense. I don't want a company thinking I'm bailing on a review. I also offer constructive criticism of why I feel this way, in hopes that they will use it to improve future products.
 
The point of reviewing is to help customers maximize their dollars, without degrading manufacturers.


I certainly agree with this. Personally I take star ratings as a reflection of the product being reviewed and never use them as a benchmark for comparison against other brands and/or models.
 
Jan 26, 2016 at 7:52 AM Post #21 of 46
Hi Paul,
 
I'm looking forward to this epic read, an excellent blog subject from the master himself!

 
Jan 27, 2016 at 7:12 AM Post #23 of 46
A very good introductory post Brooko. I will be following this blog for sure.
I hope to read more on what you have to say about the objectivity vs. subjectivity balance in a review, and the associated accountability.
 
Simplified examples:
  1. There may be a sharp 10 kHz peak in the measurements, but it sounds natural to me.
  2. I may detest an ultra-bass-heavy signature, but I should be able to recognize (and rate) it as a very good basshead IEM (if deserving).
 
I also hope you will share your thoughts on "Long term listening" and how it affects the review process.
 
Jan 27, 2016 at 9:47 AM Post #24 of 46
Being 62 now, a nice rise at 10 kHz isn't too bad for me. I find that I go from a rolled-off top to a raised top in order to get more clarity, and then back to a rolled-off top for relief!

Absolutely neutral (or so-called neutral) can become very boring...

So one reviewer's "poor" might actually suit me, and vice versa. Reviews are quite tricky really.

....and people get so het up about them.
 
Jan 27, 2016 at 2:25 PM Post #25 of 46
  Hi Paul,
 
I'm looking forward to this epic read, an excellent blog subject from the master himself!

 
Thanks TD - but I've made too many mistakes along the way, and still have way too much to learn, to be considered a master.  Still just an apprentice - and continuing to learn from some of the real masters who have gone before me.
 
  A very good introductory post Brooko. I will be following this blog for sure.
I hope to read more on what you have to say about the objectivity vs. subjectivity balance in a review, and the associated accountability.
 
Simplified examples:
  1. There may be a sharp 10 kHz peak in the measurements, but it sounds natural to me.
  2. I may detest an ultra-bass-heavy signature, but I should be able to recognize (and rate) it as a very good basshead IEM (if deserving).
 
I also hope you will share your thoughts on "Long term listening" and how it affects the review process.

 
Thanks - I think we'll probably touch on some of those points in the next couple of chapters.  The long-term listening one is a really good topic, and it will be interesting to get others' input on it - because even the difference between 8-10 days with a review unit and, say, 2-3 days can make a massive difference.
 
Jan 30, 2016 at 5:16 AM Post #26 of 46
  ...
 
Star ratings open up a can of worms: many variables that can't all be factored into a single star rating. If we are going to do stars, there should be a star ranking for several separate criteria, and the final star ranking should be a mathematical average of all of them. Still, if it's done this way it's weighting every criterion equally. That isn't fair to people who hold particular criteria in higher regard than others.
 
Another thing - I don't get how some reviewers feel so inclined to be super critical with their star ranking, like they're holding out for the mothership of earphones to beam them off into an alternative dimension of music we have yet to experience in order to give it five stars. Let's not be arrogant in how we rank things, but be more informative to the community. Three stars is good, four stars is better, and five stars is best, knowhatamsayin?
 
The way I see it, for me and what I write, I want to keep things this simple:
 
5 stars = An awesome product that I highly recommend, arguably the best in its price range. Very few if any customers will be disappointed if they read my review, feel that the product matches their preference, and purchase it.
 
4.5 stars = A really good product that performs very well, but I can think of a couple of products that might be a better option at that price range.
 
4 stars = A solid piece of gear that has some minor flaws that prevent it from being elite. 
 
3.5 stars = A middle-of-the-pack performer that doesn't match what the better products in its price range can do. Not the worst but definitely not the best.
 
3 stars = A middle-of-the-pack performer for its asking price. Not a total waste of money, but the buyer could probably have spent their money on something else and had better results.
 
2.5 stars = This is the lowest rating I will probably give a product before I tell a manufacturer that I can't review it. This is a product that is on the lower half of what I would consider satisfactory for its price. There are many things to improve on, yet it still has some positives to point out.
 
Anything below 2.5 stars, I am offering to ship it back at my own expense. I don't want a company thinking I'm bailing on a review. I also offer constructive criticism of why I feel this way, in hopes that they will use it to improve future products.
 
The point of reviewing is to help customers maximize their dollars, without degrading manufacturers.

I don't think a universal mathematical average would work, as each person has their own subjective weightings. For me, I think it is 80% price/sound quality, and 20% form/function/accessories. I can't give a rating on sound quality without considering price range. It isn't fair to expect a £25 Bluetooth headphone (reviewing one right now) to sound like the Ether-C (one of my favourite headphones that I can't afford to own).
 
I like your star scale and find myself doing something similar. The key part for me is sound quality at the price. I may steal your ratings. Personally, I don't think the star scale is sensitive or specific enough. I don't feel that the RHA pieces I reviewed recently are 4* or 4.5*, so I wrote in my review that they were 4.25* and whinged a bit about the sensitivity and specificity of the scale. I'd like to have a 20-point scale (0.25 increments). I have, however, given a 1.5-star review. After doing so, I was worried that I'd be left out of any tours because of manufacturer fear. I think I may have missed out on one potential review because of my willingness to post such a low rating. I think that 1.5* review was accurate, and I value the credibility it gives, but I don't think that all manufacturers will feel the same way.
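(If it helps picture that finer scale - a tiny, hypothetical sketch of snapping a raw score to 0.25-star steps; nothing here reflects how Head-Fi actually stores its ratings:)

```python
# Hypothetical sketch: a 20-step scale (0 to 5 stars in 0.25 increments).
def to_quarter_stars(raw_score):
    """Snap a raw 0-5 score to the nearest quarter star."""
    return round(raw_score * 4) / 4

print(to_quarter_stars(4.3))   # 4.25
print(to_quarter_stars(4.15))  # 4.25
```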
 
On another subject, I put a lot of time into each review. I spend 3 to 5 hours writing them and take several pages of notes during 10-50 hours of listening on each item I review. The part that I find difficult is trying to review multiple items at a time. My natural instinct is to optimize and simultaneously review them in one review, but if you want to build up your review database, you've got to keep 'em separated.
 
I've still got a lot of my own gear to review, but have been fortunate to get samples and loaners to review.
 
I follow a number of manufacturers on Twitter, and several follow me back, so I need to get better about tweeting reviews and impressions. I should probably schedule some review tweets now.
 
Paul, I'm really looking forward to further entries in this series. I find your reviews very informative, but tending a little toward the dry side. The emotional experience of listening to gear doesn't always come through for me, but this also gives a feeling of objectivity. I admire your scientific process and would like to incorporate elements of it. My personal style at this moment goes the other way; I am not as information-dense, but I think my prose is entertaining. I also try to write eye-catching titles.
 
Jan 30, 2016 at 6:03 AM Post #27 of 46
The thing about that star rating system, which is well thought out imo, is that he is taking price into consideration when giving stars.

So, if price weren't taken into consideration ...

Portapro 2 stars? TH900 4 to 5 stars depending on your preference?

Taking price into consideration ....

Portapro 5 stars? TH900 2 or 3 stars?

That leaves a slight difficulty in that, say, I'm looking for the best 'hi-fi' quality for the money. The recommendations would swing from cheap to expensive, perhaps leaving anyone relying on reviews in the difficult position of deciding which to buy.

For me, the Portapro is a 5-star and so is the TH900. So then it becomes a question of which one is the best value, maybe? That would lead me to the Portapro, which, although good, is technically nowhere near the TH900.

Reviews are SO difficult.

I also find it curious that reviews don't really go into listening volumes. That really does affect what you hear, to quite a large extent. For instance, I find the Sony V6 terrific at lower volume, but sibilant at high volume, at which point I swap to something like a Momentum. In reviews though, not many refer to perceived sound differences at different volume levels, which is something that I have found crucial to headphone sound.

Measurements are great and give an idea, but not exactly an idea of what always happens on the head. If I had looked at the Sony's FR measurements, for instance, I wouldn't have bought one. I was given one to work with and found it extremely revealing of recording quality and very good at lower volumes, which is way better for the ears, in spite of that massive treble lift it has.
 
Jan 30, 2016 at 6:58 AM Post #28 of 46
The thing about that star rating system, which is well thought out imo, is that he is taking price into consideration when giving stars.

So, if price weren't taken into consideration ...

Portapro 2 stars? TH900 4 to 5 stars depending on your preference?

Taking price into consideration ....

Portapro 5 stars? TH900 2 or 3 stars?

That leaves a slight difficulty in that, say, I'm looking for the best 'hi-fi' quality for the money. The recommendations would swing from cheap to expensive, perhaps leaving anyone relying on reviews in the difficult position of deciding which to buy.

For me, the Portapro is a 5-star and so is the TH900. So then it becomes a question of which one is the best value, maybe? That would lead me to the Portapro, which, although good, is technically nowhere near the TH900.

Reviews are SO difficult.

I also find it curious that reviews don't really go into listening volumes. That really does affect what you hear, to quite a large extent. For instance, I find the Sony V6 terrific at lower volume, but sibilant at high volume, at which point I swap to something like a Momentum. In reviews though, not many refer to perceived sound differences at different volume levels, which is something that I have found crucial to headphone sound.

Measurements are great and give an idea, but not exactly an idea of what always happens on the head. If I had looked at the Sony's FR measurements, for instance, I wouldn't have bought one. I was given one to work with and found it extremely revealing of recording quality and very good at lower volumes, which is way better for the ears, in spite of that massive treble lift it has.

 
This is so true. I find a review very useful when the reviewer states his listening volume.
 
Jan 30, 2016 at 7:10 AM Post #29 of 46
I've posted some longer impressions using whole stars for specific categories--build, fit, looks, sound--without giving an overall rating. The reader can decide what matters to him. I don't take price into consideration because everyone has a different budget.

1/5 = failure
2/5 = problematic
3/5 = A-OK
4/5 = very good
5/5 = near-perfect

I'm not going to give 3 stars to anything I wouldn't be willing to use on a regular basis (which is most things out there). Maybe that's a stricter standard, but it's important for the reader to have a very clear idea of the reviewer's opinion.
 
Jan 30, 2016 at 7:34 AM Post #30 of 46
  Paul, I'm really looking forward to further entries in this series. I find your reviews very informative, but tending a little toward the dry side. The emotional experience of listening to gear doesn't always come through for me, but this also gives a feeling of objectivity. I admire your scientific process and would like to incorporate elements of it. My personal style at this moment goes the other way; I am not as information-dense, but I think my prose is entertaining. I also try to write eye-catching titles.

 
Nice to have you here and I appreciate the input. I guess some may find my reviews a little dry - but I follow a pretty simple formula - I write what I'd like to read if I were researching a product. And I think this can be an important distinction for prospective reviewers.
 
I've noticed that some of the time, posted reviews are little more than an advertisement for a manufacturer - or at least that is how I see them - and this trend is helping no-one.  The pictures are pretty, the specs are quoted, but very little thought has gone into the actual review.  Where are the actual comparisons with other gear?  Where are the real-world measurements? I take it pretty seriously - because what I write might influence people who are spending real money.  Honestly, I've seen some reviews lately and I have to wonder what people are thinking - especially when I get to hear the exact same gear, and their "reviews" are miles away from my own experience.
 
That was part of the reason for the blog - to see if together we could all raise the quality of our reviewing standards (and I include myself in needing to improve).
 
