Dropping the Rating System
When DPRP started publishing reviews in 1998, there were not a lot of websites doing what we did. A few perhaps, and we had all seen many of the music magazines out there, of course. Some had ratings, and some (well, most) did not. We had a brief discussion about it at the time and decided to use ratings on a scale of 1 to 10. Some reviewers started using half-points and sometimes even a plus and minus, so basically quarter-points, and we left it at that.
Through the years we've changed several things on the site. Categories have come and gone, most notably the News and Gig Guide. Times change, the world changes, the internet changes, readers change, so we change as well. We dropped the quarter-points for the ratings and a couple of years ago the half-points as well.
And now we're going to drop the ratings completely.
Let us explain a little.

Complaints
Through the years we've had different types of complaints about the rating system, and to some extent they were all valid. Each reviewer had their own sense of what a rating meant, so ratings were not comparable between reviewers. And ratings seemed to creep higher in general.
From the reviewers' side, it is often hard to summarise an opinion about an album in a single number. Some don't really care and give the first number that pops into their head.
Pros and Cons
So we decided to take a look at the pros and cons of having a rating with every review and see what other options we could think of, as we have done with every change we made to the site.
On the pro side, a rating gives people a quick summary of a review. When you read a lot of reviews by a certain reviewer, you get to know their taste, can compare it to your own, and get a quick impression from a single number.
The negative side starts with that same point: people get a quick summary. The danger is that some people only look at the rating, and that is not what we spend all that time listening and writing for.
But more importantly, all the other points on the negative side focus on the meaning and value of the rating.
We realised that there is a danger of rating inflation. When a reviewer finds the second album better than the first, they are inclined to give it a higher rating. But what if the next is even better? A higher rating then has less meaning.
If we use a rating, that single number should reflect something, mean something. We quickly realised different reviewers had different ideas about that, even with a proper description. A 6 means different things depending on the reviewer. In our system, a 6 was still a nice album, although hardly anybody saw it like that, and a 6 was often taken as an insult (most likely because of the rating inflation mentioned above). The discussions following a certain rating (even high ones that were not regarded as high enough) have wasted a lot of time.
When a reviewer likes an album, they will rate it with a high number. When they do not like an album but think a lot of people would, they might still give it a high rating anyway. This means the objective and subjective parts of a rating are not applied consistently.
A rating is given by a single reviewer at a single point in time and in a certain mood. Someone's taste can change (mine certainly has), and their mood changes even more often. So ratings are very likely to change.
If you're really interested in what other people think of an album, a weighted average across a lot of people makes much more sense than a single opinion. You're welcome to check out RateYourMusic or Progarchives, of course, where they average ratings from many people. Our process of reviewing is just completely different. Which is perfectly fine, of course.
Looking back, new reviewers have tended to hand out a 10 rather quickly. We've all been there, and most if not all of the reviewers I've talked to about this regret some of their 10s. What I thought was an absolute future classic at the time didn't stand the test of time at all, or was surpassed by a better album.
Feedback
In a recent post we asked a few questions, one of which was about removing the ratings. A majority of the small group that responded preferred that we keep the ratings; a minority either said it was not important or that we should drop them.
With only a small group of people responding, there is a chance that only those who felt strongly about it said something. Since the review text is always the most important thing and the rating is just a single number, we felt that a majority of a small group did not add enough weight to the pro side to make up for the cons.
We received some feedback from artists as well but only a few on this point, and they were divided 50/50. We've seen our reviews quoted by bands very often, and only a handful mentioned the rating. The vast majority selected a quote from the text.
We've had many fans complain about a rating, which to us feels like the fan club just could not take the criticism. In all those years we've had one or two artists complain about the rating. 100% of the positive feedback we received from bands was about the reviews themselves, about how we understood the music and wrote a good description so others would understand it too, and not about the rating.
Other Options
We've discussed different options: a more fine-grained rating system (reintroducing halves, effectively a scale of 20, or a scale of 100) or a less fine-grained one (5 stars). None of these made more or less sense, and all the negative points would still be there.
The option of having several sub-ratings (production, sound, originality, etc.) would make the reviewer's work even harder, and again, it would not solve the negative side of a rating.
Conclusions
There is no rating system that is completely objective and consistent among reviewers and over time. Ratings lose their value as soon as they are published.
With the rating coming from one person (or two or a few, in the case of a Duo Review or Round Table Review), the value is also minimal even at the moment of publication.
We could not think of, and no one could provide us with, a positive side to ratings that outweighed the negatives. So we've decided to stop the rating system.
If you've come this far, you now know that we have not taken this decision lightly and have thoroughly discussed the available options. Keep reading our reviews, because that part will not change.
Thank you.