Metacritic: A Defence
Though by no means a new target, I was drawn recently to a Develop magazine column by Simon Byron discussing the many demerits of Metacritic and other review aggregators.
I use Metacritic on a regular basis, so I feel obliged to step into the firing line. I like being able to get multiple perspectives at once, and to quickly ascertain how well a game has done at review – and while I’m not going to jump straight into questions of artistic objectivity, I like avoiding the trap of reading a 90% review and finding out after purchase that all the others were in the 50s.
The Decline of Journalistic Standards
Byron begins by attacking general post-web journalistic standards. “The barriers to entry have fallen so far over recent years that anyone with the ability to type words into the Internet can almost legitimately call themselves ‘press’.” Far be it from me to disagree with that statement – my first web review was off the back of four A Levels and a decent sample, and that was five years ago – but you can hardly blame the hobbyist reviewers.

Byron, in fairness, also blames Metacritic’s selection criteria. “Its sources are a mix of top specialist review destinations and loads of My First Internet Sites.” This flaw is compounded by the fact that although Metacritic weights the reviews, the weighting system is secret – “We’re just supposed to trust the opinion of a site which apparently doesn’t have an opinion.”
Again, it’s a hard accusation to deny, not to mention a strange flaw in the system – why does Metacritic reference such amateur sites? My disagreement is only with Byron’s interpretation of the facts.
The Defence
First, the weighting system may be secret, but at least it’s there – and I’m sure we can all guess which sites are the heavier hitters.
Second, his dismissal of the ‘lesser’ sites seems somewhat naive (I’m sure it’s not, it just sounds it) in light of front page scandals like GamespotGate and more recently the Tomb Raider / Barrington Harvey ‘score managing’ ordeal. Clearly it’s impossible to place all your faith in sites with everything to lose. The guys whose Bioshock review “…had been read by 451 people in its month since publication” are a lot less likely to be swayed by anything other than honest impressions – even if that doesn’t leave time for more than a casual nod to grammar.
I’m also swayed by the (un)canny accuracy of the aggregates. Certainly on the games I’ve worked on, the aggregate score in each case has closely matched my own expectations. Further, the spread in scores is usually pretty tight: you get a couple of extremes, but everyone else is usually bunched up within 10–15% of one another, i.e. the scores are reasonably representative. An aggregate score is never intended to be all things to all men – but if it can capture the majority, surely it’s done its job?
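Purely as an illustration of the weighting and spread points above (Metacritic’s actual weights are secret, so the weights, scores and function names here are invented), a minimal sketch of a weighted aggregate plus a simple spread check might look like this:

```python
# Illustrative only: Metacritic's real weights are not public, so both the
# weights and the review scores below are invented for the example.

def weighted_aggregate(reviews):
    """Weighted mean of review scores.

    `reviews` is a list of (score, weight) pairs, e.g. a 'heavy hitter'
    outlet might carry weight 3.0 and a hobbyist site weight 1.0.
    """
    total_weight = sum(w for _, w in reviews)
    return sum(score * w for score, w in reviews) / total_weight


def spread(reviews):
    """Unweighted standard deviation of the scores: a rough measure of how
    bunched-up the reviews are (the 10-15% band mentioned above)."""
    scores = [s for s, _ in reviews]
    mean = sum(scores) / len(scores)
    variance = sum((s - mean) ** 2 for s in scores) / len(scores)
    return variance ** 0.5


if __name__ == "__main__":
    # Hypothetical review list: (score out of 100, assumed weight)
    reviews = [(90, 3.0), (84, 3.0), (78, 1.5), (55, 1.0), (82, 1.0)]
    print(f"Weighted aggregate: {weighted_aggregate(reviews):.1f}")
    print(f"Spread (std dev):   {spread(reviews):.1f}")
```

The point of the sketch is simply that a weighted mean and a tight spread together tell you more than any single score does – which is roughly the value I get from scanning a Metacritic list.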
What’s more, the ‘respected’ sites like Gamespot and Eurogamer seem to differ from one another just as much as the rest – indicating, as if it needed pointing out, that this is a subjective game: there’s no such thing as a ‘correct score’.
The Value in Aggregation
On that note, I’m not sure what to make of Byron’s comment that, “Some sort of central hub of opinion is fine in theory – why make up your mind yourself when you can simply think what everyone else does?” On the one hand I feel Metacritic’s hub has proved defensible in practice, and on the other, I object to the suggestion that the site encourages sheepism.
Byron may feel that Metacritic panders to the “score-obsessed autism that proper journalists… become legitimately dismayed about,” but I look at it the other way. To take a Metacritic score as the be-all and end-all of a purchase decision is clearly almost as flawed as treating a single review score similarly. To take a Metacritic review list as a reminder that no one opinion is objective – that one man’s 50% is another man’s 80% – is good business sense.
Conclusion
The value in a review isn’t found in the reviewer’s education, his readership, or even the quality of his writing (though the latter is certainly nice). A valuable review is one that reflects your own preferences, and that’s something Metacritic delivers in spades. I read the very best and the very worst review in any given list, and that approach consistently does a better job of addressing the concerns that matter to me than the sanitised, mass-market approach of the IGNs and PC Gamers.
Finally, to address the accusation that Metacritic scores are too influential in publishing, I’d point out that Metacritic is a reasonable representation of critical success, and anything that encourages publishers and developers to pursue quality as well as sales can only be a good thing.