Why You Should Avoid Metacritic

At one time or another I’m sure we have all found ourselves looking at Metacritic. Sometimes we are only looking for the most critically acclaimed games of a particular year; other times we are browsing past popular games to get an idea of something we might be interested in; and sometimes, hopefully not too often, we go to Metacritic to use it as a tool for judging whether a game is worth our time.

Other than being a review aggregator that makes it easy to find a pile of reviews in one place, Metacritic does not have any legitimately good features. It has uses, to be sure, but not any you should be relying on.

All review aggregator sites in any medium – places like Rotten Tomatoes – have a problem baked into the idea itself. There will always be some distortion in the validity or “truth” behind the statistics, warped by each aggregator’s particular approach.

Taking Rotten Tomatoes as an example, movies are rated in a variety of ways, but the one shown most often is the “fresh” or “rotten” rating. To be counted as “fresh,” a review must clear some threshold (or be judged positive by staff). A movie may then be “certified fresh” even though the average critic score is only 6.1/10. Misleading, to say the least.
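To see how a binary “fresh or not” tally can drift away from the scores underneath it, here is a minimal sketch. The numbers are entirely invented for illustration, and the 6.0 cutoff is an assumption, not Rotten Tomatoes’ actual methodology:

```python
# Invented critic scores (out of 10) for one hypothetical movie.
scores = [6.5, 6.0, 7.0, 6.5, 6.0, 7.5, 6.0, 6.5, 5.0, 6.0]

# Threshold-style aggregation: each review only counts as positive or not.
# Assume (purely for illustration) that 6.0 or above counts as "fresh".
fresh_count = sum(1 for s in scores if s >= 6.0)
percent_fresh = 100 * fresh_count / len(scores)

average = sum(scores) / len(scores)

print(f"{percent_fresh:.0f}% fresh")   # 90% fresh -- sounds glowing
print(f"average: {average:.1f}/10")    # 6.3/10 -- merely decent
```

Nine lukewarm reviews and one negative one read as a near-unanimous recommendation, which is exactly the distortion the plain average would have exposed.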

Why avoid Metacritic? There are two major reasons. The first is the way the reviews themselves are chosen and fed into their formula to produce what they call a more “accurate” number. It is far more manipulative, subjective, and misleading than Rotten Tomatoes – at least Rotten Tomatoes shows you the actual average score behind the “fresh” rating.

So how does Metacritic arrive at its number? First, the publications and sites included in the Metascore are limited. No big surprise there, and there is no real issue with some kind of arbitration process to keep hundreds of unproven sites off the list.

Obviously, then, big sites like IGN will be on there and some smaller sites will be shut out. But this is where I have a problem. Metacritic’s formula for reaching the Metascore is not public (they will not share it beyond a brief explanation). They do, however, tell us one interesting piece of information about how the score is reached: each site/publication is weighted in the score.

Which means that Metacritic is not only subjectively choosing which reviews are included in its score, it is also choosing the worth/merit of each review site. How can that lead to any kind of “accuracy?” Just the arrogance of ranking one site’s worth over another is astounding to me, even if they do it with something as simple as the seniority or popularity of the site. Just because IGN is more popular than some other site doesn’t mean the review it gives a game is worth more than another’s. That is assuming it is done that way (which it likely isn’t); regardless, Metacritic will not say how they weight their scores anyway.

Metacritic has put themselves on a dangerous path. They try to explain themselves by arguing the merits of quantity over quality (their assured quality), and of course take it to the extreme with examples about having hundreds of reviews, where adding one more becomes statistically insignificant.

However, where is the explanation for weighting the scores? I see the merit, as said previously, in some kind of arbitration process for choosing which sites go into the score. That makes sense. But why then give some sites more value than others? There is no real reason to do so. It makes the score subjective to the point of worthlessness.
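To make the weighting complaint concrete, here is a minimal sketch of how much the choice of weights alone can move an aggregate. The outlets, scores, and weights below are all made up; Metacritic does not publish its real weights, so this only illustrates the mechanism, not their actual formula:

```python
# Hypothetical review scores (out of 100) paired with made-up outlet weights.
# Metacritic does not disclose its real weights; these are illustrative only.
reviews = {
    "BigSiteA":   (90, 2.0),  # (score, weight) -- a "heavily weighted" outlet
    "BigSiteB":   (85, 2.0),
    "SmallSiteC": (70, 1.0),
    "SmallSiteD": (65, 1.0),
    "SmallSiteE": (60, 1.0),
}

scores = [score for score, _ in reviews.values()]
equal_weight_avg = sum(scores) / len(scores)

weighted_avg = (sum(score * weight for score, weight in reviews.values())
                / sum(weight for _, weight in reviews.values()))

print(f"equal weights: {equal_weight_avg:.1f}")  # 74.0
print(f"with weights:  {weighted_avg:.1f}")      # 77.9
```

Same five reviews, nearly a four-point swing, purely from which outlets someone decided matter more. And that decision is the part Metacritic will not explain.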

They do offer some explanation for their reasoning behind the reviews and where they are rated:

 For games and music, we work to identify publications that (1) are well-regarded in the industry and are known for quality reviews; (2) actually seem to produce quality reviews (or, if not, are so influential in the industry that they have to be included); and (3) have published a good quantity of reviews.

Aside from the subjectivity, just look at what they put in the parentheses in #2: “so influential in the industry that they have to be included.” That seems like a significant problem to me, because it opens the door to including sites regardless of the quality of their reviews. Maybe letting those slip through could be argued away as a “margin of error,” but consider the potential harm to a game’s score, especially when the site in question is heavily weighted. Is the potential of dragging the score down by even a few points worth it?

Before passing final judgment on Metacritic’s worth, just look at subjectivity in general. Subjectivity is a part of every review, to be sure, but with a single review you still have the opportunity to read the review itself and understand why the reviewer came to their conclusions. With Metacritic you have absolutely no idea why the score ends up being what it is. In other words, there is no review to read and evaluate for yourself.

Why not then just give each review equal weight while still choosing which sites to include? Would that really take away from the “quality” of the score Metacritic gives? Since they seem so worried about quality, they can still ensure it through their process of choosing which sites to include (that could be an issue as well, but a far lesser evil than this one). How do they measure prestige or respect in the industry? How do they evaluate a “quality” review over a poor one? Every step that leads to the score is ridiculously subjective – again, to the point of being worthless.

Metacritic adds absolutely no value to the gaming world, other than, as mentioned before, providing a reasonable list of releases by date.

The second reason to avoid Metacritic is far more sinister. Metacritic has been used by publishers as a measuring stick to determine the success of their games. This made big news two years ago when Obsidian missed out on bonuses because Fallout: New Vegas missed its Metacritic target, reportedly by a single point. Think back to the hypothetical earlier, of a score dropping by even one or two points. Tell Obsidian that is an insignificant problem.

For some reason, publishers see Metacritic as valuable and popular enough to be influential. Because publishers put so much stock in it, Metacritic becomes incredibly dangerous in how it can affect the gaming industry. Developers’ jobs may be on the line, games may not get made because publishers judge a genre by how it performs on Metacritic, and a whole host of other issues arise when Metacritic is used as a yardstick.

That is of course speculation, but that does not mean the potential does not exist. And given the Fallout example, I don’t think it is too much of a stretch to suggest that publishers are using Metacritic as a supposedly objective tool to treat making games strictly as a business, reducing games and their value to numbers.

Metacritic and its Metascore are incredibly misleading, and Metacritic has far more influence on the gaming world as a whole than it should.

That could be, if not solved, at least significantly helped if we all avoided Metacritic and stripped it of its influence.
