A report claiming to reveal the system that Metacritic uses to assign "importance" to reviews from various sites has been dismissed by the reviews aggregator.
Gamasutra has reported that websites whose reviews are included in Metacritic's roundup were split into six different categories, with those in higher tiers having a more significant effect on the overall score than those in lower tiers.
The list caused a stir online, as many little-known sites were included in the top tier while big sites such as Eurogamer, Destructoid, Joystiq, OPM UK and Videogamer were relegated to lesser tiers.
However, while Metacritic has admitted that it does use a weighting system, it claims the information in the report was incorrect.
"Neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research," an official Metacritic statement said of both the report and its source.
"Their guesses are wildly, wholly inaccurate. We use far fewer tiers than listed in the article. The disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic. For example, they suggest that the highest-weighted publications have their scores counted six times as much as the lowest-weighted publications in our Metascore formula.
"That isn't anywhere close to reality; our publication weights are much closer together and have much less of an impact on the score calculation.
"Our placement of publications in each tier differs from what is displayed in the article. The article overvalues some publications and undervalues others (while ignoring others altogether), sometimes comically so. (In addition, our weights are periodically adjusted as needed if, over time, a publication demonstrates an increase or decrease in overall quality.)"