Not A Man, A Number: Metacritic Rates Devs

By Alec Meer on March 28th, 2011 at 11:27 am.

Well, that told him

If it exists, it must have a number stuck to it: this is the Metacritic way. Its voracious maw of review aggregation has now expanded to include individual developers – which means actual people are now being given a personal numerical rating. This is an average number (out of 100) based on the various reviews of games they’ve worked on. I must confess I find this concept mildly sinister, but maybe I’d feel differently if I was able to go around telling people I was worth 91%.
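
Metacritic hasn’t explained exactly how it arrives at these personal numbers, but as far as I can tell it’s nothing cleverer than an unweighted average of the Metascores of every game a person happens to be credited on, rounded to the nearest whole number. A rough sketch of that sort of sum – with made-up credit lists and scores, purely to illustrate – looks something like this:

    # Back-of-the-envelope sketch: a person's rating as a plain average of the
    # Metascores of their credited games. Names and numbers are illustrative,
    # not lifted from Metacritic's actual database.
    def developer_score(game_scores):
        return round(sum(game_scores) / len(game_scores))

    wolpaw = {"Portal": 90, "Half-Life 2: Episode One": 87, "Psychonauts": 88}
    faliszek = {"Portal": 90, "Half-Life 2: Episode One": 87, "Left 4 Dead": 89}

    print(developer_score(wolpaw.values()))    # 88
    print(developer_score(faliszek.values()))  # 89

One credit different, one point different – which, as we’ll see below, is roughly the Wolpaw/Faliszek situation.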

So, who of the many alumni of PC gaming has been treated favourably by this new system? And who’s fallen foul of it? Who is empirically proven to be the better man: Warren Spector or Ken Levine? Richard Garriott or Derek Smart? John Carmack, John Romero or American McGee? And Portal’s Erik Wolpaw or Portal’s Chet Faliszek? And which poor bastard was deemed to be worth just 8%?

Here’s who I’ve looked up so far – I am, of course, expecting you to suggest further honourable gentlebeings below. This is achieved simply by typing dudes and dudettes’ names into the main search box on Metacritic.

  • Cevat Yerli (Crytek): 91
  • Bill Roper (Blizzard/Flagship/Cryptic): 90
  • Ken Levine (Irrational): 89
  • Chet Faliszek (Valve): 89
  • Erik Wolpaw (Valve/Double Fine): 88
  • Sid Meier (Firaxis/Microprose): 86
  • American McGee (id/EA/Spicy Horse): 86
  • Cliff Bleszinski (Epic): 86
  • Doug Church (Looking Glass/Ion Storm/EA/Valve): 84
  • Warren Spector (Origin/Looking Glass/Ion Storm/Junction Point): 82
  • Peter Molyneux (Bullfrog/Lionhead): 82
  • Chris Avellone (Black Isle/Obsidian): 81
  • Chris Taylor (Gas Powered Games/Cavedog): 80
  • Brian Reynolds (Microprose/Firaxis/Big Huge Games/Zynga): 79
  • John Carmack (id): 78
  • Ruslan Didenko (GSC Gameworld): 77
  • Julian Gollop (Mythos/Ubisoft): 76
  • John Romero (Origin/id/Ion Storm/Midway/Monkeystone/Gazillion/Loot Drop): 75
  • Richard Garriott (Origin/NCsoft/Portalarium): 66
  • Derek Smart (3000AD): 61
  • Oleksandr Khrutskyy (Deep Shadows): 61
  • Sergey Titov (Stellar Stone): 8

Phew! Much to comment on there (Garriott and Roper make for particularly interesting discussion points in and of themselves; the former’s recent games have soured his mighty past achievements, while the latter’s Blizzard mega-scores mask troubled times such as Hellgate and Champions), but let’s use as our launching point Valve’s Wolpaw and Faliszek.

The ex-Old Man Murray pair have worked together for most of their adult lives, but have different scores. Chet is officially one better than Erik. Will he gloat and chuckle and sneer about this every day? Will Erik fall into a deep, dark depression that results in him carving '88' into every tree trunk, car door and human face in Seattle? Who knows.

What we do know is that the disparity results from Wolpaw having worked on one game that Faliszek didn’t – Psychonauts – while Wolpaw isn’t credited as having worked on the Left 4 Dead games. The latter were generally better-reviewed than Psychonauts: hence the mystery 1%.

My fear is that the games industry might use this system as a factor when looking to recruit people or decide on pay rises. Metacritic numbers have already been known to affect the likes of retailer stock buys, publishers and studios’ public profiles (“we are/want to be a 90% Metacritic average company” is a refrain I’ve heard many times of late, especially from larger devs and publishers) and even developer bonuses. I can all too easily imagine a firm deciding it will only hire or promote staff of a certain score or higher, believing that to be the benchmark of their aptitude, and thus an indicator of how likely their work is to bring in the high scores and thus the high revenues.

“Only 88%? Not good enough, Mr Wolpaw. We’re after an 89%er at the very least.” “If you’re not worth 91% by Christmas you won’t get that pay rise you need to feed your 18 children, Meier.” And pity the guys with a 60 or 70 albatross hung around their necks. What if they’ve just not been able to be attached to a big, expensive manshoot project that attracts enough breathless, drooling reviews from shooter-hungry review sites? Cutting one’s teeth on lower-key projects is no bad thing, and frankly learning hard lessons on shitty, underfunded games is only going to increase your skills. This system doesn’t meaningfully reflect any of that, or a whole lot more besides.

For instance, let’s try Ruslan Didenko, lead designer on the last two STALKER games. He’s a 77, which by the games industry’s current perception of review scores means mediocrity. Yet we know STALKER titles are amongst the most technically and creatively ambitious, clever and atmospheric videogames of recent years; their relative inaccessibility (by mainstream standards) means they’ll never score the big numbers on some of the biggest sites/publications. Not to mention that STALKER: Clear Sky was a bit of a boo-boo (at least at launch), so it’s understandable that its numbers are a bit lower than those of its successor, Call of Pripyat. Yet Pripyat demonstrated deftly that the devs had learned their lessons and honed their craft. Does a personal rating of 77 reflect any of that accurately?

Similarly, there’s Deep Shadows’ Oleksandr Khrutskyy, one of the lead designers on Boiling Point. That might be a game of legendary comedy, but in a very real way it’s an incredible technical achievement from a tiny team with a tiny budget. 61% doesn’t exactly convey how driven Khrutskyy surely is.

Then there’s poor, poor Sergey Titov, who was producer and programmer on Big Rigs: Over the Road Racing. Apparently he is only 8% good enough. There is 92% wrong with this person. How’s that going to look on the CV? Quite clearly Metacritic scores will not be the only factor involved in hiring and salary decisions, not by a long way: my feeling is simply that they perhaps shouldn’t be involved at all. I’m not sure what other purpose this new rating has, however. Theories?

Sure, it’s entertaining, perhaps even passingly informative, to look up and compile these kinds of rankings, but I’m concerned they paint an incredibly inaccurate picture of a developer’s achievements and skills. Aggregating game review scores might be a useful touchstone, but it unavoidably loses nuance – do we really want that done to people too?


210 Comments »

  1. parm says:

    So, what happens now is that studios start salary negotiations on the basis of these bullshit scores.

    (don’t think it won’t happen, bonus payouts already depend on Metacritic scores for the games themselves)

    • Alec Meer says:

      It’s almost as if you commented before you read the story. Almost.

    • parm says:

      I read the story, but inadequately. My brain somehow eliminated the paragraphs between the two images. I am experiencing shame and shall hie myself away for basic reading and comprehension classes with haste.

    • Mike says:

      Then again, maybe this isn’t such a bad idea? It already happens in academia, where you are measured in many institutions on the basis of papers published and citations achieved, rather than the quality of the work you do.

      It’s not because people are blind to the work you’ve done, it’s because the field is too big and the people making the decisions can’t spend all year reading through work trying to decide who is better than who. So perhaps it’s unfair, but understandably unfair. The best of a bad situation?

    • parm says:

      Couple of issues with that though: academic papers are usually the product of maybe four or five people’s work (and often less), whereas games are usually the product of teams of tens of people, maybe even hundreds. Also, a single percentage score for a game tells you very little about whether, say, the programmer who worked on the particle effects, or the artist responsible for rock textures is any good at their job.

      It also totally fails to take into account the conditions under which a game was produced – for example, a friend of mine worked on some ropey cash-in GBA ports back in the day; one game in particular, they had 8 weeks to do it, had none of the original artwork or audio assets, nor any of the original code to work from, so they had to lash it up from ripped screenshots and audio recorded by playing the original game and sampling it; in the end they didn’t get the AI finished so there are massive, massive gameplay holes in it too. The game was, he will be the first to admit, unmitigated shit, and was quite rightly panned in the reviews; the fact that the team was understaffed, underresourced, working to a totally unreasonable deadline and so forth isn’t captured anywhere in the 3/10 scores it got everywhere. Fact is, he’s one of the better programmers I know, and I’d happily employ/work with him, but by his Metacritic “developer score”, you’d chuck his CV in the bin before you even looked at it.

    • Berzee says:

      Yeah But He Didn’t Read The Post.

    • Ultra Superior says:

      It is a very stupid idea. Games are rarely products of their designers. (No, really.)

      Anyone who’s been in the industry knows that the executive role on the project always belongs to the man with the cash. The stupider that man is, the more he tends to involve himself in development and ruin the game.

      (As in putting in all the cool features he’d seen in the latest Activision game and discarding the stuff that “wouldn’t appeal to the average consumer”)

    • battles_atlas says:

      Is it contractual that all Alec’s posts feature him being a bit of a cock to the first commenter?

      @ Mike
      The system used in academia is a bad idea though, so why would you want to apply it elsewhere? As you say, it doesn’t measure quality but a combination of quantity and proximity to the mainstream. It’s a shit system that limits academic freedom of enquiry. It might be acceptable if it was actually necessary, but its justification is another example of the tyranny of targets that I was under the impression was out of favour now NuLab have been given the boot. It’s all predicated on a rampant distrust of professionals, and the belief that no one is capable of doing their job successfully without half a dozen bureaucrats reifying their performance in soundbite metrics.

      Fuck the Numbers, as NWA might say if they were middle class white boys.

    • Jamison Dance says:

      @battles_atlas: I think Alec’s comment was justified, seeing as the article was published at 11:27, and the first comment was at 11:30, which just echoed something written in the article. I admire RPS’s efforts in dissuading those stupid “First!!!!111” posts, and it seems like that is what Alec was trying to do.

      And yes, parm did say some more insightful things later, but maybe those were encouraged by Alec’s response.

    • Alec Meer says:

      Only if the first commenter does something to deserve it.

    • Urthman says:

      It’s a hard habit to break.

      “The first sentence of a recent Deus Ex 2 preview on pc.ign.com: ‘There’s a tendency among the press to attribute the creation of a game to a single person,’ says Warren Spector, creator of Thief and Deus Ex.” (via OldManMurray)

  2. Jockie says:

    I’d like to think the interview process for game developers is a little bit more nuanced than typing someone’s name into Metacritic. I bet Blizzard are delighted though.

    • Dances to Podcasts says:

      According to Metacritic, Cevat Yerli is one point better than Mike Morhaime and two better than Rob Pardo. This will not do!

  3. Dan Lawrence says:

    Some brilliant developers work on terrible projects and some terrible developers coast along inside successful projects. Most developers don’t get to pick the projects they work on.

    I doubt any sane companies will base their recruitment on a metacritic score.

    • mcwill says:

      The games industry’s management echelons are not exactly known for their sanity.

    • bascule42 says:

      Apropos of nothing…I read somewhere a while ago that a high Gear Score could seriously affect employment prospects.

    • LionsPhil says:

      It needs weightings, and badly. If they’re doing a straight mean of games-worked-on then it’s treating that bit of junior developer work for a straight-to-the-bargain-bin puzzler as every bit as important as spending years as lead developer on your groundbreaking opus.

    • bob_d says:

      @ LionsPhil: Weightings wouldn’t remotely help, even if they were possible (job titles are flexible and credits aren’t terribly accurate, so usually only the team itself knows who did what, and even then there are likely differences of opinion). The whole scoring system is fundamentally meaningless. The scores are aggregates that aren’t remotely interested in the particular roles played by individuals: artists who do amazing work could have their scores dropped by providing work for games that were terrible in every other respect, for example. The publisher could insist on releasing the game unfinished, management could make some decisions about graphics and gameplay that don’t go over well with reviewers, the game could be mis-marketed and end up creating unfair expectations or causing the game to be misunderstood, etc., etc. There is no possible way that the gross Metacritic score could provide any meaningful information about the individuals in their database.

  4. Jackablade says:

    57% eh. Lucky they’ve only got a couple of my games on their list.

    • Jackablade says:

      Wait, no I should have 63%. This is an outrage. I’d have 65 if it wasn’t for Blood Drive dragging down my average like some kind of… appropriate sporting analogy.

    • skurmedel says:

      That game was so 47 on PS3, but the 360 version was 15% worse!

  5. Doesntmeananything says:

    If this somehow ends up affecting the industry, then it’s the most abominable thing ever. If not, well, then it’s just vile.

  6. Alexander Norris says:

    Looks like someone accidentally put a 9 in front of Cevat Yerli’s score, there.

    • FunkyB says:

      Joking aside, he has a high score because he’s worked on three manshoots that are boring and safe in reach, but fun in execution which seems to lead to monstrously high metacritic scores. As Alex suggests, if you want something interesting you have to accept some people will not get it. This is just another way to homogenise the industry.

      Edit: Gah, I meant, of course, Alec. However I was responding to an Alex. My brain is too feeble for such gymnastix.

    • John Walker says:

      Who is this Alex?

    • Daniel Carvalho says:

      Heck, Crysis was not boring for me.

    • Teddy Leach says:

      Any game where you can punch a man through a wall is an instant hit for me.

    • Alex Bakke says:

      Rule #26?

    • gorgol says:

      He said in reach, not in execution.

    • skalpadda says:

      “Who is this Alex?”

      Sir Alex Guinness?

    • faelnor says:

      a CEVAT YERLI score

    • westyfield says:

      Alex is Alec’s evil twin.

    • FunkyB says:

      As gorgol pointed out, I said that Crysis et. al. were safe and boring in their aims but their execution was found by many to be fun. This is not a criticism, merely an observation that they chose to focus on making a fun game rather than attempting to cut new ground. Whether or not they succeeded is irrelevant to the argument.

      My concern is that the well-trodden path seems to lead to higher metacritic scores. Perhaps this is understandable if metacritic is viewed as a polish-ometer, but it does not appear to measure originality or impact. These defy metrics.

    • DJ Phantoon says:

      No, Alec’s evil twin is Cela.

  7. The Sombrero Kid says:

    Any company who uses this system in their hiring policy will make very poor hiring decisions and will most likely fail as a result, so I’m not too worried. I’ve been credited on games I’ve had nothing to do with & not been credited on games I’ve worked very hard on; I’ve also been credited for 2 hours’ work on some games versus taking another almost solely from inception to completion without any other input. To rate ME equally for these contributions is fine from a Metacritic perspective, but from an employer perspective it’s ridiculous.

    • Multidirectional says:

      Precisely my thoughts. A studio that would make hiring decisions based on these bullshit scores is a studio that doesn’t have a clue how to spot talent in the first place. All they would produce is generic shit games, so it’s hard to care anyway.

  8. ChaosSmurf says:

    If “you must have 2 years of experience to begin working in the industry” becomes “you must have 2 years of experience and an 80 metacritic rating to begin working in the industry” you can wave goodbye to the industry.

    • bob_d says:

      It’s not (just) years of experience – the more important metric used in the industry is “shipped games.” (Yes, even for lower level positions.) Given how many co-workers I have had that hadn’t shipped a game in 10 years (despite constant industry work on multiple titles), it’s a completely meaningless measure of experience. My fear is that this, despite being recognized as being totally meaningless, will end up being used by the industry in some way as well.

    • Tacroy says:

      That’s how it works, though – the company looking to hire for a new position puts an unachievable list of requirements in the job ad*, and then when they hire you anyway they have a significant upper hand in salary negotiations (like, “oh you don’t have ten years experience in C++ (because you’re only 20), so we’d like to make your starting salary reflect that lack of experience. Don’t worry, you’ll get a raise once your skills are up to par!”)

      *unachievable either because the people who do have all those skills aren’t going to be willing to work for that salary, or because the skills are simply impossible for a mere mortal to have all at once

  9. sockeatsock says:

    Alec Meer (RPS): 100
    Jim Rossignol (RPS): 101
    John Walker (RPS): 678
    Quintin Smith (RPS): 87980
    Kieron Gillen (ex-RPS): 99999999

    • John Walker says:

      You appear to have a very weak understanding of how percentages work.

    • Daniel Carvalho says:

      Perhaps he is redefining how we think about percentages.

    • Man Raised by Puffins says:

      John should actually have a score; as I recall he did a bit of writing for the Broken Sword Director’s Cut on the DS. Brian also had a cameo in Angel of Death, so let’s count that one too.
      Via the magic of maths: (78 + 73) / 2 = 75.5
      Congratulations John, you’re 0.5%* better than John Romero!

      *well, strictly 0.6666% recurring better, but let’s not get ahead of ourselves here

    • The Tupper says:

      “You appear to have a very weak understanding of how percentages work.”

      Indeed, John. You are worth at least another thirteen thousand percent.

    • Simon Hawthorne says:

      Nah, it’s just you only understand them 100%. sockeatsock understands them 124%.

    • skurmedel says:

      He seems to understand them about as good as the games rating business frankly. For it to be a percentage it needs to be a percentage of something.

    • Sarlix says:

      Edit/ just woke up….nothing to see here move along…

    • BarneyL says:

      Perhaps Alec is the baseline and the others are scored relative to him.

    • DJ Phantoon says:

      Really, the fact here is that numbers are useless. Hopefully, this is Metacritic jumping the shark and people will stop paying attention to it now.

      And honestly, should the ramblings of one random blogger about one game hold the same weight as those of people who do this for a living?

  10. tomeoftom says:

    How damagingly stupid. How the hell are they supposed to numerically gauge the effect a single lead designer had on the overall product? This is ridiculous.

    • gorgol says:

      “How damagingly stupid”. My new favourite phrase.

    • gorgol says:

      But you know, it’s the consumers, i.e. us, that are to blame for this trend. If we were to stop using Metacritic as a method of judging what games to buy, then it would stop gaining as much importance for developers and publishers.

      An alternative to using metacritic is to read sites like this which tell you what new games have come out as they come out, giving you a good idea of what they are about, so that you can decide whether to give them a try or not.

      Spread the word! ;)

    • Urthman says:

      What kind of idiot uses MetaCritic that way? Its only real usefulness is to find a bunch of reviews to read more easily than a Google search.

    • bob_d says:

      It’s even worse than that – it isn’t just for lead designers, but for everyone who gets game credits. So the artists who did amazing work on a game will have their scores brought down by the bugs and lousy design, for example. It’s worse than useless.

    • Wilson says:

      @gorgol – I can’t see any way you could actually tell how many people use Metacritic to make their purchasing decisions, and even where people do use it there are probably a ton of other factors as well. So I don’t think you can blame consumers for Metacritic being seen as a valuable measure of worth (whoever actually thinks that anyway), since I highly doubt they could cite any figures (certainly no reliable figures) for how many people used Metacritic in purchasing decisions.

    • gorgol says:

      The fact remains that publishers view metacritic score as extremely important. They do so because it reflects sales figures. Which means that there is a direct and proven correlation between review scores and sales. Only consumers can break that trend, by educating themselves.

      Obviously the people on this site don’t place much importance on review scores, after all we come to a site that doesn’t give scores, but obviously a large part of the consumer base does. That needs to change.

  11. icupnimpn2 says:

    You can go to advanced search for people/artists, click the check box for game, and leave the search box blank… this will give you a list of everyone. But it is unfortunately not organized in any way. Thought it would be fun if they presented a true Top 20 or Bottom 20.

    Anyway, pretty horrible considering that many people may start out working on crap games simply because they’re at the bottom of the food chain.

    • Dances to Podcasts says:

      What you’d probably get is top 20: a bunch of people who happened to only work (or get credit) on that one good game; bottom 20: a bunch of people who happened to only work (or get credit) on that one bad game.

  12. frenz0rz says:

    Richard Garriott is only worth 5 more than Derek Smart? Wut?!

  13. Teddy Leach says:

    DEREK SMART. DEREK SMART. DEREK SMART.

  14. Solivagant says:

    I really doubt recruiters will use this. Project Managers, I think, are the ones that might look at Metacritic scores. But they don’t handle recruitment.

  15. CMaster says:

    So does your score include every game you are credited on, regardless of how big or small your contribution?

    Also, not long before we see games featuring lead designer Alan Smithee, with level design by Alan Smithe, Alan Smythe and Adam Smithee

    • Oozo says:

      As I have written below: It’s even more arbitrary. If the cruel gods of fate have summoned up somebody to put down your name in the GameFAQs database, you will be credited for that game. If not, well, your loss.

      Needless to say that those gods are some right lazy bastards who don’t bother to summon database minions all that often.

    • bob_d says:

      The most minor contributions are indeed being treated the same as major roles.
      I immediately thought of the Alan Smithee phenomenon when I read about this, but there would be a couple differences from the Hollywood “Alan Smithee” mechanic, though. In Hollywood it’s reserved for those roles where there’s an expectation of creative control and that person doesn’t feel that they got it. There’s no such expectation in the game industry. In Hollywood, the Gaffer never feels the need to get credited as Alan Smithee (nor is there a system that allows that to happen). Also, in the game industry, “shipped games” is an important metric of experience that determines if you can get subsequent jobs, your job position, pay, etc. Not being in the credits for a game is equivalent to not having worked for that period of time in some ways. If you worked on a great PC game, the lousy PS3 port would count against you (even if you had nothing to do with it). If there was a system that allowed developers to remove their names from the credits, they’d have to do so before it was published; a game that was poorly marketed or had undiscovered bugs would see its score drop. Not to mention that no company would actually allow the employees to use pseudonyms in the credits – that would be an admission the game was bad before it was even published. Developers with name recognition will have their names being used by the publisher to drum up publicity, so there’s less than no chance they’d get credit control.

      In other words, if this score actually gets used for anything, developers are screwed.

  16. DanPryce says:

    What I’m interested in is the fact that they’re rating the individuals rather than the studio. If the studio was rated there wouldn’t be any fuss whatsoever – Valve and Blizzard sit at the top, and all is right with the world. It’s because the rating is on the people themselves that this pisses me off. It might not necessarily have been their input that caused the bad reviews – if a developer designs a decent game and the graphic department drops the ball, should he or she be held accountable? It doesn’t make sense to rate the individuals.

    • Inglourious Badger says:

      Very true. This whole thing would make sense if it was used on the studio as a whole rather than the actual devs. As it is it’s just a weird joke.

      Be interesting to see how the likes of Ion Storm or Looking Glass would fare on a Metacritic basis. Deus Ex – Daikatana + Anachronox = 70%? = Not good enough. Might explain why some of these studios didn’t last :(

  17. Grinterloper says:

    I find numerical scores for reviews pointless (oh god, I made a pun) anyway. How can you put a definitive value on something which is intrinsically subjective and nuanced? What is the unit of good?

    “Using my scoratron-o-graph I can see that this game contains 84 ‘goods’.” It just doesn’t work – and now they’re assigning them to the developers themselves?

    How do they measure this? I assume it’s purely based on the aforementioned fabricated review scores and unit sales, which can be altered by the changing of the winds (or fat wads of advertising cash). So unless the Metacritic people are actively snooping on developers’ day-to-day work, hiding behind office plants and sneaking microphones into breakfast croissants, their scores are utterly baseless and can’t help but fail entirely to reflect the actual ability and worth of a developer. BUNKUM, I SAY, BUNKUM!

    • Jason Moyer says:

      Metacritic isn’t measuring something that’s subjective, though. It’s measuring how people feel about a game, not whether or not a game is good.

      Videogames as a whole are past the point of giving a shit if games are good, it’s about whether people like them and spend money on them. The difference between, say, 1980-era Activision and 2011-era Activision is the same as the difference between Disney while Walt was alive and Disney after he died. In one case you have a bunch of guys trying to make good products with the idea that good products will sell enough to continue making good products, in the latter case you make things for the sole purpose of selling them so you can buy yachts and BMW’s.

  18. Rinox says:

    How did Roper survive the debacle that was Hellgate London? :-/

    • Dominic White says:

      Because, contrary to INTERNET RAGEFEST 3000, Hellgate wasn’t that bad. After patches, it was pretty fun. And yet you’d never think that when you hear people talk about it. People seriously wishing physical harm on the developers, and no shortage of ‘I hope they never find work again’ sentiment.

      Once you escape the internet echo-chamber, you find out lots of interesting things. Like how DX: Invisible War was reviewed well and how most people actually liked it, even if it wasn’t as good as the original.

    • TheApologist says:

      Yeah, I was annoyed about Hellgate going away, not really about the quality of the game. Couldn’t quite work out what the rage was for.

    • ReV_VAdAUL says:

      The anger was about it not being Diablo 3.

    • Rinox says:

      Just to be clear: I never played Hellgate London and was talking about its reception. :-) (70 on metacritic on the PC)

      And with ‘survive’ I meant ‘to get such a high score on this ranking’. I don’t wish anyone’s career to be ended by any sort of bad or good game, obviously.

    • jstar says:

      Deus Ex: Invisible War was an absolute car crash of a game.

    • poop says:

      I’d like to point out at this junction that dominic white bought a lifetime subscription to hellgate

    • TillEulenspiegel says:

      “Like how DX: Invisible War was reviewed well and how most people actually liked it, even if it wasn’t as good as the original.”

      It’s not about objective quality, it’s disappointment. When you make a sequel that tosses out what many people loved about the original, they’re going to hate your guts, even if it’s a decent game in its own right. See also: DA2. That’s not an “internet echo chamber”, that’s a fairly predictable result of disappointment.

      I quite enjoyed the HGL beta, but never got around to buying the full game before it went bust, mostly because I really didn’t like their two-tiered, not-quite-an-MMO business model. It made getting around on the tube much more fun when I visited London for the first time, though.

    • Dominic White says:

      See, when I’m disappointed, I say ‘Well, that’s disappointing’, maybe sigh, then do something else. There are people who are still raging about Hellgate and DX:IW just as strongly years after the fact. That’s not disappointment, that’s… I don’t know what it is, but it isn’t healthy or sane.

    • Hallgrim says:

      @Dominic White: Ain’t hard to find crazy on the internet.

      /gibbers

    • Archonsod says:

      “Just to be clear: I never played Hellgate London and was talking about its reception. :-) (70 on metacritic on the PC)”

      Because critical reception =/= market reception.

    • Rinox says:

      Well, the discussion was about designers being judged based on their metacritic averages.

      I didn’t refrain from playing Hellgate London because of reviews or anything else. I just didn’t care about it, like I didn’t care about Diablo. I hope that’s finally clear to everyone now. I was just wondering about how Roper got an average score of 92 on metacritic with a game judged at 70 on his resume.

  19. alexdulcianu says:

    I enjoy reading this blog, so I can’t let this hopefully mistyped statement slip: “Yet we know STALKER titles are amongst the most ambitious, clever and atmospheric videogames of recent years;”.

    I am sorry, but it is a widely accepted fact that the STALKER series ARE the most ambitious, clever and atmospheric videogames of ALL TIME.

    Meh, I’m just joking around here (I’m not), but really, no one has ever come as close to making a perfect open-world FPS as GSC did with STALKER. Not to mention they’re probably the only developers that kept improving their series over time and listening to the demands of their fans.

  20. Giant, fussy whingebag says:

    This reminds me of the much more interesting thing where someone used Metacritic data to show the stats of individual reviewers. (Looking for the link now – will edit in when I find it)

    • Giant, fussy whingebag says:

      I’m starting to think I imagined it….

    • Xocrates says:

      Metacritic does the stats of individual reviewers. Just click on the name of the reviewer and you get a bunch of data, including whether his reviews are usually higher or lower than the average.

      Maybe it was that you were thinking about?

    • Giant, fussy whingebag says:

      No, the thing that lives in my head was something else.

      (Apparently in my imagination) someone had done some data mining on Metacritic and written an article about reviewers and their scoring statistics, covering them in more depth than metacritic does. Also, I think it was about specific individuals rather than publications as a whole.

      Of course, since I can’t find a link, this all just seems like so much nonsense.

    • RagingLion says:

      You’re not imagining it, I remember that too. It was specific to one of the gaming sites – PC Gamer or Eurogamer, I’m thinking. I remember RPS reporting on it, I think.

    • Lambchops says:

      Yeah, it was some data mining of Eurogamer which showed the average score reviewers tended to give games (I think it was then compared to the metacritic score to show how far they deviated from consensus).

      I’ve got a bookmark to the blog on my home laptop (there were quite a few interesting articles but it has been quiet lately); can’t find it right now as calling it “The Player” makes it un-google-able.

    • Giant, fussy whingebag says:

      Ah ok, so I was thinking it was Metacritic, but it was actually just Eurogamer. I’m glad some people knew what I was talking about, even if I didn’t!

    • Lambchops says:

      As promised here’s some linkage. Actually, turns out you were right (there was a Eurogamer piece but it wasn’t related to Metacritic).

      Here’s the Metacritic one: http://www.theplayer.sekritforum.com/?p=51

      And here’s the Eurogamer one, which was actually about commenters disputing reviews with good old tiresome “reads like a [insert score here]” type comments: http://www.theplayer.sekritforum.com/?p=38

      The Eurogamer one I was initially thinking of was linked to elsewhere in the above blog and was this year’s worth of analysis (which does compare to Metacritic) – http://eurogamer2009.heroku.com/

  21. President Weasel says:

    Bill Roper most recently oversaw Champions Online as Executive Producer. The game’s mediocrity seems fairly reflected by its Metacritic score of 72: it’s not bad, it’s just not very good. Great character editor, lack of content, a bit of a grind-fest leading to not very much of an endgame, and a pernicious “sell the disk, and the subscriptions, AND the micropayments” business model.
    Seems like rather an indictment of Metacritic’s “stick scores on people” system that he ends up at the top of that list through being attached to some spectacularly good and well-reviewed group efforts from Blizzard, despite the game he actually helmed being rushed and unfinished, while people who have been in charge of some much better or at least more ambitious games have lower numbers semi-arbitrarily attached to them (yes, I know this is what you are saying with the article).

    Perhaps metacritic could develop an algorithm that took into account the number of people on a given project, or the level of control a given person exerted (less weight given to being spokesgoatee for Starcraft than executive goatee of Champions, perhaps?) or a simple “add 15% for artiness” metric.
    Or just not hang arbitrary numerical scores on people.
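
    Something like this, perhaps – weights and scores entirely invented here, just to show the shape of the thing:

        # Sketch of a role-weighted average instead of a straight mean.
        # The weights and scores are made up purely for illustration.
        ROLE_WEIGHTS = {"lead": 1.0, "senior": 0.6, "junior": 0.25, "special thanks": 0.05}

        def weighted_score(credits):
            # credits: list of (metascore, role) pairs for one person
            total = sum(score * ROLE_WEIGHTS[role] for score, role in credits)
            weight = sum(ROLE_WEIGHTS[role] for _, role in credits)
            return round(total / weight)

        # Hypothetical career: lead on a 90-rated game, plus a junior credit on a 55.
        print(weighted_score([(90, "lead"), (55, "junior")]))  # 83, rather than the straight mean of 72.5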

    • Malibu Stacey says:

      Since when does 72/100 = mediocre? Surely it’s around 22 points too high.

      This is the problem with sticking arbitrary numbers on things like games reviews. Very few people actually understand the scale they’re using (notable exception being EDGE magazine where an average game gets a 5 not 79%).

    • President Weasel says:

      Since Metacritic includes scores in its calculations from publications that mark from 7-10, a low-70s average is pretty mediocre.
      World of Warcraft and expansions – low 90s.
      Warhammer Online – 86
      Lord of the Rings Online – 86
      City of Heroes – 85
      Dungeons and Dragons Online – 74
      Champions Online – 72.

      Now I’d be the first person to agree that in a sane world, a mediocre game wouldn’t get an average rating in the low 70s. This isn’t that world though: in this world, a low 70s score tends to mean a mediocre game, a game with great ambitions that didn’t quite realise them, or a game the reviewers didn’t really understand.

  22. Oozo says:

    And let’s not forget the fact that they base their knowledge of who’s worked on what project on GameFAQs. Which is another horrible decision, since that database is FAR from complete. In other words: plain stupid.

    If it really does have a profound impact on the industry, then any enterprises who base their recruitment strategies on it deserve to go down, I guess.

  23. Daniel Carvalho says:

    John Carmack 78?! You gotta be kidding me. And judging by most of the developers above him, there was clearly very little maths or game industry knowledge involved in Metacritic’s “calculations”.

    John Carmack should be much higher, if not at the top for his contributions to the gaming industry, and this list is retarded. But I guess, that is stating the obvious.

    • Stijn says:

      John Carmack is probably mentioned in the “special thanks” for about every game using an id Tech engine, without him necessarily having done anything on the game itself…

    • Daniel Carvalho says:

      Well, he helped formalize certain technologies such as BSP, which is, like, a foundation block for most 3D engines this century, whether it’s his engines or not. Big deal. That’s just one thing, there’s more.

  24. Gassalasca says:

    The Prisoner reference!

  25. Jorum says:

    “Games are art”
    If we want that to be the case maybe it is a good idea not to stick numerical values on everything and everyone.
    I wonder what Kandinsky or Van Gogh’s metacritic art score would have been while they were working?

    Art involves vision, not just technical craftsmanship.
    As Alec points out, Boiling Point was a nightmare, but at least they aimed for something bold.

    If this takes off then it will only push the games industry further down the Hollywood road, where safe, risk-free, dependable, bankable crap is the focus.

    • Dances to Podcasts says:

      Van Gogh is known for not selling very much at all during his life, so he likely wouldn’t even get a Metacritic rating for being such an unknown. You know, just like how many good games don’t appear on Metacritic simply because they’re not ‘big’ enough.

    • Jason Moyer says:

      Bach wouldn’t have a metacritic score until 200 years after his death, but it would be 100s across the board.

  26. Jolly Teaparty says:

    I can’t help but knee-jerk a bit here; so what? Anyone who’s been to school, let alone on to higher education, is used to having their worth as a person presented to them in some kind of standardised grade, and rarely is the system used not hotly debated on an annual basis. It’s not like their metacritic scores are accredited by any institution. I think any use of them in a selection process would reflect worse on the employer than the developer.

  27. jstar says:

    Well I think this is a brilliant idea. I keep having arguments with people about who is better, Warren Spector or Ken Levine. And not just who is better but how much better. Finally we can put that age old question to rest because clearly not only is Levine better he’s a whole 7 better. So fuck you Spector fanbois. Because that’s not just any old 7 better it’s 7 in the 80s which means more than 7 in the 70s so actually it’s probably more like 14.

  28. mcwill says:

    Once again the quietly publisher-centric viewpoint of Metacritic is demonstrated. The quality of a final production as reviewed by critics has virtually no relationship with the quality of the people working on that production, and people working in the industry rarely get to choose the projects they work on, let alone affect most creative decisions.

    This is just another stick to beat developers with, nothing more.

  29. Mitza says:

    It’s a great idea, imho, but it’s implemented badly. They seem to take into account all the games a person has been credited on, which makes it quite useless (for example, Tom Hall is also rated by the voice-work he’s done on some games :)). It would be great if it would take into account the role of that person within the game team, his responsibilities and so on. Otherwise, it’s just numbers that don’t carry any weight.

    • jstar says:

      It’s only a good idea in the same way that sharing AIDS is a good idea.

    • ReV_VAdAUL says:

      The more people who have AIDS the more motivation / funding there will be to find a cure!

      Oh but all the research will be done by people who post on the New Scientist forums.

  30. Rond says:

    Who the hell is this mister Yerli to score higher than both Carmack and Spector?

  31. TXinTXe says:

    I think we should start to campaign against game reviews’ scores. And by “we” I mean “you”, of course.
    Seriously though, I think it’s a far more important matter than the different release timeframes, and I know that here at RPS you don’t do that, but I think that no one should do it.

    • Jorum says:

      The games industry is the only one whose vast majority of critics and reviewers think a percentage score is a) appropriate b) means anything.

      Book reviews don’t go to the absurdity of stating that Harry Potter is 4% better than Catcher in the Rye.

      I’m not sure how it started (it’s been around for as long as I can remember, and I started with Ataris and C64s), but it’s a deeply stupid idea that diminishes our medium and frankly makes us look stupid.
      The quicker we dump it the better.

    • skurmedel says:

      And for some reason uses a scale from 0 to 10 or 0 to 100 and sets the median at 7… Why not use a logarithmic scale then? Or radians?

    • stahlwerk says:

      I think it became journalistic practice because early game reviews were but a few columns in general home computer review magazines in the early 80s, which were but a few columns in office appliance review magazines in the 70s, which were read by and thus written for nerdy bureaucrats deeply in love with numbers.

    • Oozo says:

      As a certain Mr Wright put it so eloquently over at Gamasutra:

      “Again, it boggles the mind: of all the “art” criticism in popular culture, the reviews with the worst quality and most screwed up metrics somehow have the most power and influence over their medium.”

      Even though I’m not sure that I would fully agree with the “worst quality” thing, he certainly has a point there.

    • Thants says:

      “The games industry is the only one whose vast majority of critics and reviewers think a percentage score is a) appropriate b) means anything.”

      Well, the film industry definitely does as well. I mean, they tend to use stars rather than percentage but it’s the same thing.

  32. Turbobutts says:

    Okay, can somebody please explain to me why Cliffy “Blame it on anyone but us if our shitty console shooters don’t sell” B has a higher score than walking gods such as Peter Molyneux, Doug Church and John Carmack?

    • FreakyZoid says:

      Because what Metacritic have done is create a system that gives you no actual useful information at all about an individual developer.

    • President Weasel says:

      Yes, it’s because the metacritic algorithm assigns a score based on the averaged review scores of the games gamefaqs says the people worked on.

    • Rond says:

      And somehow Wolfenstein 3D is rated 57. They’re a bunch of madmen over there.

    • Urthman says:

      No one makes me miss Old Man Murray more than CliffyB.

  33. popeguilty says:

    Wait, wait, American McGee outranks Warren Spector? Not to shit on his work at id – his art is great – but he hasn’t done anything worth playing since, well, he left id. How on earth does he outrank the creator of Deus Ex, Ultima Underworld, Thief, and System Shock?

    • diebroken says:

      Apart from American McGee’s work on the Alice series (and not including Bad Day LA!), I can see your point.

    • faelnor says:

      Thank god, Warren Spector is neither the creator of Ultima Underworld nor of Thief or System Shock.

    • BooleanBob says:

      The point stands, though. Wasn’t Bad Day LA universally panned?

      More generally, given that I’m sure Metacritic have admitted to being selective – as opposed to comprehensive – in their incorporation of review sources, the entire premise of their site, be it in ‘metricising’ individual games or more collectively their producers, always struck me as being fundamentally flawed.

    • Frank says:

      Bah, they gave McGee credit for his level-design on Doom and Quake, but somehow forgot to count American McGee Presents Bad Day LA (Metascore 28) and American McGee’s Crooked House (63).

  34. FreakyZoid says:

    I’m on 93%. According to Metacritic I am a better developer than all of those listed above. This is clearly nonsense.

  35. Robert says:

    As usual with them numbers and statistics.. they are only as good as the person using them. (and knowing their worth and caveats)

  36. terry says:

    Jeepers, the sooner everyone stops ascribing any sort of significance to Metacritic scores the better.

  37. ReV_VAdAUL says:

    This will be exceptionally harmful to game reviewers. Oh, it’s directly very bad for developers, no doubt, but if this catches on then those developers will be less willing to associate with or talk to any kind of objective reviewer. If your livelihood depends on good reviews it’s only sensible to cultivate relationships with biddable yes men in the media.

  38. AdamK117 says:

    Completely ridiculous. Metacritic generally seems to give high ratings to games that reviewers love. While this isn’t a bad thing (they’re professional reviewers, after all), they do tend to give numbers between 70-100 frequently regardless of how good a game is, and additionally tend to score mainstream games really highly (because PC Gamer doesn’t want to be seen as the only reviewer slamming Dragon Age 2 now, do they?).

    Also, Gabe Newell: No metacritic :<

  39. Navagon says:

    It will be a factor in salaries simply because games companies want games that sell well and bolster their reputation for quality titles which generate repeat customers, sequels and all that bullshit. They’re not going to pay much for the guy who brings home a 65 on this metascore report card. That’s not going on the fridge. No way.

    I know it’s unfortunate that metascores largely based on blatantly paid off reviews can make or break the people not involved in paying reviewers off, but that’s the way it is. Look at what happened to Alpha Protocol.

  40. Deano2099 says:

    Yes, for it to be any use at all there needs to be some sort of weighting system. Especially as it’s one thing doing this now, where a lot of the big names got into decent development positions pretty quickly as the industry grew around them, but in ten years’ time it’ll be ridiculous. Latest hot-property developer Mr X who has been lead designer on three brilliant 90%+ titles in a row gets an average of 60% because of the five years he spent as a QA tester on hidden object games.

    Still, it’s fun, but a huge amount of stuff is missing too.

    Dave Gaider is on 93 which beats everyone else so far. But that’s purely on Neverwinter Nights and BG2. He’s not even credited for the Dragon Age games.

    Tim Schafer: 86
    Ron Gilbert: 82 (who is also rated 38 for movies, and 27 for TV, if you were still wondering how ridiculous this is)
    Mike Stemmle: 79 (again, no mention of the Lucas Arts stuff)
    Jane Jensen: 80 (although only Gabriel Knight 3 is listed, not GK1 or 2, KQ6, or any of her casual stuff at Oberon. Or Gray Matter)

  41. skyturnedred says:

    Feargus Urquhart – 84

    With all the Black Isle games and NWN2: Mask of the Betrayer in his list of games, I think he should be rated at least 92.

    • Jason Moyer says:

      He and Chris Avellone would rate damn near 100 in my book.

  42. President Weasel says:

    I can’t tell if the people posting “such and such a person should have a higher score” are being knowing and ironic or not, and that bothers me.

    • John P says:

      Likewise. The issue is not who should be higher than who, it’s that this kind of ranking shouldn’t exist at all. It’s a bit sickening really. And absolutely absurd.

    • Mad Hamish says:

      I think they are the “this review reads like a 7” people that post on Eurogamer.

    • Nick says:

      ugh, hate those people.

  43. tlarn says:

    I put as much stock in aggregate scores as I do in horoscopes; interesting to look at, but I don’t put much value in them at all. I look at individual reviews after seeing those Metacritic scores; just seems silly to me to take these numbers at face value without looking at the reviews themselves.

    Still a scary thought that people could, can, and will take these numbers on developers seriously.

  44. Jac says:

    Numbers are dumbers.

    • ReV_VAdAUL says:

      And with that I put on my shades and strutted out of maths class for the final time.

  45. wisnoskij says:

    I don’t really see the point of this.

  46. Namos says:

    So when is Metacritic starting its selective breeding program, again?

    Ridiculous.

  47. Lobotomist says:

    You know it’s wrong when Bill Roper ends up in the number 2 spot.

  48. Urael says:

    Kill Metacritic with fire! This is bloody ridiculous, but it could work in our favour. Hopefully, by assigning numbers to people, other people will then start to understand how bloody daft it is assigning numbers to games. Or am I being too optimistic on this one?

    “(“we are/want to be a 90% Metacritic average company” is a refrain I’ve heard many times of late, especially from larger devs and publishers)”

    I just threw up a little. Yeeeuch, What a horrible thing to aspire to.

    • Thants says:

      I choose to believe that this is the result of someone who works at Metacritic realizing the fundamental uselessness of the concept and pushing it to this ridiculous extreme in an effort to bring down the system from inside.

  49. bill says:

    Hey it works for movie directors and actors too!
    So now I know that James Cameron is 8% better than Ridley Scott. And Mark Hamill is at least 25% better than Hayden Christensen.

    Weirdly, I was hoping for lots of crazy examples of why this would be all wrong… but almost all the people I looked up seemed to be about right.

    • bob_d says:

      Looking up a friend of mine (who despite his dozen or so game titles, doesn’t even appear in their database), I accidentally found some actor whose score was brought down to 20-something because he had two very minor speaking parts (where his character was not even named) in really terrible movies. It’s just one of the more obvious problems with a system like this.

  50. Soon says:

    I prefer the gladiatorial system.