Thanks For Screwing The PC Over, Genuinely

By Alec Meer on November 29th, 2010 at 6:06 pm.

I had a chat (registration required, and yes I’ve heard every argument against that) with Markus ‘Notch’ Persson for my day job recently, and was rather taken with one particular observation he made about the current state of PC gaming. To crudely paraphrase: the big publishers pissing off to console because they thought the PC wasn’t as lucrative a platform as they’d like actually turned out to be a good thing.

With all the sound and fury of big, PC-specific, graphically intensive games gone, there was space for something new – something better, I’d argue – to come through. Leading on from that, I’d like to thank the graphics card companies for making such a right royal mess of the PC. I’m not being sarcastic. They did us a favour.

Here’s the quote I’m on about:

The games industry started moving away from PC and into console a lot. While there are a few hardcore PC studios around, most of it seems to be focused on the console versions. They only really port the PC versions. The indie market really could blossom because people started realising that we’re actually doing interesting ideas in the indie games. Something like in the early 90s, games that were made by id Software or Epic – small developer teams who actually took chances because they didn’t have huge projects. So the indie scene could blossom; there are a lot of indie games on console too and they’re selling really well as well. But I think it’s one result of the sort of abandonment of PC gaming.

The other thing leading on from that, which we touched on very slightly in the interview, was that perhaps the ‘abandonment’ was caused by gamers fleeing to consoles to escape the tyranny of expensive graphics card upgrades, which is a fascinating idea: the vainglorious pursuit of ever more power and speed led to a perhaps inevitable downfall. I’ve blogged (in my biz voice) a little more on that here, but I’d like to be a little more emotive with you trusted souls.

Admittedly, my experience of 2001-2007 was coloured by working on PC Format and thus being required to keep up with the pixel-pushing Joneses, but the need for new graphics cards seemed constant. A steady flow of graphically intensive, milestone titles gave NVIDIA and ATI the cue to keep on pumping out ever-faster cards – which in turn led to a weird sort of terror amongst PC gamers both existent and potential. Every year or so, we’d need to splash out again, usually to the tune of hundreds of pounds/dollars. It was exhausting. Worse, it was bewildering.

The furiously complicated naming conventions the graphics card companies used/use are, as far as I’m concerned, unforgivable, as is the sheer range of different boards available at any one time. Numbers, letters, ‘ultras’ and ‘pros’. Once I kept up, doggedly. Now I don’t care enough to try to. Not only did that overblown marketing vampirism make selecting the best card at a given price point incredibly hard work for casual upgraders, it meant sub-par boards were put out with names incredibly similar to decent ones. Just a GS or GT at the end could make all the difference, and you could too easily end up buying an absolute lemon by mistake. What was the purpose? To trap us into endless upgrade cycles through fear and confusion alone? Or was the PC simply such an unregulated Wild West (something that otherwise serves the platform well) that the graphics card companies didn’t really know what they were doing either?

Now, it’s all change. Gaming-capable PCs are much cheaper, and few new games require a high-end machine – either because they’re ports of console titles or they’re lo-fi, marvellously interesting indie fare. While I miss being able to decisively say that the PC is the most technologically progressive platform, I wouldn’t swap this age for that one. RPS circa 2004 would probably have been wall-to-wall FPSes. Tedious. Now, with the big publishers off pouring everything into their Call of War Honor on console and the graphics tech companies seeing a dramatic decline in card sales because there’s not much that demands an expensive upgrade, PC gaming is wall-to-wall wondrous unpredictability, left to free-thinkers unbound by technological and budgetary restrictions – which means I can decisively say that the PC is the most creatively interesting platform.

I haven’t upgraded my graphics card for going on three years. I don’t expect to have to any time soon, unless I’m planning on picking up a ludicrously big, high-res monitor. I don’t see another Crysis on the horizon*. And I’m glad.

* By which I mean a technological landmark of a title, which both fuels and is fuelled by a massed rush for system upgrades. So don’t give me that ‘wot about Crysis 2 hurr’ claptrap, bucko.

__________________


258 Comments

  1. Cunzy1 1 says:

    Yes. This is why I rarely play games on a PC. On my downtime I don’t want to buy a product and then only potentially have a chance to play it without upgrading this or that, or messing around with deleting files or downloading bits of software.

    Console on, Game in, Game on.

    Well, until the Xbox 360 and the PS3 came along, which seem to want to emulate the PC experience by downloading this ’n’ that whilst a non-real-time loading bar sort of moves along. HOW HARD IS IT TO MAKE REAL TIME LOADING BARS?

    • Rich says:

      If you could attach a console to a proper monitor, which runs at Stupid x Big resolution, sit it at a desk and use a mouse and keyboard to play it, then maybe I’d agree with you. Not that I normally have to do much to get a game that’s already installed going, besides turn it on, double click the icon, wait for it to load, play. I don’t even have to put anything in, since most games these days don’t actually need disks.

      Also, unlike the black boxes that consoles basically are, I rather like that I can generally solve any problems my PC throws at me. Consoles have fewer problems to solve, generally, but when you do get one the thing is going in the bin, unless you want to spend silly money, or whatever mega-corp built the thing has eventually admitted that it was their fault anyway.

    • Gap Gen says:

      I think the 360 and PS3 are the reason graphics haven’t really changed much in 3 years. If all the big-budget games are coming out on console, and consoles haven’t changed in 5 years, there’s no point in blowing your console love-in by making a version of your game that looks ten times nicer on the PC. Hence there’s no demand for graphics cards any better than the top end 5 years ago.

    • Rich says:

      “consoles haven’t changed in 5 years”
      Indeed, Microsoft have reportedly said that the Xbox 360 is only in the middle of its life, and will continue to be their main platform until 2015. That’s a long time for hardware.

    • Diziet says:

      You can attach a console to a proper monitor. In fact my PC runs quite happily through my Philips TV, and my consoles also run quite happily through my Acer monitor. Both setups are used depending on what I am doing with each device. The wonders of HDMI/DVI.
      I find games a pleasure on both systems and find most of the PC/console argument quite sad. With the exception of FPS games, which are better with a mouse and keyboard; I rarely enjoy them these days though.

    • Rich says:

      Yeah, I appreciate that you can. I just need a better TV… or a bigger living room in which to fit a small desk and fancy monitor.

      An Xbox 360 really is just a (fairly flimsy) PC anyway. I actually recognise all the bits inside.
      I have no idea how the architecture in a PS3 works though.

    • TotalBiscuit says:

      “If you could attach a console to a proper monitor, which runs at Stupid x Big resolution, sit it at a desk and use a mouse and keyboard to play it, then maybe I’d agree with you.”

      I can, I own a PS3. All 3 of my consoles run through my 30″ Dell 3008WFP.

      ITT: People that don’t know what they’re talking about.

    • Danarchist says:

      The only thing keeping me from buying a console is quite frankly the lack of mouse/keyboard. My thumbs lack the agility of my other digits, and the guy at the top of this stack of comments had it right. I hate how big and “pixelly” everything looks on a TV from a console, whereas hooking my laptop up via HDMI and using my wireless keyboard and mouse produces 80″ of pure eye candy.
      But if someone came out with a mouse and keyboard that works on the 360 and all games past and present, I would go buy one tomorrow just to play some of the console-only games I missed out on. It doesn’t seem like THAT technically difficult a thing to do; I always wonder why no one has.

      There is no way in hell you can convince me that aiming a targeting reticule with your thumb is as precise/easy/pleasurable as it is with my Razer mouse.

    • Rich says:

      Well excuuuuuuuuuuse me, TotalBiscuit. What about the mouse and keyboard bit?

      I primarily play real-time and turn-based strategies, and FPSs. The first two just don’t work without a keyboard, and I’ll never get used to playing the last one with a controller.

    • UW says:

      TotalBiscuit: Oh so your consoles maximize the potential of said monitor with an output at a 2560×1600 resolution? I see, I see. That’s surprising, I didn’t think that was possible.

    • Kryopsis says:

      The PS3 has integrated support for standard USB mouse and keyboard, as well as most generic USB peripherals (with the exception of the Xbox 360 controller, of course).

    • MaXimillion says:

      TotalBiscuit: Your console is not capable of running at Stupid*Big resolution

    • DrGonzo says:

      Ah! Beat me to it. I was going to say that too, you can play with a mouse and keyboard on PS3.

    • dadioflex says:

      But the only PS3 game that supports m+k is UT3, AFAIK. Which you really want to play on the PC for the mods.

    • Clovis says:

      I don’t think many games actually support the mouse on the PS3, and for FPSs that’s generally preferable. The vast majority of console users are perfectly happy (auto-)aiming with their controllers, and they wouldn’t be too happy if someone else was sniping them with a good mouse. One of the few advantages of the consoles is that they at least level the playing field for multiplayer.

      I have a PS3, but, wow, those games are bloody expensive.

    • Radiant says:

      “Your PS3 is not capable of playing at stupid big resolutions”

      To be fair, neither can most PCs or most games.

      OK, they /can/, but unless you love 20fps slideshows or only use Google Docs, then they can’t.

      I stick a DVD in a console and I know it will look exactly like the screenshots and videos that sold me the game in the first place.
      There’s no getting away from that.

      Whether the game is /actually/ any good though is anyone’s guess.

    • Nick says:

      “I stick in a dvd in a console and I know it will look exactly like the screenshots and videos that sold me the game in the first place.”

      My, how shallow.

    • LyskTrevise says:

      http://www.amazon.com/Xfps-360-Keyboard-Mouse-Adapter-Xbox/dp/B0010ZH3V8

      XFPS: mouse and keyboard for the Xbox 360. Attach the Xbox to a monitor, get 1080p resolution for a cheaper price than a TV, and have mouse and keyboard precision.

      Enjoy.

    • jer says:

      I dunno… I haven’t upgraded my PC in 2 or 3 years and I can still pretty much play anything I install at max or near-max settings. The only one I had to “bump down” from 1920×1080 to 1280×720 was Crysis, and that was only if I wanted all the fancy stuff turned on, except for FSAA.

      Honestly, my PC is no powerhouse by today’s standards (dual 3.6, 3GB RAM, nV 9800GTX+) and it cost me something like $900 to put together. I can still play the new games that come out and they look excellent on the HDTV. Yes, I recently caved and got an Xbox as well for some console exclusives and so I could finally play online against console-only friends, but the games on my PC still look better. Considering that my machine is basically a modest video card and wireless kb/mouse away from a mainstream PC these days, I would argue that it costs less to game this way than it does on Xbox.

      Either way, to each their own. I’m glad to see that other people find the lack of big-name enthusiasm to have a positive side as well.

    • nessin says:

      Isn’t the real heart of the problem that:

      A) When someone says Game A looks awesome on a console but crappy on a PC (or compares Game A to Game B), they’re really saying Game A looks awesome at *x480 or *x720 (console) and crappy at *x1080 or *x1200 (PC). The Xbox 360 and PS3 don’t look nicer than (or equivalent to) a PC because they’re more advanced or longer-lasting; they run at a much reduced capacity compared to your average current gaming rig.

      B) A lot of the upgrade pressure is overblown by advertising and deliberately vague statements. The race to have the top video card for the latest game has really only taken off in the past five years, and was only an issue of note for a few years before that.

    • Dawngreeter says:

      “I stick in a dvd in a console and I know it will look exactly like the screenshots and videos that sold me the game in the first place.”

      That’s odd. I usually expect a lot better than a YouTube-grade trailer or 800×600 screenshots. But maybe that’s just me, snobbishly expecting to actually use a 1920×1280 monitor for displaying 1920×1280 pictures.

      Not to mention using my computer as if it were, you know. A computer. And mine.

    • opel says:

      @LyskTrevise
      Oh, goodie. A mouse+keyboard adaptor that costs more than a mouse+keyboard.

    • el Chi says:

      “The race to have the top video card for the latest game has really only taken off in the past five years, and was only an issue of note for a few years before that.”
      Not at all. I’d say graphics cards started to become a really big issue when nVidia started to take on (and eventually vanquish) 3Dfx, in the late ’90s/early ’00s. Quake II and especially Unreal were notable battlegrounds.
      If anything, the importance of hyper-beefy graphics cards has begun to slow over the last few years.

    • Fiatil says:

      Hell, I bought my laptop two years ago for $1100 and it can still run most everything out there at max settings. Hook it up to my 40″ TV and I can play New Vegas or any other multiplatform game at higher settings and (most of the time) a higher resolution than my PS3 can manage for the same thing. If I’d bought a desktop instead (college, woo!) it wouldn’t even be close.

    • battles_atlas says:

      Disappointed that none of you have returned to the branding issue. As Alec says, PC hardware buying is made fantastically more complicated than it needs to be by wilfully confusing naming systems. Rather than a problem that’s going away, Intel appears to be embracing this ‘strategy’ with its insane i3/i5/i7 dual/quad 1156/1366 bullshit. Nvidia are of course veterans of this game. Even ATI have been at it recently with their dumb 6870 which *obviously* is the successor to the 5850.

      It’s a tragedy-of-the-commons type situation.

  2. man-eater chimp says:

    It is very nice being able to play all manner of games on this nice newish laptop, including games such as TF2 and L4D2 which, while not on the highest graphics settings, still give me a more than adequate gaming experience. And I can also play all these fantastic new indie games!

    On the other hand, I do like watching my brother playing TF2 on his huge iMac (yes, I know) on full graphics settings, sometimes with programs running in the background, and no slowdown whatsoever.

    In all honesty though, I’d be content with a computer that could run Foot-to-ball manager and nothing else.

    • DrGonzo says:

      I tried to play TF2 on my parents’ iMac last time I was visiting home. But Steam crashed almost constantly, the game was an insanely big download for some reason, and then it ran like shit, had artifacts and crashed a lot. I can’t say I was impressed.

    • outoffeelinsobad says:

      I also have a nice, not-so-newish laptop, on which I only expected to be able to play maybe Risen. Turns out, it runs Crysis at max. Thanks, console devs! And thanks, ASUS!

  3. Navagon says:

    This is more or less what I’ve been saying for a long time. But also factoring in how big publishers and developers alike have lost touch with their audience.

    Whereas indies often talk directly to their customers and take into account their views and opinions, developers are increasingly designing by committee and publishers are lacing their games with DRM. Not to stop piracy. But rather to control how their paying customers may use their purchases and also to please their shareholders.

  4. Alegis says:

    I haven’t upgraded my desktop in centuries.

    • Rich says:

      I can still run Sword of the Stars on my Babbage’s Decision Engine. Why would I want to upgrade?

    • sebmojo says:

      Pish! The mipmapping on my Assemblage of Rocks and Strings is UNPARALLELED! Master of Orion 2 looks like a DREAM GIVEN FORM!

    • wryterra says:

      It looks like Babylon 5? Time to upgrade the Amiga, man.

    • Spacewalk says:

      Upgrade? You can do that with a PC?

    • Anonymousity says:

      I bought my PC at the right time, just before Crysis (without knowing that Crysis was coming out). I’d previously been on a break from PC gaming because the upgrado-fest had bored me to death. Haven’t had to upgrade since I got it.

  5. KBKarma says:

    I’d agree with this. However, my desktop machine has a very low-end graphics card (namely, an onboard one without hardware T&L), which limits the games I can play on it. I’m planning on building my own rig, and am planning on having a nice big card in there so I can play all the big games I missed.

    It’s odd, now I come to think about it: all the really REALLY graphically-intensive games I enjoy are a year or more old (the last one was Borderlands). With a nice processor and graphics card, I can play older games that I enjoyed or missed, as well as all the newer, less power-hungry ones.

    Well, I thought it was odd. I still agree, and, in fact, will be following the same route you have: stick in a nice big graphics card, and not upgrade for years.

    • Jason Moyer says:

      No Hardware T&L? Has anyone even released a GPU without that in the past decade?

    • KBKarma says:

      Yes: Intel. ;_;

      Having perused this, I’m beginning to think I don’t even HAVE a graphics card, and all the graphical work is being done by my CPU. If so, that explains the problem rather succinctly.

    • Optimaximal says:

      That was a link to motherboard chipsets, most of which have GMAs in them.

    • Tacroy says:

      Just FYI, even a ridiculously cheap graphics card (think $30 – $50) will probably do wonders for your desktop nowadays. They’re relatively easy to install, too! Just make sure you get the right kind – if your computer is old enough (> 5-6 years, I believe), it will have an AGP slot for the graphics card, otherwise it’s PCI-E.

      Also, upgrade your RAM while you’ve got your case open. There is only one rule about RAM: if you have less than 4 GB installed, you need more. No exceptions.

      Since you have what sounds like a store-bought computer, you can crack it open, find some serial number (usually it’s relatively large and in white) and google that; this will generally get you some specs on the motherboard, which will give you all sorts of useful information like what kind of RAM it takes and what kind of card slots it has.

    • Lilliput King says:

      Given that you don’t even have hardware T&L, it’s pretty incredible you actually managed to play Borderlands.

    • Mr.GRIM says:

      No, you don’t need more RAM if you have 4 gigs. I have 3 gigs of RAM running Windows Vista, and I run everything at max settings without any kind of problem at all.

  6. Ybfelix says:

    I think the production cost of games has hit some kind of ceiling too?

    • Dave says:

      I’m not sure about that. BioWare Austin is busting through that ceiling with gusto.

      I think we also have Microsoft to thank, for completely screwing up DirectX 10 by making it Vista-only at a time when nobody wanted Vista. Brand new triple-A titles are still targeted for DX9.

    • Nathan says:

      But so are games targeted at consoles, given how the 360 is (as far as I remember?) also on DX9.

    • Benny says:

      Yup, the 360 is DX9 with a few shiny bits added on which made their way into DX10. And yes, making it Vista-exclusive was a bit silly, when no gamer wanted a gig of OS overhead on their PC.

  7. Bungle says:

    I have been slowly coming to this same conclusion myself. While it’s a bit boring to be stuck in this gaming nether-period, I know I’ll be on the ground floor when the next renaissance begins. Although, like many gamers, I only care about multiplayer from here on out.

    • Babs says:

      I, like many players, care very little about multiplayer. Thankfully the wonderful indies cater to us both. :)

    • Casimir's Blake says:

      Actually I’d argue that even though there are many single-player-centric indie games, few of them provide a compelling experience. Interesting character development is a rarity.

    • Babs says:

      And I would say that character development, or in fact any worthwhile story at all, is vanishingly rare in videogames as a whole. Independent games simply mirror the wider trend, and in fact the best story I can think of recently is from Amnesia.

      But then, I play games primarily for their mechanics. If I want a deep, meaningful story I’ll read a book or watch a film; it’s just not where I see the medium’s strength lying. Though I acknowledge that I’m perhaps in the minority there.

    • Grandstone says:

      @Babs

      You may be in the minority, but I maintain it’s the correct one. What are the stories of chess and go? Of Tetris? Does Counter-Strike really have a story? Is it poorer for not having one? Is it worse to have a bad story than no story at all?

      I’d say the only thing that matters is mechanics if it didn’t imply that I would be fine with games about mobile obelisks shooting hope at flying bananas, but that would be ridiculous, so I suppose I’ll say that a game should have the minimum dressing necessary to justify its mechanics.

    • skalpadda says:

      “What are the stories of chess and go? Of Tetris? Does Counter-Strike really have a story? Is it poorer for not having one? Is it worse to have a bad story than no story at all?”

      Would Planescape Torment be as good if it didn’t have a story?

      Stories in games, when used well, are not just something that’s tacked onto a framework of mechanics, but used as mechanics and rewards integral to the game. Half Life doesn’t have a score counter, for example; the reward is to move forward through the world and story, finding out what happens next and getting to the next challenge. Many RPGs and adventure games make the story a game mechanic where you explore and shape it through dialogue and choices.

    • Grandstone says:

      @skalpadda

      Well said. The answer to your question is obviously “no”.

      Just the same, Half-Life motivates me not through its story but through thrilling me and promising to continue doing so. The game does most of the work, and the theme does the rest. “I am a disembodied something shooting some other things in a black-and-white corridor”, while still fun, isn’t as gripping as “I am a scientist shooting aliens in a chaotic lab”.

      I think we agree with each other. I don’t see what Half-Life’s lacking a score counter has to do with anything.

  8. Paul says:

    Hmmm, I upgrade once every 4 years with roughly 400 bucks and do just fine. The superior quality of everything on PC is worth it so much. I think the internet and its widespread availability is much more a cause of the indie scene being so big, rather than the big guys leaving PC.

  9. H.Bogard says:

    What’s interesting is that, despite staying below the latest and greatest graphical standards, indie developers have enough ready horsepower in today’s desktop PCs that they can still churn out some exceptionally eye-popping visuals with the power of artistic vision alone. Trine, Amnesia and Machinarium come to mind.

    • Clovis says:

      Some indie developers, or at least Dwarf Fortress, actually use the power of today’s CPUs too.

    • MaXimillion says:

      Clovis: Dwarf Fortress only took advantage of a single core last I checked. It utilizes the CPU poorly, and does nothing with the GPU.

    • Malibu Stacey says:

      Shame Dwarf Fortress can’t use more than 1 CPU thread & 2GB of RAM, so it barely taxes the average gamer’s machine. Oh, and not to mention it’s riddled with bugs which aren’t ever being addressed.

    • Psychopomp says:

      You spelled “features” wrong.

  10. Mark says:

    Frankly, it just makes good business sense to push the technological envelope, at least software-side, on relatively standardized platforms. Maximizes the audience and all. The PC has glorious wonderful everything, but when what you need is numbers, that’s what the appliances are there for. Enthusiasts are a subset of all the people who will be interested in a game. In the transaction between player and maker, somebody has to eat the cost of making the game work reliably. The PC is for those who aren’t afraid of that overhead. Those who are afraid are still welcome, of course; with the likes of Steam, they are more welcome than ever. But they are guests.

  11. Clovis says:

    During the Black Friday/Cyber Monday sales I’ve been searching for upgrades. Maybe just a $200 graphics card, but maybe also a new CPU/mobo/memory for around $500. However, my current computer can pretty much play any game out there, and only handles badly optimized games like GTA IV sub-par. I can’t max out Crysis, but even on lower settings it looks better than anything on the consoles.

    Stranger still, even though I’ve been buying new graphics-intensive games, I’ve mostly been playing Desktop Dungeons and Minecraft. I can’t decide if I want to continue my Dragon Age: Origins campaign, or instead start playing Recettear. And, oh, look, Super Meat Boy unlocks tomorrow!

    This is all very annoying actually. I’m finally at a point in my life where I could easily drop $1000 on a super top of the line computer, but there are simply no games to play on it. I’m definitely happy that the indie scene is so great, but I still like a good graphics fest every once in awhile.

    • Shagittarius says:

      Indie games are not the answer. New blood in the old industry is. Besides Minecraft, I haven’t found a single redeeming indie game in the last 2 years that was more than a glorified Flash game I already played endlessly 15 years ago when it was state of the art.

      Sorry, indie is not the be-all answer; you’re fooling yourself.

    • Clovis says:

      lulwot? I’ve no idea what you are talking about.

    • Stephen Roberts says:

      One of the most graphically impressive games I’ve played is Just Cause 2. But you can’t justify going out and spending all your cash on new hardware because the swines went and made a really stable, smooth-running experience of it.

    • battles_atlas says:

      Totally with you, Clovis, on the mistiming of my life cycle relative to the graphics industry. I spent years not able to afford an upgrade; the minute I’m flush, I’ve nothing to spend it on.

      As much as I wish I could, I can’t feel this whole indie love-in thing. I’ve never seen an indie game that came within light years of having the impact on me that HL or Deus Ex did. Indies may have some neat ideas, or artistic flourishes, but the package is just too limited right now. Portal is a telling example: a great idea that started as an indie project, but without Valve’s support and funding it would never have been the masterpiece it was.

      As it stands, for me the celebration of Indies is just a rather desperate attempt to put some gloss on a pretty awful situation.

    • iniudan says:

      @Shagittarius

      Go play Aquaria, then I challenge you to say that again.

  12. wcanyon says:

    Don’t forget that PCs do more than just play games.

    Also, I spend on average $100 on a video card every other year and do just fine. I’ve never tried to get it to run Crysis, but I don’t care enough about Crysis. I don’t really want to drive a Maserati when a Camry gets me there in comfort and something like style.

    • Malibu Stacey says:

      “Don’t forget that PCs do more than just play games.”

      And the current generation of consoles are only games playing devices?

    • P3RF3CT D3ATH says:

      Play movies, music, and browse the Internet. With a PC I can do all that already, with better input (mouse & keyboard) because, let’s face it, browsing the Internet is easier that way, and so is so much else: editing photos, software, anything really.

    • Fumarole says:

      Ask yourself this: are PCs becoming more console-like or is it the other way round?

    • bildo says:

      @Malibu Stacey

      Oh come on, you knew exactly what he meant @_@.

  13. Caleb367 says:

    Meh. This sounds like the OMG COMPUTER GAMES ARE DEAD cry that gets trotted out about monthly, and has done since, well, twenty years ago. NES will kill the C64! SNES will kill the Amiga! PS1 will kill the PC! And so on and so on and so on. It still ain’t an excuse for releasing shitty console ports. Besides, get me a console which runs ArmA 2. Or EU3. Or DCS: Black Shark. (sarcasm intended)

  14. Risingson says:

    But then, what will happen when all the indie developers move to consoles? Like Limbo, for example.

  15. strange headache says:

    I’m sorry, but I’m not glad. Yes, I DO enjoy my fair share of indie games, but I also enjoy technically advanced games requiring a decent PC. Why is it so impossible to have both? And to be honest, PC gaming has never been cheaper:

    1. Graphics cards have become much cheaper; you can get a good one for 140 euros.
    2. PC games are generally sold cheaper than their console counterparts.
    3. General hardware has become a LOT cheaper, so upgrading your system every 2 years won’t hurt that much.

    I agree that the PC is a very creative platform, but it has been like that since forever. In the past small games weren’t called “indie games” but “shareware games”; the name has changed, but in essence they are the same. I’ve been playing games from small garage developers since the Atari ST.

    Besides, we should not forget that more and more indie developers are moving over to the consoles, with XBLA gaining increased popularity. So while the PC can still hold onto its creative indie market, consoles are gaining ground in that area.

    Last but not least, many PC games now suffer from consolitis. Yes, I know it’s a tired discussion, but the fact remains. Now, I don’t care for the fanciest graphics, but an advancement in hardware not only means better graphics but better gameplay, more immersion, bigger worlds, more player freedom, etc…

    I’d really hate to see all that go to waste. While indie games are much fun, they seldom push the limits of what’s possible on a computer. I’d really like to see more of that again, and a couple of hundred bucks every few years will be worth investing in my favorite pastime. Compared to all other possible hobbies, PC gaming is still among the cheapest. Pushing the technological envelope once in a while won’t change that fact.

  16. Xercies says:

    I agree with this; I just wish there would be mid-size studios coming out of it. Indies and big companies are all well, but where are the good ideas with a modicum of budget? Because I don’t know about you, but it seems the indies are full of platformers and stuff which, while interesting… I would like something like, say, Psychonauts or Beyond Good & Evil made by these middle companies.

    But this is exactly what I have been saying: PC was just too damn hard to get into before, so people went to consoles since it was easier. Nowadays 2-year-old computers can play modern games well. I like this and I want it to continue… just with better games.

  17. Dave says:

    For a while there I was in a “build a new PC every two years” cycle, but I’m coming up on the third anniversary of my current machine, and I don’t feel the call to upgrade at all. My trusty 8800GT is still plugging along in a perfectly satisfying way, although to be fair I haven’t been demanding it do much more than play Dragon Age lately (must…finish…before…2).

    I realize this is by no means an original observation, but it’s interesting how the slide away from massively demanding PC exclusives toward console ports has the side effect of putting some (most?) of the burden on the community. I’m thinking of the Oblivion/Fallout games here specifically: the graphics and gameplay are all well and good for consoles, but the mod community has put a massive effort into upgrading every facet of those games in order to more fully take advantage of the PC’s capabilities.

    Which, arguably, they shouldn’t have had to do. I just wonder if that sort of thing isn’t going to become the norm.

    • Droniac says:

      I think that has more to do with getting everything to fit on a single DVD than with the move of high profile developers to consoles. There are several PC exclusives, like The Witcher and the S.T.A.L.K.E.R. series, that see similarly massive community undertakings with huge texture mods, lighting mods and NPC reskins. Those games may not need it quite as much as the likes of Dragon Age or Fallout: New Vegas, but there’s still an enormous difference between playing S.T.A.L.K.E.R. vanilla and modded.

      This may have gained a lot of popularity with Bethesda’s more recent RPGs, but it’s hardly a recent phenomenon. Modders have been overhauling their favorite games with better graphics for well over a decade now. Just look at the resolution mods for Infinity engine games (Baldur’s Gate, PlaneScape: Torment), the high definition texture pack for Deus Ex, the texture packs for Neverwinter Nights, the high-res texture packs for Unreal (Tournament) and the FreeSpace 2 Source Code project.

    • Mistabashi says:

      Low-res textures are rarely due to storage limits; if you take Fallout 3 as an example, the game takes roughly 6.5GB on the disc, which leaves about 2GB free – easily enough to double or quadruple the texture resolution.

      The reason all these console ports have low-res textures is that all the assets are designed for consoles, which have very limited amounts of memory. It really wouldn’t be a lot to ask for developers to work at higher resolutions and downscale for the console version, but it seems they either don’t care enough about PC users or they don’t want their console version to look significantly inferior to the PC one.

    • Droniac says:

      @Mistabashi
      Which leads right back into my original point: it’s not – always – due to console ports, because there are plenty of PC exclusives suffering from similarly poor texture work. That’s either a result of lazy artists or excessive culling on texture quality when trying to make it fit on a single DVD. The latter sounds far more likely when you consider how much work went into the engines for games like S.T.A.L.K.E.R. and The Witcher only to have them suffer from relatively low-res textures in the final, PC-exclusive, product.

      You’re definitely on the mark when it comes to Bethesda’s games, but they’re not the only ones making games with muddled and low-res textures.

    • Mistabashi says:

      Well, I’m not sure about The Witcher, but I wouldn’t agree that Stalker suffers from low-res textures on the whole – most are between 1024 and 4096, which is perfectly adequate. A lot of texture mods for Stalker are the same resolution, just heavily sharpened in Photoshop (one of the reasons I tend to dislike most of them). And of course modders are generally amateurs, so they won’t necessarily realise that most of the perceived sharpness issues are down to the mip-maps rather than the actual resolution. There are a few that could stand to be higher-res, such as most NPC textures, but the reason for this is simply that they’re really old – the first game was in development for over six years, and some of the most recognisable textures (such as the rookie Stalker, for example) have been present since the really early builds.

      The fact that modders make texture packs isn’t necessarily a reflection on the objective quality of the game’s original textures. Stalker just has a very dedicated and active modding community.

  18. Alex Bakke says:

    I’ve had my current GPU for 2 years now; it still plays everything on high, at around 40FPS. Good enough for me.

  19. Cables says:

    I don’t understand why people keep complaining about upgrading hardware. I haven’t upgraded my computer for years and it still runs fine. It’s even older than the 360, and it can still run games better than the 360 can.

    • sebmojo says:

      My benchmark is the 8800 GT – anything faster than that is gravy (unless you have a huge monitor, natch). Three years old.

    • LionsPhil says:

      Yeah, L4D2 and TF2 seem happy at native res with everything at high on mine.

      Shame it’s such a noisy, hot, power-hungry thing. But then all the newer cards are worse. I can’t wait for integrated Intel stuff to finish undermining nVidia and ATI, because my goddamn Intel HD-powered ultraportable LAPTOP can run those two at medium-ish settings, and it does it off batteries for hours with a single fairly quiet fan while just getting warm.

    • noobnob says:

      Actually, the latest ATI graphics cards (5xxx/6xxx series) aren’t power-hungry at all, are considerably less noisy and run cooler, save for the bulky dual-GPU cards. Can’t say the same about nVidia’s latest cards, except for the GTX 460 and the odd entry-level, fanless cards.

    • Sir-Lucius says:

      I think part of it is that people have a natural tendency to want the best. I’d actually argue that much of this “upgrade every year” mentality didn’t come from an inherent NEED to upgrade to run the game, but from a desire to upgrade to run the game at its max settings. I actually did most of my gaming from ~2004-2007 on a laptop when I went off to college, and while granted it was a gaming laptop, I never felt like I NEEDED to buy a new machine. Granted, this was near the end of the graphics-rush era of PC gaming, but I was able to play Crysis on the system at a mix of low/medium settings just fine. Was it ideal? No, but turning down the graphics options from high to medium or lowering the resolution from 1920×1200 to 1680×1050 or so still let me enjoy games just fine.

      I certainly agree it’s nice that smaller indie projects have taken a bigger spotlight within the PC community, but I do feel like this notion that you have to upgrade every year has been misrepresented, even from within the PC community. Sure, hardware wouldn’t let you run new releases completely maxed out for 3 years like people can do now, but I also think it was an exception, not the norm, to literally HAVE to upgrade hardware in order to play a game.

      PC games have these nifty little sliders that let you customize the visual fidelity of a game according to your own specific hardware, but I do think a lot of people are turned off knowing that they COULD have a better looking game available if they had a different set of hardware.

    • LionsPhil says:

      So what you’re saying is that games should take a more Apple-like “you don’t need to know about controlling this” mentality and replace all graphics options with autodetecting.

    • LionsPhil says:

      Thanks, comment system, for stripping out the </devil’s advocate> that greatly affected the tone of that post!

    • malkav11 says:

      I think part of it is that people -are- seduced by the gigantic gorgeous monitors. Little do they realize that that mega monitor commits them to equally megalithic resolutions that up the computing power they need by huge, expensive steps. I ran at 1024×768 for years and enjoyed new games at high to maximum graphics settings (resolution aside) and decent framerates on decidedly inexpensive hardware. And it was certainly never required to upgrade my video card every year – I do it maybe once every other generation, so…oh, every three years or so, I’d say. I have at times gone longer. Oh, I upgrade -something- every year, just because I like to, but not infrequently it’s something less expensive like hard drives. Now that I’m making a livable salary, I’ve allowed myself to go up to a 1440×900 native res monitor, which is great, and still allows me to go without monstrously expensive kit, if less easily than 1024×768 did.
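      To put rough numbers on that jump (a quick back-of-the-envelope sketch; GPU workload scales roughly with the number of pixels drawn per frame):

          # pixels per frame at common resolutions, relative to 1024x768
          base = 1024 * 768
          for w, h in [(1024, 768), (1440, 900), (1920, 1200), (2560, 1600)]:
              px = w * h
              print(f"{w}x{h}: {px:,} pixels ({px / base:.1f}x)")

      That prints 1.0x, 1.6x, 2.9x and 5.2x respectively: a 30″ panel asks the card for roughly five times the work of the old 1024×768 standby.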

  20. Lewie Procter says:

    I’d also say that developers of PC exclusives are catching on to the idea of optimising their games for the mid/low end of hardware specs.

    If only the top 10% of computers can play your game, then that is a lot fewer potential customers.

    I’d probably say that is a combination of digital distribution earning them more money from older titles than budget ranges of old software would have in the past (as the margins are higher), and the now long-in-the-tooth 360 squirting out ports that will run nicely on a three-year-old PC.

    I don’t think we have to worry about indies moving over to consoles. It mostly happens when Microsoft/Sony/Nintendo throw money at them, and they don’t have unlimited funds.

    • Sinomatic says:

      Indeed, I think this idea that PC gaming is for enthusiasts and consoles are for everyone else is incredibly limiting. With more and more gaming capable PCs, laptops and macs about these days, and the ease with which games can be bought and downloaded, restricting PC games development only to titles that push the top tier of gaming hardware is just silly.

      Besides, I’m not a PC gamer because I have prettier graphics than I’d have on a console (well, with the exception of the Wii perhaps), but because I prefer the type of games, or I prefer the control scheme, or the ability to mod and get custom content… or a bunch of other reasons that have very little to do with hardware.

    • PlayOm says:

      I think that HL2 was the first game to do this. It was an advanced game (if not perhaps eyeball-meltingly pretty) that ran pretty well on low-spec machines. As with everything else Valve does, it’s taken the rest of the industry a while to catch up.

    • Malibu Stacey says:

      There’s a difference between what VALVe does & what Lewie is describing. VALVe make their games look like the dog’s bollocks on top-end hardware & scale down. If you target the low to mid range of system specs & don’t give a monkey’s about the top end, that’s pretty much the complete opposite of how VALVe do/did it.
      I’m all for games scaling their visuals/features to fit the system they’re running on, but it’s very disingenuous to think that that is what the majority of developers are targeting in the present day.

  21. dogsolitude_uk says:

    Another thing that helps Indie devs and bedroom coders is that it’s comparatively easy to develop on the PC.

    Sure, you need to understand loops, variables, Object-Oriented Programming and suchlike, but if you fancy getting started there are a number of free and cheap entries to game coding that will let you create executables that run on most Windows machines.

    There’s stuff like Game Maker available if you’re not too sure about coding, but if you’re comfortable with curly brackets then C# and VB are both available with compilers from Microsoft, and they’ve even got the XNA add-on for those languages, which allows you to use 3D graphics and sprites with the .NET framework.

    It’s not as simple as Spectrum Basic, but still a bit easier than Z80 assembly language, and IIRC Blitz Basic is still around too, and now compiles for the Mac and Linux.

    Stuff like Blitz and Game Maker may not give rise to the next VVVVVV or Minecraft, but they certainly allow kids to experiment with code on their own at home, getting used to objects and methods in the process, thus making C#, VB and XNA easier to understand, and perhaps moving on to C++ with all its little pointers and so on.
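
    To give a flavour of how little code a bare-bones game needs, here’s a rough sketch of the classic handle-input/update/draw loop. It’s in Python with Pygame rather than the C#/XNA route above (assuming you have Pygame installed), but the shape is the same whichever tool you pick:

        import pygame

        pygame.init()
        screen = pygame.display.set_mode((640, 480))
        clock = pygame.time.Clock()
        x, running = 0, True

        while running:
            for event in pygame.event.get():      # 1. handle input
                if event.type == pygame.QUIT:
                    running = False
            x = (x + 2) % 640                     # 2. update game state
            screen.fill((0, 0, 0))                # 3. draw the frame
            pygame.draw.rect(screen, (255, 255, 255), (x, 220, 40, 40))
            pygame.display.flip()
            clock.tick(60)                        # cap at 60 frames per second

        pygame.quit()

    Everything Game Maker, Blitz and XNA give you is, at heart, a set of conveniences wrapped around that loop.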

    If anything, I believe it’s that kind of thing that will keep the PC in Indie and Bedroom-coder heaven for ever and ever. And ever.

    Edit: oh, and on-topic, I’m still on my dual-core 6600+ with 640MB 8800GTS in case anyone’s bothered. I’m not, and really haven’t felt like upgrading because for some reason I can run everything on max anyway. Maybe I should buy some more recent games or something, but I’ve been as happy as a pig in shit playing The Witcher, STALKER, and the Indie stuff that gets covered here.

    • TSA says:

      Yes, and if you don’t mind getting 3D rendering and physics, and high-level scripting for free, there’s Unreal and Unity with their free-as-in-beer versions. Then there’s stuff like Blender, Panda and various flavours of Ogre, Irrlicht, Sauerbraten, Darkplaces, Flixel etc. if you care more about free-as-in-speech. Really, there are no excuses not to make games.

    • Harlander says:

      Lies! Vile lies and slander!

      Laziness is, and ever shall be, the perfect excuse not to make games!

      Besides, with the huge PC coder scene, you can just wait a bit and someone will probably independently implement your idea. My cliché roguelike, Game Involving Zombies, went into indefinite hiatus, but only a couple of years later, someone made Rogue Survivor, the same idea but much, much better.

  22. Humppakummitus says:

    Now that we have all this power, we should use it for something. I mean, where are the Pixeljunk Shooters of the PC world? Little innovative games that use the processing power for gameplay instead of cloth physics. Dwarf Fortress is a very good example, by the way. Hmm, I have a nasty feeling Flash is keeping them down…

    • dogsolitude_uk says:

      “Now that we have all this power, we should use it for something.”

      Good. We should use it for Good.

    • Malibu Stacey says:

      See upthread re: Dwarf Fortress.
      My machine is a two-and-a-half-year-old Core 2 Quad with 8GB of RAM. Dwarf Fortress can’t use more than 25% of either my CPU or RAM & doesn’t even touch the GPU (an 8800 GTX; it was an 8800 GT but an RMA in April got me a free upgrade).

      If you people want to uphold some indie game as the be-all & end-all of PC gaming, at least use something like AI War, which can actually use modern multi-core CPUs to some of their potential.

    • DMJ says:

      No! Evil!

    • Ken says:

      Pixeljunk shooter! I WANT FOR PC! Please!

    • Nesetalis says:

      I can’t really complain about that. Toady doesn’t really know multithreading, and that is NOT something you want to build anything real with until you know exactly how to do it. Multithreading is one of the biggest problems with most video games these days, especially when it comes to memory leaks, in RAM and VRAM. Then there are race conditions, ring locks, and all kinds of other things you have to be aware of.

      If Toady tried to make Dwarf Fortress threaded right now, he would create a mess; if he stopped and went off to do something else to learn proper threading, he would anger his fan base. :p
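
      To illustrate the sort of race condition I mean, here’s a toy Python sketch (a generic illustration, nothing to do with DF’s actual code): two threads bump a shared counter without a lock, and increments get lost because the read-modify-write isn’t atomic.

          import threading

          counter = 0

          def worker():
              global counter
              for _ in range(100000):
                  # read-modify-write: another thread can interleave between
                  # the read and the write, silently losing an increment
                  counter += 1

          threads = [threading.Thread(target=worker) for _ in range(4)]
          for t in threads:
              t.start()
          for t in threads:
              t.join()

          # 4 threads x 100000 increments should give 400000; without a
          # threading.Lock around the increment it often comes up short
          print(counter)

      Wrap the increment in a threading.Lock and the count is always right; forget one lock in a million-line game and you get exactly the kind of bugs I mean.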

  23. Hunam says:

    The idea that you need to upgrade your GFX card every year is ludicrous; I’ve never, ever done that, and I try to keep up. I upgraded this year, but before that it was 3 or so years ago, when the sublime 8800 GT came out; that could handle almost anything that was thrown at it if you were reasonable with your options.

  24. CMaster says:

    And there we have an interesting counterpoint to Jim’s article the other day.
    He specifically wants another PC-eating mega-game.

  25. President Weasel says:

    I upgraded my graphics card this year when my old 8800 blew up. I also went up to 4GB of slightly faster RAM when my old 2GB died on me. Other than that I’m typing this on a PC that must be 3 or 4 years old, and it still handles today’s cutting edge titles of tomorrow as though they’re from yesterday.

  26. _michal says:

    I have a three-year-old PC and I haven’t upgraded anything, just bought another 2GB of RAM. I can play almost any game maxed, with a few exceptions – that’s why I’ll be upgrading in spring. I can’t play FSX, DCS or Rise of Flight on high settings, and I believe I won’t be able to play Storm of War: Battle of Britain either. Oh, wait, those are PC exclusives ;)

  27. Vinraith says:

    In a lot of cases modern indies are the size of what we called “big developers” 20 years ago, and are producing innovative and unique games in a similar way. I haven’t even bought a console this generation for lack of anything particularly interesting on those platforms, so needless to say I’m not particularly concerned about the continuance of most “AAA” games on PC. Give me Minecraft and AI War and you can keep your CODBLOPS.

  28. Inverness says:

    It’s been a fantastic experience not to have needed to upgrade my graphics card in years in order to play the newest games. I don’t particularly like how some games seem to have been “consolized” in the gameplay and interface areas, but the sort of standardization of hardware requirements is a major plus.

    I just hope the big upgrade that will come after the current console generation makes up for all the progress that wasn’t made in this generation.

  29. noobnob says:

    You won’t have to worry much about asinine graphics card naming conventions (which I reckon are much better now anyway), or about dozens of different models from different AIB manufacturers, in the future. Both AMD and Intel are developing all-in-one solutions that have both CPU and GPU in a single chip (it’s more than that, a bit complicated), which should effectively kill the entire entry-level market for graphics cards and raise the bar on integrated graphics processing in mainstream computers. Basically, it makes the whole process of buying X hardware to run Y game a whole lot less painful. These solutions aren’t exactly graphics powerhouses, but they’ll be great for introducing most folk to PC gaming and expanding the market.

    Well, that’s what I hope, anyways. The chips will be out in 2011.

    • Alaphic says:

      Sauce?

    • noobnob says:

      Oh yes, should’ve mentioned them…

      On Intel’s side, you have Sandy Bridge, which should be out at the beginning of 2011.

      On AMD’s side, you have Fusion, not exactly sure about the launch dates, but the mainstream chip was rumored to be out in 2011 Q3. This is the one you should keep your eye on if you’re interested in light notebook/netbook gaming without paying an arm and a leg for it.

      And here’s a demo of AvP on Fusion and City of Heroes on a netbook.

  30. InsidiousBoot says:

    New gameplay… yes, we need that. I don’t see the point any more in playing overdone games. I mean, seriously, what’s the freaking point? Been there, seen that, etc. At the moment I’m playing through ArmA 2; all other FPSes aren’t that fun any more. I must say SH did tick me over for a year, but it’s a bit dead at the moment, and playing against bots and new players isn’t all that much fun considering I’m one of the top players and there’s too much of a gap between us. And I’m quite tired of the game. I want a new game mode, really: Capture the Flag. We’ve been asking for it for quite some time. Sigh. I was hoping Crysis 2 might have a good MP, but meh, I’ll see if it’s any decent.

    RPGs have been boring for me as well. I haven’t even played the Mass Effect series; Dragon Age was kinda fun for a little while, though I didn’t finish it. Racing games are pretty lame as well. This one game called Blur sucks hard; I don’t get the fuss about it. I played it at a LAN party a week ago and didn’t like it. Besides that, the only racing game I really like is GRID; DiRT 2 is plagued by Games for Windows Live, which doesn’t save games. Of course WoW garbage is out there, and no, I don’t like that any more. I’m kinda in the middle of maybe trying out Vanguard: Saga of Heroes. Furthermore, I could learn StarCraft 2, but meh, RTS isn’t really my thing. I tried it, I have it, but I’m not really going to play it; I know myself well enough.

    I have Minecraft, but I’m waiting on a patch that fixes MP mobs and whatnot. I only like MP because it’s more social, and for some reason I kinda skipped SP after the last patch. Maybe I just don’t like endlessly digging for resources while no one’s looking, if that makes sense.

    But yes, those are my thoughts of the moment right here. It’s a shame; I love PC. I do love arcade shooters, and SH is kinda like that, but only because of MPR Boosting and some exploiting of game mechanics. Still, maybe Brink will give me back some enthusiasm. I’m waiting on Portal 2, HL Black Mesa, Episode 3, TDU2, Dungeon Siege 3, Diablo 3, Deus Ex, Red Orchestra 2, and not much else except maybe Crysis 2. All other games planned for release are either unknown to me or aren’t all that special. Once again… sigh, as I’ll have to wait a while to get my hands on good old hack-and-slash fun and the puzzling first-person platformer Portal 2. Yes, now that I think about it, the only really good thing PC still has is Steam. Gabe Newell ftw. The only thing I’d like is some info about Half-Life Episode 3… it’s taking forever, damn it. Hey… yes… Duke, anyone?

    This slab of text is way too long.

  31. Carra says:

    Hardware development seems to have stopped.

    Three years ago I bought a new PC. 4GB of RAM, a 1TB hard disk, a nice dual core and an 8800 GT. All for less than 800 euros.

    These days it still manages to run everything I throw at it, even at higher resolutions. New PCs these days still seem to come with a 1TB hard disk & 4GB of RAM. Why upgrade?

    • Baines says:

      I’d honestly like to see hardware stop focusing on more power and instead spend a few years working on more efficiency anyway. The power arms race has made PCs one of the more electricity-hungry devices in a home, and then you burn more electricity trying to cool the waste heat being produced by that PC.

      But energy efficiency was always going to be a hard sell when games and hardware were pushing system requirements every year, and software coders were being told not to care how hard their code pushed a system because there would always be more power available. Maybe, just maybe, things might get a bit more reasonable for a while.

    • Nesetalis says:

      I agree wholeheartedly! Let’s have some highly efficient GPUs, CPUs and hard disks (it’s amazing how much energy HDDs suck down).

  32. Monchberter says:

    Splash out on a nice (neutral, pref Antec) case, keep your hardware mid level and buy a low-mid end GFX card. It’s a philosophy that’s seen me good for the past 6 years or so.

    I’m still running a quad Core Duo on DDR and a low end DX11 card through a 3 year old mobo, and it absolutely CREAMS Crysis in 1080p.

    Beyond this, everything is just willy waving.

  33. ZephyrSB says:

    Another punter here who hasn’t upgraded since the release of the trusty ol’ 8800GT. It really does seem this was the point where the diminishing returns of such an aggressive technology race began to kick in. That, and nVidia probably kicking themselves over how cheaply they priced it.

  34. Chris says:

    Also, it’s nice to be able to get a realistic gaming rig running without browning out the neighbourhood. More performance per watt = cheaper electricity bills, fewer overheating problems, save the planet, etc…

  35. geldonyetich says:

    It’s not exactly a new thought that indies were given a great opportunity in the vacuum left when the big wigs found other platforms to be more lucrative. It’s mostly been what PC gamers have had to tell ourselves to keep the faith.

    The fundamental problem is piracy: it’s easy to do on the PC and so weakly enforced that some projections put it at 90% of copies of a blockbuster game being played by pirates – never mind the pirates’ excuses that they couldn’t afford the games anyway or that they hate DRM. I’m not sure I blame the pirates, as a problem before piracy was undoubtedly the boring clones.

    Indies jump right on board in this lousy environment because they’re not nearly as starved for money. If only a couple of thousand people on the entire planet pay you $5 then, hey, that was $10,000 worth doing. When your game costs millions of dollars to make that can be a problem, but indie games don’t. In fact, a lot of indies pretty much give their games away for free: the Flash games, Game Maker games, BYOND games, things like that. It’s almost as bad as piracy to provide people the fruits of hard programming for free, as it devalues the medium, but they figure being noticed has value in itself.

    Notch is a great exception to the norm. The man developed a cool tech demo in Minecraft, and it managed to nail the human imagination partly because something as rudimentary as a block-based world is easy for people to manipulate. However, it’s almost criminal how much money he’s made selling a highly incomplete game… it’s not like A* is all that hard to do in a grid-based game, so one wonders why his mobs have no apparent pathing ability. Now that he has a few million and a company started, I’d be very disappointed if he didn’t reach a sophistication near Dwarf Fortress or Love.
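    For what it’s worth, grid A* really is a short routine. Here’s a minimal Python sketch – the `walkable` check and the coordinates are hypothetical stand-ins, nothing to do with Minecraft’s actual code – just to show the scale of the thing:

    ```python
    import heapq

    def astar(start, goal, walkable):
        # Minimal A* over a 4-connected grid. `walkable((x, y)) -> bool` is a
        # caller-supplied test; Manhattan distance is the heuristic, which is
        # admissible for 4-way movement with unit step costs.
        open_heap = [(0, start)]          # entries are (f = g + h, cell)
        came_from = {}
        g = {start: 0}

        def h(c):
            return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

        while open_heap:
            _, cur = heapq.heappop(open_heap)
            if cur == goal:               # rebuild the path by walking back
                path = [cur]
                while cur in came_from:
                    cur = came_from[cur]
                    path.append(cur)
                return path[::-1]
            x, y = cur
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if not walkable(nxt):
                    continue
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
        return None                       # goal unreachable

    # e.g. astar((0, 0), (3, 3), lambda c: 0 <= c[0] < 16 and 0 <= c[1] < 16)
    ```

    Thirty-odd lines, and most of that is bookkeeping.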

  36. Monchberter says:

    HAIL THE 8800GT, LAST OF THE REVOLUTIONARY GPUs!

    • TotalBiscuit says:

      I can’t hear you over the sound of how awesome my new 580GTX is.

    • Optimaximal says:

      Erm, how can a die-shrink be called revolutionary?

      It was just a minified version of the 8800 GTS/X.

    • Monchberter says:

      Minified maybe, and slightly dated, but it represented the best bang-to-buck ratio for quite a while, and judging by the RPS readership it’s still very well regarded.

    • Starky says:

      The minified bit helped a lot, though: it drew less power and as a result ran cooler, so lots of manufacturers sold it with a 10-15% factory overclock, which was and still is huge for a GPU.

      I just traded in my three-year-old 715MHz 512MB 8800GT (G92, Inno3D iChill) for a 1GB 5770.

      That basically gives me the same graphics settings (usually high/best) and the same frames per second I used to get – the difference is that my 8800GT did it at 1440×900, and the new card does it at 1920×1080 over HDMI to my 32″ LG TV.

      That massively beats any console – I can play games at proper 1080p, unlike the consoles (except for a few PS3 titles).
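      To spell out the arithmetic: 1920×1080 is 2,073,600 pixels against 1440×900’s 1,296,000, so holding the same settings and framerate at the new resolution means the card is pushing about 60% more pixels per frame. A quick sketch:

      ```python
      # Pixel-count comparison between the two resolutions mentioned above.
      old_res = (1440, 900)     # 1,296,000 pixels (~1.3 MP)
      new_res = (1920, 1080)    # 2,073,600 pixels (~2.1 MP)

      ratio = (new_res[0] * new_res[1]) / (old_res[0] * old_res[1])
      print(f"{ratio:.2f}x the pixels per frame")   # -> 1.60x
      ```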

    • CraigB says:

      I can’t hear you over the sound of how awesome my new 580GTX is.

      Well, yeah. It’s an Nvidia card. You can’t hear ANYTHING over one of them!

    • Deuteronomy says:

      The 8800GT is awesome for the fact that you can run it fanless. I’ve had an Accelero S1 on mine for two years without a problem.

  37. BrightCandle says:

    A lot of people have been complaining about the performance of Call of Duty: Black Ops. A Core 2 Duo and a basic graphics card isn’t really enough to play it smoothly, and there’s a good reason for that: while a three-year-old PC might be equivalent in power to a console, the graphics in the PC version are significantly better. A decent PC is required to run it at a reasonable frame rate (which, for such a fast game, honestly needs to be 60fps to be competitive).

    There have been plenty of games that, on high settings or with Eyefinity (well worth trying if you’ve not had the chance to play with it), demand more than the current crop of the very highest cards can handle.

    But I also agree somewhat with the main point of the article. The removal of all the junk ‘AAA’ titles from the PC scene does mean that the console games list fills up with dross while the PC’s indie games and shining examples get more time in the market.

  38. Jetsetlemming says:

    I’ve never run into that constant demand for upgrades, mostly because I couldn’t afford it. Somehow, I’ve still managed to play most PC games acceptably. My current PC cost around $350 and plays modern games perfectly well (my Radeon 4670 is astounding for something that cost as much as a single Call of Duty title). Before this machine I had a Pentium 4 and a Radeon 9250, and that was also quite impressive for the $40 plus shipping I paid back in 2004 so I could play Half-Life 2.

    PC gaming has always been cheaper for me than console gaming. I need a PC anyway for school, and upgrading that existing machine for gaming is a lot cheaper than a whole console. I still don’t even own an HDTV, because all my console gaming is on a Wii. I have my nice, big, cheap ($99 on sale, 1440×900 and decent quality) Samsung LCD on my PC for things in HIGH DEFINITION.

  39. Rat says:

    Real classy move putting a picture of Crysis up there. Stop giving the Crysis haters something to troll about.

    The only thing in this article I agree with is the point about the obfuscating naming conventions of graphics cards – that’s pure marketing assholery, plain and simple, with no purpose other than to confuse and deceive consumers.

    But. The graphics arms race is a great example of competition spurring innovation. Better hardware is a Good Thing, full stop. It has made new types of games possible and old types prettier. The only “tyranny” here is in your imagination – if you have a problem with the demand for these cards, or with the games that require them, the blame lies with the consumers (and the developers, to some extent, but they’d go out of business if the market weren’t there).

  40. Adam says:

    GPU companies are partly to blame, true. And so are the companies that make other types of computer hardware. If people only knew how easy it is to install a new graphics card and give new life to their PCs, there would be a lot more PC love out there.

    But sadly, most people consider a PC upgrade equivalent to getting a whole new tower and all.

    Simply put, PC gaming is for many about pushing the limits of graphics, FPS and getting the most out of our hardware.

    Consoles are great too: no need to upgrade as often, and with the HD shiz, games admittedly look amazing as well.

    But the article stresses the selfishness of the hardware companies and their tricky GPU naming, and yes, they are to blame for the downward trend of PC gaming. Still, I believe deep down in my heart that PC gaming is alive and well : )

  41. Jambe says:

    While I miss being able to decisively say that the PC is the most technologically progressive platform, I wouldn’t swap this age for that one.

    Perhaps it’s because you don’t follow hardware any more, but yes, you can still decisively say that PCs are more technologically progressive than consoles. The 360 is five years old and the PS3 is four – in that span we went from ATI’s R500 series (the X1300-X1950) to their HD 6000 series, and from Nvidia’s GeForce 7 to their GeForce 400/500 series, both jumps representing order-of-magnitude performance increases. Meanwhile the consoles have stagnated, and the PS3 has actually regressed in terms of capabilities. A modern $200/£130 graphics card is more powerful than an entire console, CPU and GPU combined.

    Also, as Hunam points out, the suggestion that one must constantly upgrade a PC’s graphics card to be able to play recent games is just hooey. Again, it might just be a bias you developed from working in the industry, but the vast majority of gamers play on relatively small screens (most under 1.5 megapixels — check out the Steam Hardware Survey). $100-150 graphics cards drive modern games at those resolutions now, and they did in the past, too (scaled, of course, as the average panel size increased). If you bought a card from the $200-250 range you could see three or more years of good performance out of it.

    *shrug* The PC is a wild west compared to consoles, sure, but it’s really not hard to make a good buy. There are umpteen sites out there (I like the Tech Report) with down-to-the-component buyers’ guides that anybody could follow given a touch of patience and manual dexterity. Heck, you can get a run-of-the-mill OEM product with a good GPU already inside, or get one with integrated graphics and pop in your own discrete GPU, and you’ll have something that can run modern games at 1680×1050 for $500-600 (including Windows 7). Spend $650-700 and you can go up to two megapixels (1920×1080) or more.

    This article is more than a bit sensationalist. As ever, the capacity of technology continues to skyrocket, as Moore’s Law predicts; it’s just that more software developers now think there’s more money to be made building to a lowish baseline of technical complexity. They’re not wrong. There will continue to be Cryteks and Futuremarks pushing the complexity of PC games higher, and those bleeding-edge markets will continue to be tiny niches, as they always have been. Nothing here is really new… all this hysteria over the console-portification of PC titles is overblown. Yes, more AAA titles are multi-platform ports, but we have more good PC games now than we’ve ever had.

    • Alec Meer says:

      I’m unconvinced you’ve actually read the piece, sir.

    • mujadaddy says:

      @Alec:

      It’s cute that you think these articles get read. Simply adorable.

      Statistics show that upwards of 90% of the people who make it past the jump read one or two sentences, look at the pictures, and have already made up their minds about what the article said. Some might watch a trailer, especially if it’s gameplay. Some will post a comment. Fewer will read any comments first.

    • eric evenstad says:

      Not only did I read the entire piece (round of applause, please), but I also read Jambe’s entire, almost-as-long post, and I’m equally unconvinced he did any reading.

  42. Tei says:

    The really important event in gaming is not this. This isn’t important – there’s another development that makes it a moot point, and that’s the democratization of gaming. Gaming now seems to be something everybody does: old people, the guy who drives your bus, the policeman directing traffic.

    We have achieved this by making games playable by idiots. Take a good game like Lost Colony 2: you can play with three other friends, shoot at a wall in a corner for a minute, and you will WIN, FOREVER, EVERYTHING. A WINNER IS YOU. And probably three achievements will unlock. Save your judgement – I’m not offering an opinion here! All those videos of people playing COD without shooting, or playing a Kinect game without moving, mean just this: games that almost play themselves, interactive experiences with a bit of gaming in them. “Light” games. And I don’t even have to mention FarmVille.

    I kind of like part of this democratization. Huge budgets and lots of talented people can make theme parks really fun – adventures where you just press the fire button to play. There are a lot of these mindless games, games like Dungeon Siege and others. The important question is: are these games fun? The answer is yes and no. They’re fun for some people, or for some people some of the time, and not fun for other people at other times.

    The only thing to moan about is the death of the “hardcore game” – the kind that used to exist, where the player has limited lives and a pixel-to-pixel collision means losing one. Games like Ghosts ’n Goblins. We don’t see games that hard any more.
    So where are the challenge-addicted people? Probably playing multiplayer games, where skilled players can fight other skilled players – games like DOTA, or maybe Counter-Strike or StarCraft. The challenge-addicted gamer can launch StarCraft and fight up the ladder against others of the “same race”.

    All in all, I am happy with the current state of gaming. I only see two problems.

    – Console games are awfully boring and bad. BAD TASTE. It’s just that these games are BAD; the people who play and make them seem to have had their taste neutered. There’s a reason “consoley” means “bad game”: because the consoles make bad games! “Consoley” doesn’t mean “good game”, because the consoles are filled with crap, average crap, and poorly designed games! If I have to see another “Don’t shut down the computer while we save the game” screen, I am going to murder some people. How stupid is that message, and the people who put it in their games?
    – The death of AAA on the PC. This is more a risk than a fact, but IF AAA games ultimately disappear from PC gaming, I will miss them. I like to play some AAA games; not everything is indie creativeness. Me likes some explosions and boobs.

  43. Po0py says:

    The thing about people rushing out to buy a new GFX card for Crysis wasn’t just about the prettiness of the game – it was also the fact that it was a rather good game. The sandboxy nature of it, the multiple ways to complete missions and all that stuff, was quite new at the time. Nowadays, with Far Cry 2, Stalker, Just Cause 2 and Red Faction, sandboxy games are a dime a dozen. People won’t rush out to upgrade for Crysis 2 in the same numbers as before; I simply can’t see the same novelty factor shifting copies. But of course, by this stage Crytek won’t care much, because they’ll be shipping console versions of the game for the first time.

  44. Greg Wild says:

    I’m hoping I can indeed just rely on my current system for a good few years to come: a Phenom II 945 @ 3GHz, a Radeon 5850 and 4GB of DDR2 RAM. It should be fine, I think, despite the older RAM.

    At any rate, there’s no way I can afford a new mobo/CPU/RAM any time soon. MAs are not conducive to PC gaming – or to spending money on anything but food and books.

  45. Hoernchen says:

    So, indie developers are going to create a new Fallout 2, Planescape: Torment or Dragon Age – just minus all the content, you know, because that actually requires more people and money? And I should thank the consoles for holding back the graphics as much as the gameplay? Just look at the current modern Call of Warfare Duty XIIV, which uses 90% of four cores and looks like shit – or, more precisely, looks just the way the same game, number minus one, did a year and two years ago.

  46. Eduardo says:

    And what about CPU companies? And memory? And motherboards? Don’t they count? lol

    Anyway, it’s pretty fucking dumb to point your guns at hardware companies. Do you think it would be possible for them to stop improving their products just so devs don’t feel like they should use the hardware to its full capacity? lmao.

  47. Jimbo says:

    For this argument to hold up, graphics card sales would have to have fallen off a cliff in recent years. Have they?

    My impression is that there’s no shortage of people out there with decent hardware, just a shortage of those people willing to pay for software when they can just as easily steal it.

  48. JFS says:

    I like this article. It speaks the truth. And who needs an X87999GTXhi-power ULTRA+ graphics card when you can get Recettear (plus others) for five €-money? Oh. Hell. Yeah.

  49. Wixard says:

    I remember PC gaming hitting its stride around the turn of the century, and being almost in free fall by the time Doom 3 came out.

    By that time, the newest, snazziest version of FPS XIV was hitting the market almost yearly, demanding higher and higher levels of hardware power. Indeed, that very quickening of pace is what ultimately killed Voodoo.

    Yet the PC at this time was viable, and still popular. So what did the exclusive in? That doesn’t boil down to just one thing.

    First, a fracturing of naming conventions in the video card market. Around the time of the GeForce 3 and 4, Nvidia became far more liberal with its names and with what allowed a card to sit in a certain family. The GeForce 4 MX was, for all intents and purposes, a rebadged GeForce 2 MX. Given that the cheaper the card, the more popular it is, this caused obscene amounts of confusion. It got worse from there: a Radeon 9550 SE was slower than a 9500 non-pro. It was a mess.

    By the time of the GeForce 7800 this had settled down quite a bit, but by then the damage had been done. Nvidia and ATI had both caused their market to shrink, quite possibly for good.

    Then came the production costs. Going from making a game for, say, a Super Nintendo to a PlayStation 1 is a decent leap. From a PS1 to a PS2 is even bigger. From a PS2 to a PS3 can be several orders of magnitude.

    Art teams exploded in size, the cost of middleware went up, and suddenly there wasn’t as much room in the budget for risky games. Outsourcing became more common, and game development (and software development in general) exploded in areas like Eastern Europe and China.
    Gone were the days of being able to afford a single-platform AAA title. Now it’s generally multiplatform or bust – and sometimes even then it’s still bust, with many games failing to break even.

    Then there’s the elephant in the room: game piracy.

    Around the same time the market began shrinking, faster and faster internet connections became common, and following close behind that rise were torrents and other piracy sites.

    They cry, “Stop, don’t blame piracy! Those people wouldn’t have bought it anyway!”

    Probably because many of them have been pirating games for so long it’s a habit by now!

    Throw whatever percentage you like at it, but piracy eats a huge amount of sales – people who could have or would have purchased the game but didn’t. I can’t imagine any sane person saying otherwise.

    Ultimately, I don’t think any one thing caused the death of AAA PC exclusives as we knew them; it was a combination of factors.

  50. TheApologist says:

    I have consoles and a gaming PC. My feeling these days is that the consoles update more often, break more often, and are more expensive and more of a hassle. Console gaming is no longer the simple option.

    I have to have a PC for work at home, and I’ve always just made sure it runs games. That is clearly easier now than it was: I’ve never gone this long without upgrading (nearly four years), and that feels good. Another factor in this is Steam, however. I’ve by and large stopped caring about buying games the minute they come out; I keep track of what’s good and pick it up in sales when I can. Steam keeps it all there, ready for when I want it, without accumulating more discs. Windows and the games update quietly in the background, and I rarely find myself staring at progress bars. And as for Xbox Live Gold – paying for online? Fuck that.

    • Radiant says:

      Xbox Live is indeed bullshit.
      Why do we even have to go through their servers? They don’t do anything – they don’t even have lag compensation! That’s been part of PC gaming for years.

    • ShineDog says:

      Plenty of (most?) Xbox games have lag compensation, and that’s more of a client-side thing anyway.