I Kind Of Miss The Tech War

The slow-down of PC tech has left a strange hollow in my life. If you’re older than a tiny baby, you’ll remember the days when your PC was perpetually on the verge of not being able to play the current crop of games no matter how frequently you upgraded it. Now, my PC’s insides are a couple of years old, and it’s playing everything on maximum. This is, I think on balance, a good thing. But there are downsides to it too.

I remember our first PC upgrade. The 486 my dad bought in 1993 had 4MB RAM, and the possibly-entirely-placebo “Turbo” button on the case that was supposed to give a boost of speed. And then there came a point where this just wasn’t enough. The inside of a PC, having for so long been in the relatively safe world of Atari STs (520, 1080), was a terrifying place. And putting in RAM was, as it still is, the scariest part of all. (I now get a perverse pleasure from the absolutely ludicrous, motherboard-bending ferocity with which you have to shove RAM sticks into their little slots. The contrast of from-the-shoulder force with such delicate machinery. It’s like cleaning a Fabergé egg with an angle grinder.) But brave it we did, and our big beige box doubled in memory. The difference was astounding. Windows 3.1 FLEW!

Then came the day we needed a 3DFX card. Voodoo. (There was a short time when 3D cards were named meaningfully, so the higher the number, the better it was. Then more competition plus the advantages of consumer confusion saw this go berserk, with GRAPHIXBUSTER 90,000,000,000 and so on.) We added a CD-ROM drive for £300 (!). These were enormous new stages, the machine expanding its abilities, rather than simply incrementally improving upon them. But soon all the bits that would ever go in were in (we only seem to be removing them these days – remember ethernet cards, sound cards, running out of PCI slots? Now they just lie fallow, or are obscured by the fourteen fans of the graphics card.) It became about those bits getting better.

A two year old PC became something for nostalgic gaming, unlikely to be able to play the latest big-name releases unless it had been heavily augmented. And at around £1000 for an off-the-shelf contemporary box, that was one hell of an expensive pursuit. It became about sacrifices, working out what parts you could afford to add, how you might be able to rig it to play Unreal/Far Cry/Crysis at an acceptable level, or just giving up on that game being something you’d encounter in its year of release. It was, of course, no wonder PC gaming was a very niche pursuit, despite the ubiquity of the appropriated office equipment.

And then consoles caught up. The Xbox and PS2 never really competed with the PC. Developers tended to build for one platform or the other more often than they worked cross-platform, and if there were ports they’d generally be far improved upon for a belated PC release. The wall of PC games in your local store still matched those for the consoles, and still had big-name games that made it stand out. Until the 360 and the PS3. Come 2005/6, the two big consoles launched with tech comparable to a decent PC. Tech locked in a sealed plastic box that would be outdated almost instantly, of course, but it created a parity that meant games were more frequently developed across all three platforms. And due to the byzantine nature of the two consoles’ architectures, it took developers a few years to work out how to get the most out of them. That created an artificial race between the improving tech of the PC, and the improving quality of console games. A race the PC easily won, of course, but one that lasted long enough for cross-platform development to be entirely normalised.

As a result, developers stopped pushing PCs to their limits with their games, since they were designed to run on the increasingly years-old tech of the consoles. Sure, the resolutions would be met, the improved anti-aliasing options hopefully added in, but that crazy grind to make graphics cards wheeze was suddenly gone. And as a result, PC gaming became both far more accessible, and far less ground-breaking.

With the PS4 and Xbox One essentially PCs in sealed boxes, that parity was re-established a couple of years back, and this time the PC is pulling ahead even more slowly. And I think it’s good? It makes the hobby far less expensive, with a decent self-built PC easily coming in at £500. And it’ll last. It’ll play Fallout 4, because it could play Skyrim.

But it also makes me a bit sad, the loss of the arms race, the competitive battles between the likes of Nvidia and ATI, Intel and AMD, two main players fighting to pull ahead of the other. It was silly, sure. But gosh, it made graphics and scale evolve at the most extraordinary pace. I’m kind of sad to see that slow down, even though I know it’s obviously for the best. I’d love to see the new Crysis, the new game that means we all have to throw our arms above our heads and our bank balances out of the window, and find a way to upgrade to be able to play it. There was fun to be found in that, as stupid as that obviously is.

This post was funded by the RPS Supporter Program. Thanks for your funding!


  1. Vandelay says:

    I mentioned this in the forums just the other day, but will say it here too: I was going to do an upgrade of my PC early next year, but I am seriously contemplating ditching that plan and getting a PS4 instead.

    I built my computer almost 5 years ago. It has an i5 2500K in it, 8GB RAM, a Radeon 6950 with 2GB RAM and an SSD. Now, I haven’t bought any of the big, potentially intensive games this year (such as The Witcher 3 and GTA V), but I have not experienced a game that hasn’t played acceptably on the system. I can’t even think of anything that has required me to knock the settings down to medium, so I imagine only those games I mentioned above, which I haven’t tried, might.

    If I am likely to upgrade, I would be expecting to spend the same as I spent last time around, about £800. My target was a system that would be capable of running a Vive or an Oculus Rift at the required 90fps. That would give me the option to pick one of those up if they prove worthwhile, whilst also giving me enough grunt to push through even shoddy ports. However, it is not looking like this will be possible on the amount I would be looking to spend.

    The alternative is to spend almost a third of the cost of a PC on a PS4, which now has enough games I am interested in to be tempting (Bloodborne, Destiny, Uncharted Remastered, The Last of Us Remastered, Little Big Planet), plus a good amount of focus on indie games that interest me, like Journey and Helldivers. I would still have my PC to play the numerous games that are released that will not trouble my old system a bit, whilst any game it won’t be able to play will no doubt be multiplatform anyway, so I will still be able to play it.

    Just not seeing much of a reason to upgrade at the moment and the more I think about it the more it seems like a waste of money.

    • John Walker says:

      I think sticking another 8GB of RAM in your PC and getting a PS4 to complement it sounds like a great idea! (I say this from the position of not being a tech person at all.) I’m especially keen to get a PS4 now after noticing today they’re remaking/reimagining the original Ratchet & Clank for it.

      • Vandelay says:

        Interesting, I was thinking that a graphics card upgrade in about a year’s time would be the most necessary. RAM does seem to be a stumbling block for many ports though.

        Either way, a gradual upgrade instead of a splurge, as was my original intention, seems the best route. Old hardware doesn’t seem to be struggling too much, but new midrange stuff isn’t exactly making games fly at resolutions above 1080p either. Hopefully VR will give hardware manufacturers some reason to try pushing boundaries again.

      • Somerled says:

        My 4 year old system just got its one and only upgrade, going from 8 to 16GB of RAM. The new memory has only marginally improved performance in general, but is shining most on what it was intended for: Minecraft.

        The tech war isn’t gone. It has just stumbled drunkenly into an alley to relieve itself.

      • mattevansc3 says:

        Talking about complementing the PC, we are at a point where an underpowered PC can be classed as a gaming PC. With the PS4 now joining the ranks of the Xbox One and Steam for streaming, you could put off the GPU upgrade, buy a console with either Xbox Live or PS+ and just play it via your PC.

    • blur says:

      My machine, when I built it in 2011, was the exact same as yours. 2500k, 6950, 8 GB, etc. The only upgrade I’ve put into it in the four years since has been a graphics card. I stuck in a 970, and now brand new games (Witcher 3, Shadow of Mordor) are basically maxed out again. It’s not bad for a $450 (CAD) investment.

      Rather than play ports, I’d recommend getting a PS2. Man, that console has a phenomenal library of games.

      • DragonOfTime says:

        I also have (had) a PC that is (was) pretty much equivalent. The only difference is I have an i7 2600K rather than an i5, and I had a Radeon HD 6970, but no SSD. I replaced the graphics card with a GTX 970 in time for Witcher 3 and now everything new runs like a charm. I even get a decent FPS in Witcher 3 with all the silly hair turned on.

    • darkteflon says:

      You’ll save up front on the hardware, no doubt. But man, geez console games are expensive. My Steam library would have bankrupted me on console.

      • Legion1183 says:

        If we were talking about the past I would agree with you, but lately I’ve been really hacked off by PC games costing nearly the same as their console versions, most noticeably with bigger AAA titles. I remember PC games costing close to half of what their console versions did.

        However, due to the fact that I now have far less time to play games than I used to because of life, I’m a few months behind most PC releases – I just finished The Witcher 3 and started Fallout 4 a couple of days ago, and haven’t even started on most of the other releases since then that I’d like to – which means that I pretty much always get games at half the price, either due to a sale or an indefinite drop in price.

      • iainl says:

        The most expensive PS4 games tend to be fairly expensive on PC as well, though. With the exception of that monstrous LEGO Skylanders thing for my son, the most I’ve spent on a game since I got mine was £30 for Destiny with all the expansions, and most of the games I got for under £20.

  2. JamesTheNumberless says:

    The turbo button on our 486 DX2/66 definitely did something: it switched the clock speed between 66 and 33MHz. There were a few games on which you really noticed it, Elite 2 being one of them. It’s amazing how different hardware concerns were in those days. Today it’s all about getting extra frames and higher resolutions, but back then it was about having enough RAM to actually load the game, on top of whatever set of drivers you were using.

    • roothorick says:

      And it actually served an important purpose. A lot of games released around and before that era used spinning on the CPU to regulate the speed of the game — at the 486’s 66MHz, they ran twice as fast as intended, almost unplayable. So, you could slow your machine down to play those games correctly.

      • Zafman says:

        And even then, my 486DX4/100 was way too fast even with turbo switched off. Terrrwennty years ago, I tried to play Speedball (nomen est omen) and couldn’t even see the players running around, just heard the sound of goal after goal after goal and *honk* game over! You lose! A round lasted about five seconds. Switching the turbo off made the game still run twice as fast as it should have been and was therefore unplayable. (Sometimes there’s just no substitute for a good old Amiga. Look at him collecting dust in the corner, the poor thing.)

        • ansionnach says:

          For a laugh I loaded the game up on my i7, booting into real DOS using a memory stick. Runs pretty fast, alright! Speedball 2 runs perfectly, though – I think a lot of games after 1990 or so didn’t blindly assume they were running on a 4.77MHz 8088.

          • mukuste says:

            Real DOS? Like, DOS 6.2? Does that even work on modern hardware? 64 bit CPUs, SATA hard disk interfaces, no graphics drivers? Probably the BIOS picks up some of that slack, but still, that’s very surprising to me.

          • udat says:

            Can you tell us more about this “Real DOS”? I’ve faffed around with DOSBox several times (to play UFO mostly) and it’s great and all, but it doesn’t seem to handle my most favourite multiplayer game of all time very well. I speak of course of Micro Machines 2!

            If there was a way to play that I’d be stoked.

          • mika76 says:

            Man, Micro Machines – that name brings back memories. Your comment made me do a bit of a search. The DOSBox site link to dosbox.com says it runs fine? Also I found Toybox Turbos, which I never knew existed link to store.steampowered.com Awesome!

    • ansionnach says:

      Depending on the BIOS of your system you could also underclock the CPU. I had a 486DX2-66MHz that you could clock really low through a combination of underclocking and then hitting the turbo button to halve the speed.

  3. caff says:

    I’m hoping for some serious boost in graphics, what with all these 4K displays and VR gubbins needing it.

  4. Big Dunc says:

    I still have a sound card, but it seems that I’m increasingly in the minority. Is onboard sound really a viable alternative these days?

    • JamesTheNumberless says:

      I think so. I’m in the soundcard club too, but I used my onboard sound between my X-Fi Platinum Pro dying and acquiring my new ZxR (Creative Labs fanboy, obv.) and found it really pretty good. As long as soundcards are a thing, I will have one. It is getting increasingly difficult, however, to fit them into your case, what with all the space that the graphics card, heat pipes and heat-sinks take up… Actually it isn’t the case that’s the problem, it’s the motherboard design.

    • Grizzly says:

      Sound cards’ utility in gaming has stagnated as much as those PhysX cards did, thanks to Creative’s uncreative monopolization practices. This has allowed motherboard companies to catch up, to the point that a lot of stuff that was previously only available on sound cards or *really expensive* headphones (HRTF and the like) is now included on the motherboard, whilst games have dropped all support for stuff like EAX. A few games, such as the Metro and Battlefield series, have since managed to surpass the whole thing via CPU hackery anyway.

      • JamesTheNumberless says:

        Not sure I can blame Creative for the loss of EAX; my X-Fi card had it, and was also 7.1 (another thing that’s fallen by the wayside) – it’s just that nobody bought them! For a long time the vast majority have been making do with their onboard sound. Creative are only recently following the trend/conceding defeat, in that the current “flagship” Creative card doesn’t have EAX and is 5.1.

        Another thing that’s happened is that nobody cares any longer about the latency of inputs. There was a time when, if you wanted to record from other sound hardware, you needed a soundcard with ninja DACs. However, the trend has been for audio interfaces on things like guitar amp modellers, mixers, and so on to provide their own hardware and act as specialized audio devices when connected to the computer by USB, completely bypassing the soundcard – or to be used with a cheap USB interface that gives you specifically what you need for recording… This was obviously the way things had to go, because no motherboard manufacturer was making sound hardware anywhere near good enough to record and play back in real time.

    • Vandelay says:

      If the sound that comes out of your computer comes pretty much purely from games, then I don’t think a sound card is worth it. I had one for a while and game sound was better, but not massively so, with Battlefield being the only time I really, really noticed a big difference. If you listen to music or watch films on your computer though, a sound card will give a good boost to both. Even music from Spotify sounded richer.

      Nowadays though I have an AV receiver with home theatre 5.1 speakers hooked up, and there is no comparison with onboard sound. Games are still the weakest in their use of sound, but music and films sound wonderful. Just watched It Follows and the soundtrack, particularly its use of bass, was hauntingly beautiful. Onboard sound would not have been able to achieve something so powerful, but neither would a sound card.

      • JamesTheNumberless says:

        My primary reason for having a good soundcard was recording, but like you I also benefit from having a better experience with music and movies, yet have been really let down by how little this matters in games. I remember when it was the other way around, most soundcards were unremarkable at playing digital audio but the reason you had one was because it would make a drastic difference to how the MIDI music from games was played back.

        I’ve been through 7.1 speakers, massive subwoofers, etc, etc. Now I just have my soundcard going out to a proper hi-fi amp with a set of nice stereo speakers and a pair of headphones and it’s a lot better for games than 7.1 ever was.

      • mukuste says:

        If you notice a difference between on-board sound and a dedicated sound card for stereo music playback, then your on-board sound must have been really shitty. The difference in SNR and frequency response on those things is really below the audible threshold these days.

        The only good reason to have a dedicated sound card for music is really if your on-board chip picks up interference from other components like CPU or GPU, and unfortunately that does still happen.

      • iainl says:

        Onboard sound is just fine in 5.1.

        If, and only if, you’re taking a digital line off the PC – which, since many will be running sound over HDMI, isn’t that unreasonable a concern.

        I suppose that means it’s technically using your GPU as a sound card, rather than onboard, though?

  5. Bluerps says:

    I don’t miss that time at all. As a kid I immensely enjoyed improving my PC (as long as nothing went wrong with the installation – I remember one terrible evening when I almost destroyed a new 3D card), but it didn’t happen very often, because of the costs involved.

    Later, I got an entirely new PC with good hardware as a present from my parents when I finished school and started university. That was in 2002. After that, improvements to that machine were minor due to lack of money (I think I bought a cheap graphics card 4 or 5 years later), and I mostly had the same PC until 2009.

    This means that I associate this time of constantly improving hardware mainly with becoming increasingly out of touch with current games. I went from “Hurray! I can play all the games on highest settings!” to “Cool, Half Life 2 is playable when I reduce settings!” and finally to “Man, this ‘Mass Effect’ sure sounds great. I wish I could play it, or any of the other games that were reviewed in the same issue of this gaming magazine.”

    I still have that PC from 2009. I upgraded the graphics card twice and installed an SSD, and there isn’t a single game I can’t play due to lack of good enough technology. I never want to go back to the old times.

    • JamesTheNumberless says:

      I had a similar phase: I went with just a laptop from 2003 until 2007, and not one that could play even the majority of existing games, never mind the latest ones. I had to beg, borrow and steal time on other people’s computers just to play Oblivion. So in 2007/08 I went totally over the top and built the most insane spec I could manage; this machine lasted until at least 2011, when I had to upgrade because my graphics card died. Besides putting in a new sound card a couple of months ago (again because the old one died), my PC is unchanged since 2012 and still plays most things at high settings at 2560×1600… I will probably upgrade the gfx card again soon, but at this rate, upgrading the graphics every 3 or 4 years is something I’m very comfortable with compared with the bad old days of the early 2000s, when so many people were mis-sold things that weren’t fit for purpose.

    • JFS says:

      I feel you, mate. I really don’t miss those times.

    • Napalm Sushi says:

      Another one for the “good riddance” camp here. With the financial mire I’ve waded through for the last half-decade, there’s no way I’d be playing current-generation games if the PC’s tech cycle was anything like what it was in the late ’90s.

      • LexW1 says:

        For sure. There’s no way I could justify dumping that much cash into a PC (let alone two PCs, my wife’s as big a PC gamer as I am – it’s how we met, even) these days. I’d have been console the whole way.

        Console games aren’t as cheap as the better class of Steam sale, but second-hand they’re pretty damn cheap.

    • LexW1 says:

      I almost miss it, but then I remember I’m not rich, and don’t even have enough money to get on the housing ladder, or for decent holidays, or to go out more than about once a month.

      So at this point it seems fucking stupid and I’m glad it’s gone.

      I can see how decently well-off people might miss it, but jesus, no. The days of having to buy a new graphics card every year, having to upgrade your PC seriously every two years? Those were days for people with money to burn. If it ever comes back, I’ll probably be forced on to consoles. Fortunately, I don’t think it will.

    • JamesPatton says:

      Absolutely agree. I was in a similar situation – I was never well-off enough to have a top-of-the-line PC, so whenever I upgraded I would get a “pretty good” one which would be dated in a year and obsolete in four.

      I can understand that some people look back on this time of breathless upgrades with nostalgia, because there was this excitement to it and you were pushing boundaries and it was a huge part of the rhythm of your youth. But you know what? This tech-race? It’s a rich person’s game. It’s how people with enough money to blow on a £300 disk drive spent their time. I didn’t have a great experience with it, and my parents are still fairly well-off middle class people. I shudder to think of how many hundreds of thousands of people were excluded from this vast, exciting, earth-shattering time simply because their family couldn’t afford a decent machine.

      Anything that lowers that barrier to entry – ANYTHING – is good. Sorry to burst your nostalgia bubble, but those happy memories were not happy for a lot of other people.

    • LionsPhil says:

      Yeah, and it’s not just financial. I don’t want to spend the time on it any more. I’ve set up a Windows install with all the trimmings enough times in my life that it’s not on the list of things I want to burn a weekend doing any more; I want to actually play the games or do the creative work the system is for. I want my working system to keep on chugging, and keep on chugging it has for the best part of a decade now.

  6. tehfish says:

    I’ve definitely noticed the tech spec race slowing to an absolute crawl.

    I still remember the time when you needed to replace the GPU almost yearly just to stand still. Yet my current GPU (and CPU) are fully 5 years old now and still playing brand-new games, such as FO4, at perfectly playable speeds.

    Can’t complain though. I was worried at the time I’d bought said components that I’d gone overboard and splurged far too much money on them; five years later they turned out to be the best tech hardware investments I’ve ever made :)

  7. JFS says:

    But how many graphics is good graphics?

    • GameCat says:

      6 graphics should do, but some folks say that it’s not enough until you have at least 8-10 graphics.

    • Biscuitry says:

      About the same number as last year.

    • guygodbois00 says:

      Always look for hardware with All the graphics, of course.

  8. Zenicetus says:

    Well, if you’re feeling that nostalgic for a PC that can’t keep up, just boot up a modern flight sim like X-Plane. Set all the graphics options to maximum, full cloud modelling, HDR lighting, and then try flying over an area with an Ultra-HD scenery mesh and a squillion 3D buildings, like a major city.

    There are still a few programs like this, that are constantly updated and intended to stay just a wee bit ahead of the fastest hardware available. Not even trying to run on a console is one reason they can get away with it.

  9. Stompywitch says:

    having for so long been in the relatively safe world of Atari STs (520, 1080)

    It was 1040, not 1080.

    I can no longer subscribe to a gaming website that is 40… things… out. Good day to you, sir.

  10. Marclev says:

    “I’m kind of sad to see that slow down, even though I know it’s obviously for the best.”

    Interesting perspective. The reason for the slow down is that everything has to be compatible with consoles because of “lowest common denominator”, therefore nothing pushes PC capabilities until the next console generation comes out (and even then I’m not seeing much so far).

    Imagine where we’d be without consoles holding back PC innovation, if developers pushed the PC’s capabilities like they did in the old days.

    It’s a pretty sad state of affairs! On another note, it amazes me that Nvidia continues to survive selling consumer hardware, seeing as you can run pretty much all games on “Ultra” settings without having their top model.

    • LexW1 says:

      I can imagine that.

      The PC gamer market would be a fraction of the size it is now, because hardly any adult could afford to involve themselves in it, esp. post-2008 (where the wages of everyone but the top 5-10% continue to fall, in real terms).

      PCs would not see AAA games as a result – big developers and serious studios would focus solely on consoles.

      Consoles would have much, much better support for game genres that don’t really appear on them much (strategy etc.), because of the huge numbers of “PC Exiles”.

      MMOs would be mostly on console.

      PCs would indeed have games with waaaaay fancier graphics than they do now, but it would be a very narrow selection of games, likely all developed on a serious budget, probably reliant on being expensive and with HUGELY expensive DLC, like a lot of the high-end simulators are on PC now. Gameplay would be very far secondary to anything else, though given the lower budgets, there might be more experimental games (or not – indie devs might well stick to consoles where they could actually earn money).

      VR would probably have been around on PCs for a couple of years in a serious way, but would be a niche of a niche market.

      I could go on, but I’m really not sure this rather Bioshock-esque isolated world of Super-PCs, working for the sweat of their brow, would actually be a better one.

    • RobF says:

      “Imagine where we’d be without consoles holding back PC innovation, if developers pushed the PC’s capabilities like they did in the old days”

      That’s easy. We’d be in trouble and PC gaming would be unaffordable and unattractive to most folks.

      • Hobbes says:

        Not entirely true, great games will always exist at a variety of performance points. However, with consoles and multiplatform releases now binding just how much power can be wrung out of PC releases (to some extent, we may see that change with widespread adoption of DX12), at this point we’re experiencing a fallow period.

        There will always be games like Dungeons of Dredmor though, ones you can run on the equivalent of a toaster.

    • Tam-Lin says:

      “The reason for the slow down is that everything has to be compatible with consoles because of “lowest common denominator”, therefore nothing pushes PC capabilities until the next console generation comes out (and even then I’m not seeing much so far).”

      Not true. The reason for the slow down is that we’re hitting physical limits that prevent hardware from getting faster at the rate it used to. Chips can’t get any faster; there’s nowhere to dump the excess heat. The parts keep getting smaller, but that leads to all sorts of weird things happening at the quantum level, which we don’t know how to deal with. What we can do, and have been doing for years, is grow the number of processing units (SMT, multi-core chips, graphics cards, etc), but then you shift the problem from hardware to software, and we don’t know how to parallelize a lot of problems, so having multiple cores doesn’t help much.

      I work on this issue in the server world; trust me, we’d kill to be able to make chips faster, but we can’t. There’s a great essay about this here: link to usenix.org.

      • Hobbes says:

        Processors have already hit the thermal limit for silicon, or near enough. Intel is looking at indium gallium arsenide (InGaAs), I believe, as the next step to get below the 5nm die process.

        At 5nm, at what point do we start getting into discussions of fractions of a nanometre? There was also talk of some kind of vertical stacking of transistors too – where’d that go?

      • Shuck says:

        Yeah, it’s true – Moore’s Law technically ended a few years ago, thanks to heat dissipation limits, although there was also a slow-down in the rate at which transistor size decreased (we’re about to hit the physical limit of that as well, very soon). But even before that point, Moore’s Law saw diminishing returns in terms of graphical quality. In the early days of 3D, a doubling of polygons saw a huuuuge increase in the visual quality. These days, it’s pretty much unnoticeable, especially when you factor in the visual tricks developers use to fake the appearance of more polygons.
        I can’t say I’m sorry to see that era end, from a gaming perspective – there wasn’t much of an upside to seeing hardware go obsolete almost instantly, and the cost of game development double every few years.

      • Flatley says:

        That’s no mere “great essay,” that is a life-altering work of informative comedy. I’m citing it in every paper I ever write again just so I can feel connected to something beautiful.

      • LionsPhil says:

        James Mickens is a scholar and a poet.

    • JamesPatton says:

      “Imagine where we’d be without consoles holding back PC innovation”

      We would have photorealistic graphics and the ability to render thousands of hi-poly objects on screen at once, with really fancy water, lighting and other shader effects.

      Oh wait… we already have all that?

      Seriously, I have no idea what benefits we could get from better tech. Our graphics look AMAZING. I CANNOT IMAGINE better graphics than we have now. We can already simulate a huge number of things on-screen. what more do you want? :S

      • mukuste says:

        While I don’t subscribe to the whole PC Gaming Master Race bullshit due to the reasons cited above, I find it hard to believe you can’t imagine better graphics. I mean, look at your video game and then look at the latest Pixar movie or movie CGI. Heck, look at actual movies. There’s obviously still a long way to go to photorealism.

        That said, I fully agree that what we have now is “good enough” for most purposes. Actually, for me, most games released after 2010 and even many released after 2005 still look good enough for my eyes in the sense that they can look attractive and the graphics don’t degrade my enjoyment of the game.

        • JFS says:

          Also, photorealism is nice, but doesn’t equal beauty, art or usefulness. I guess that also plays a role, seeing as there is no clear “better” to strive for nowadays.

      • amaranthe says:

        Agreed, but I don’t think “innovation” here has to be limited to graphics.

        I do think the need for games to be multiplatform holds back innovation a lot, but more in the realm of mechanics, gameplay, and peripheral tech. VR is a good example – if we were only concerned with PC, we wouldn’t have this VR race between the different systems all trying to make basically the same thing – the work could be more focused, and we’d probably already have VR by now.

        I’m not saying competition isn’t a good thing – but the fact that the competition is so proprietary is what’s killing it in my mind.

        And I’ll also add that I always have to lament a multiplatform release because of how it gunks up the controls. FO4 is a good example – keybindings are not customizable, and some of the choices are a bit strange and clunky at times on PC. And there have been much much worse examples – the Final Fantasy MMOs have always had terrible menu systems and controls due to being console-friendly.

        Overall I’m really glad I don’t have to upgrade my PC every 6 months to play games, but I still really hate the fact that consoles haven’t died yet.

  11. Syneval says:

    Not nostalgic either … even though from a financial point of view I can stay current with whatever.

Another factor: PC tech is not only stagnating because of the console glass ceiling, but also because creating hyper-detailed graphics takes an enormous amount of time and money!

    And even in the current situation, there is still plenty of scope to push things: 1440p or 4k, ENB, SweetFX, and mods will still challenge my 6700k + 980ti setup.

  12. thedosbox says:

    Heh. While it was fun to research and buy a shiny new toy every two years (partly funded by the sale of the old one), I don’t really miss the hassle of having to reinstall everything on a new box quite so often.

  13. Jenks says:

    How about buying that new modem every two years through the 90s?

    300! 1200! 9600! 14.4! 28.8! 33.6! 56k!

    Every one of them made the internet, and gaming, so much better. No one has really needed a NIC upgrade since they got their first one.

    • tehfish says:

NIC upgrade? Depends on whether you use network shares at home. :P

Internet-wise you can ignore it, but try to chuck multi-gig files across the LAN and then it matters a lot :)

      But yes, for pure internet it matters not a jot generally.
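The LAN point is easy to put numbers on. A minimal back-of-envelope sketch (the helper name is hypothetical, decimal units are assumed, and protocol overhead is ignored) of why the NIC still matters for local transfers:

```python
# Rough transfer-time arithmetic: how long a file ties up a link.
# transfer_seconds is a hypothetical helper; overhead is ignored.

def transfer_seconds(size_gb: float, link_mbps: float) -> float:
    """Seconds to move size_gb gigabytes over a link_mbps link."""
    megabits = size_gb * 8_000  # 1 GB = 8,000 megabits (decimal units)
    return megabits / link_mbps

# A 10 GB file over Fast Ethernet vs Gigabit Ethernet:
print(transfer_seconds(10, 100))    # 100 Mbit/s -> 800.0 seconds
print(transfer_seconds(10, 1_000))  # 1 Gbit/s   -> 80.0 seconds
```

In other words, at 100 Mbit/s that multi-gig copy occupies the link for over thirteen minutes; at gigabit speeds it’s a little over a minute, which is roughly why the upgrade is invisible for internet use but very visible on the LAN.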

  14. mattevansc3 says:

I believe we are seeing a stagnation because all of the easy paths have been trodden. Of course no engineering feat is easy, but technology-wise the PC hasn’t changed that much.

AMD’s APU was really just putting the Northbridge on the CPU. While DDR RAM is a major improvement over SDRAM, it’s still memory modules on a stick that you put in a slot. Multi-core CPUs are just an evolution of dual-CPU motherboards. Performance-wise you can’t compare a GeForce 980 Ti to a GeForce 256, but both still follow the same pattern: GPU and RAM on a PCB, put in a high-bandwidth motherboard slot.

We’ve gotten to the end of the tech tree for most of these parts. We won’t see significant PC performance increases until Intel, AMD and NVIDIA make their own HDD-to-SSD-style leap.

  15. jellydonut says:

    It seems like we are finally going to see our hardware being challenged once VR finally launches.

    I’m still not holding my breath for either the Vive or the Oculus. I’ve gotten too used to these companies taking a giant shit on their release schedule, so now I find it hard to care until I can actually order the damn things.

  16. Darth Gangrel says:

    At first sight, I thought you missed William Shatner’s magnum opus TekWar.

    As for the topic at hand, I have a massive backlog stretching back almost a decade, but am still able to play some recent games on my 4 ½ year old dual core laptop. I like that, but I guess there is a certain charm to the nonsensical buzzwords surrounding new graphics and whatnot.

  17. MiddleIndex says:

    PC wins.

That’s what happens when there’s no one left to crush: you grow fat and wasteful.

  18. Hedgeclipper says:

Everyone complained about Crysis, but honestly it was fine on an older machine if you didn’t mind turning the settings down from “make my eyes bleed with razor-sharp individually modeled leaves” to “pick my jaw up from the floor”. Honestly, PC gaming was usually pretty good at handling new games if you weren’t chasing the latest shiny.

    The real frustration of the tech race was having a billion different sound and graphics cards barely or never supported by the manufacturer or the game you wanted to play.

    • mrbright01 says:

I never chased the latest shiny, and I never had to upgrade anything more than my graphics card during the lifespan of each computer (about 4 years for me, before it starts to have enough issues that I find it cheaper and easier to replace than repair). I am currently playing FO4 with almost no problems on a sub-par pre-i3 CPU and a midrange graphics card from two years ago. Bleeding edge? Nope. I have to cut down on some of the lighting and shadow options. But still bleeding wonderful to play, and incredibly good-looking.

  19. Styxie says:

    I did a new build last year, so right now I’m more interested in seeing how DX12 works out. Specifically I’m looking forward to being able to use the really good integrated graphics on my Intel chip (which never sees any use) and pushing up performance a bit alongside the 970 I’m using. Hopefully that’ll stave off the need to upgrade a bit longer over the coming years.

  20. Raoul Duke says:

“Come 2005/6, the two big consoles launched with tech comparable to a decent PC.”
    As someone who owned both a decent PC in 2006 and a PS3, I am fairly confident that this isn’t quite right and may be more a case of memory making the PS3 seem better than it was.

E.g. the PS3 had a measly 256MB of RAM, whereas high-end PCs at the time were in 1GB territory. The PS3 had 256MB of VRAM, while PC video cards were in the 512MB range at that point.

And the games that actually took full advantage of the PS3’s fixed hardware only came out later in its life – MGS4, Uncharted etc. were not around at launch, and really it is later games like Assassin’s Creed Black Flag, Uncharted 3 and so on that look good on PS3. To put things in perspective, Crysis came out in late 2007, about a year after the PS3 launched.

    • mukuste says:

      “Decent” is not the same as “high end”.

      Pretty much the same situation with “next gen”, where the PS4 and Xbone came out in a state that put them roughly on par with a good, but not top-of-the-line, gaming PC.

      You simply can’t put a high-end PC into a box and sell it for the prices that console buyers are prepared to pay.

    • iainl says:

      Whatever the actual spec levels of them, Oblivion on the 360 looked about as good as a mid-range GPU could make it look on the PC. Which was as good a comparison of gaming power as any, really.

  21. Nereus says:

    I see no mention of another benefit of slowed hardware advance. Less e-waste. E-waste is a huge problem in some parts of the world, and in many cases it’s a problem that gets exported from developed countries to developing ones. I am glad I don’t have to continually burn through components like they’re apple cores – because they’re not. Some of the components were mined by children in Central Africa. Others were illegally sourced from mines in the Peruvian Amazon that pollute the waterways and deforest the land. Some of them were likely assembled in less than healthy working conditions. I really enjoy my hobby, and I wish companies put more effort into ensuring more sustainable and humane sourcing of the materials. But because they do not I am very glad I don’t have to ditch my motherboard and all its attached parts every 2 years.

  22. Uninteresting Curse File Implement says:

“Now, my PC insides are a couple of years old, and playing everything on maximum.”

This doesn’t sound right to me. Either that, or people on forums and in RPS comments threads have been lying, because according to them you cannot fully max out Witcher 3, GTA5 or Fallout 4 on a GTX 970, the current standard. I own a 4-year-old card, and higher-quality settings in games might as well not exist for me. The arms race is still very real; games just don’t look dramatically better, yet they still run slower as the years go by.

    • drinniol says:

There’s the epeen maximum, which is with as much AA as you can get, and the real-world maximum, which is just enough AA to stop jaggies. Normally 4x does it.

    • Machinations says:

I have an i7 with 16 GB of RAM and a fast bus on the mobo (Asus military class), no SSD because it’s for suckers (sorry!), and lots of LEDs because dayum it’s cool, no?

The thing is now over 3 years old, with no upgrades, using a GTX 670 – and I ran both Witcher 3 and Battlefield 4 at absolute maximums (well, in BF4 I only upscale the resolution by 150%). This on a 2560×1080 native resolution.

      Do not skimp on CPU, and future proof yourself. Good riddance to the bad old days, honestly.

      • Legion1183 says:

“no SSD because it’s for suckers”

Please elaborate. I’d love to know why you think SSDs are for suckers (unless of course you are trolling, in which case – you got me).

  23. mukuste says:

GPUs don’t really seem to get cheaper anymore. I built my current PC when the 970 had just come out and decided to go for a budget GPU (R7 260X) to save some money, since I had a huge backlog of older games to catch up on anyway. I figured within a year the 970 would be significantly cheaper and I could upgrade then.

Now it’s more than a year later and the 970 hasn’t budged a millimeter in price. Is that the normal state of affairs? Or is it just because the 970 is such a massively popular piece of hardware?

  24. melerski says:

    We have reached the point of diminishing returns.
It’s the reason the new consoles are not cutting-edge technology. We don’t need more (OK, there are a few exceptions).

    As mentioned before, gfx are pretty as a mofo now. What more do you want? Dev brains are the limit now, not the CPU.

  25. duffster says:

You can still keep up with the arms race; I game on a 1440p 144Hz system. Completely over the top? Yes, but it does look very nice.

  26. Darloth says:

    Not sure what’s up with your motherboard, but I don’t need anywhere near that much force to get a RAM stick in.

    Now, the aftermarket CPU cooler locking pins… I had to lean my entire body weight on those, all the while thinking “please don’t crack, please don’t crack!” as I basically did a clumsy, wrong-angled push-up over a £200 chip only a few mm thick. Plus they always hurt your thumbs something awful…

  27. john_silence says:

    Alternatively, there’s a pretty good dog vs cat war below John’s Dogmeat post, a few entries above. I’m a multiplatform guy myself, but I’ve been playing with a dog exclusively for the last 9+ years (apart from casual brushes with other people’s cats) and it still plays the most demanding games flawlessly. Requires a bit more maintenance maybe, but for sure I’m not going to replace that fluffcase anytime soon!
    A few weeks after we got her I had a dream in which we gave her back and adopted a turkey instead, which I guess was the petquivalent of shifting to Xbox.

  28. SteelPriest says:

Just bought a 980 Ti and a new waterblock. These things are still expensive (although how much more it costs in real terms than my Hercules 3D Prophet II GTS 32 MB did in 2000, I’m not sure).

  29. thinkforaminute says:

    The 486 your dad bought? That made me feel old.