Why You Don’t Need More Than Four CPU Cores

We’re back and this week I’m saving you even more money by telling you why you don’t need more than four processor cores in your PC for gaming. You don’t need more now. And you almost definitely won’t need more for several years to come. What’s, er, more, even if your cores are quite crusty, you’re probably fine.

Back in the days when I was a bright-eyed, bushy-tailed young technology hack, Intel had a clear vision of the future. It would be multi-core. And not just any old multi-core. But massively multi-core. Yes, massively.

Two cores today, tomorrow four, then eight and 16. It would never stop. And why not? After all, that Moore’s Law thing allows for a doubling in chip complexity roughly every two years. Spend that burgeoning transistor budget on cores and an exponential trajectory for core counts is inevitable.
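
Just to put numbers on that roadmap (a back-of-the-envelope sketch of my own, not anything Intel ever published): start with the two cores of 2005, double every two years, and the core count runs away from reality very quickly.

    # Back-of-the-envelope projection, not an official Intel roadmap: start at
    # two cores in 2005 and double the count every two years, Moore's-Law style.
    start_year, start_cores = 2005, 2

    for year in range(start_year, 2016, 2):
        doublings = (year - start_year) // 2
        print(f"{year}: {start_cores * 2 ** doublings} cores")

    # 2005: 2, 2007: 4, 2009: 8, 2011: 16, 2013: 32, 2015: 64
    # Reality in 2015: mainstream desktop chips still top out at four cores.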

Funnily enough, it didn’t turn out that way. It was way back in 2006 that Intel launched its first quad-core desktop PC processor, which did indeed arrive roughly two years after its first dual-core model. Nearly a decade later, that’s largely where we remain. Four cores.

Yes, Intel sells six- and eight-core CPUs for PCs. As does AMD. But Intel’s offerings are really server chips rebranded and not for mainstream consumption. It’s debatable, meanwhile, whether AMD’s qualify as true six- and eight-core processors, if we’re talking Bulldozer-based models, and the Hammer-based six core processors are now goners.

Anyway, even if you want to ignore all that and enter those six- and eight-core chips into the reckoning, we’re surely still well behind schedule. For context, Intel is now doing server chips with up to 18 cores. That’s much more in line with the picture Intel was painting back in the mid 2000s.

Even supposedly CPU-intensive titles like BF4 usually don’t scale well beyond four cores…

However you slice it, then, we’re well behind what the industry was predicting for core counts in desktop PCs. Then there’s the fact that games consoles have actually gone further down the multi-core path than PCs. Both Xbox One and the PS4 have eight-core CPUs. Whatever you think of gaming consoles, they set the target for mainstream game development.

The final, most recent, piece of the puzzle is the upcoming DirectX 12 API which arrives with Windows 10, or more specifically the D3D12 graphics subset thereof. It’s due to introduce a new rendering paradigm which Microsoft claims will allow game engines to spread their workload across multiple cores much more effectively and efficiently.

Put it all together and being stuck on four cores for desktop PCs ought to be a major problem for gaming. And yet it isn’t. Four cores is enough.

Let’s start by quickly burying the console comparison. The AMD CPU cores in both consoles are pretty feeble. They do dramatically less work per operating cycle than even AMD’s desktop PC cores, much less Intel’s, and they’re clocked well below 2GHz in both cases. In other words, a proper quad-core PC processor of almost any kind has those eight-core console chips squarely beaten.

I don’t say that in the name of sneering PC elitism. I say it merely to underline the fact that the consoles’ eight-core spec is irrelevant. Instead, what matters is that time and again, benchmarks show that frame rate scaling in games drops off as you go beyond four cores. That applies even when you are running the fastest graphics cards. The bottleneck beyond four cores is almost always the graphics, not the CPU.
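
If you want the textbook version of why the scaling tails off, it’s Amdahl’s Law. Here’s a minimal sketch; the 40/60 split between nicely parallel CPU work and GPU-bound or serial work is an illustrative assumption, not a measured figure.

    # Minimal Amdahl's Law sketch. The split is an illustrative assumption, not a
    # measurement: say 40% of each frame is CPU work that parallelises nicely and
    # the remaining 60% is GPU-bound or serial.
    PARALLEL_FRACTION = 0.4

    def speedup(cores, p=PARALLEL_FRACTION):
        """Overall frame-time speedup on `cores` cores, per Amdahl's Law."""
        return 1.0 / ((1.0 - p) + p / cores)

    for cores in (1, 2, 4, 6, 8):
        print(f"{cores} cores: {speedup(cores):.2f}x")

    # 1: 1.00x, 2: 1.25x, 4: 1.43x, 6: 1.50x, 8: 1.54x -- going from one core to
    # four buys ~43%, going from four to eight buys ~8%. Hence the flat benchmarks.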

Even DX12’s multi-threaded shizzle looks unlikely to scale beyond four cores

In fact, it’s not just that more than four cores isn’t better. It’s frequently worse. That’s because most games simply won’t make use of the additional cores, and Intel’s highest-clocked chips are quad-core, not six- or eight-core. In reality, the difference isn’t enough to actually feel in games. Instead, it just helps drive the point home. Four cores is usually plenty.

Are there any exceptions? Running multiple graphics cards can see six-cores and beyond take a marginal lead in the benchmark tables. But then I’ve already explained why you don’t want to run multiple graphics cards.

As for individual game titles, I’m sure some of you could find a game that runs faster on six or possibly even eight Intel cores. But they are very few and very far between. Check out these numbers on Bit-Tech for Battlefield 4, a game supposedly renowned for scaling beyond four cores. Yup, thoroughly GPU limited.

Yes, you could probably find a setting with all the eye candy disabled where a six- or eight-core chip carved out an advantage. But it would all have to be pretty contrived.

If you’re wondering about AMD’s chips, if anything it’s all simply worse. AMD’s cores are weaker than Intel’s and as things stand they’re tough to recommend for anything but a very, very low-budget gaming box. AMD took a bet on multi-threading with its Bulldozer chips, and I’m afraid it didn’t pay off, especially for gaming.

An Intel Sandy Bridge quad or better means you’re good to go for, well, pretty much everything

But if there’s anything that really convinces me that four cores are going to be enough not just today but also for the foreseeable future, it’s some early benchmarking of DirectX 12. Anandtech has some benchmarks of Star Swarm, a demo designed to show DX12’s new multi-threading prowess at its very best. And it shows zero benefit beyond four cores.

So, that’s one reservation for the future covered off. DX12 doesn’t seem likely to suddenly make six- and eight-core CPUs relevant. In the light of the awesomeness that appears to be Valve’s new Vive VR headset, should or at least could virtual reality be another? Personally, I doubt it. The challenge of rendering VR is primarily a problem involving big pixel grids and fast refresh rates. I don’t see dramatic new CPU workloads. So like other games, VR will be GPU limited, not CPU limited.
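
To put some rough numbers on the VR point (using the commonly quoted Vive panel figures of 1,080 x 1,200 per eye at 90Hz, so treat this as ballpark rather than gospel), the extra demand is almost entirely pixel throughput, which is to say GPU work:

    # Ballpark VR pixel throughput versus a conventional 1080p monitor at 60Hz,
    # assuming the commonly quoted Vive figures of 1080x1200 per eye at 90Hz.
    vive = 1080 * 1200 * 2 * 90        # both eyes, 90 frames per second
    monitor = 1920 * 1080 * 60         # one 1080p panel at 60 frames per second

    print(f"VR headset: {vive / 1e6:.0f} Mpixels/s")
    print(f"1080p60:    {monitor / 1e6:.0f} Mpixels/s")
    print(f"Ratio:      {vive / monitor:.1f}x")

    # Roughly 233 vs 124 Mpixels/s, i.e. about 1.9x the raw fill demand before any
    # render-target oversampling -- and all of it lands on the graphics card.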

With all that in mind, all you really need to know is how old is too old when it comes to quad-core CPUs. There’s an element of subjectivity here. But I’d say anything from Intel Sandy Bridge onwards – so that’s Core i5 and i7 2000-series chips – is absolutely dandy.

To be honest, most of the time I suspect a high-clocked quad-core Nehalem chip – i5 7xx and i7 8xx and 9xx – would be hard to subjectively pick from the latest Intel Haswell chips, in-game, as these numbers suggest. It’s not that Nehalem is every bit as fast as the latest CPUs. It isn’t. But can you reliably tell 75 frames per second from 85 frames per second? I’m not sure I can. It’s pretty remarkable given Nehalem came out in 2009.
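
For what it’s worth, that gap sounds bigger in frames per second than it does once you convert it into the per-frame render times your eyes actually experience:

    # 75 vs 85 frames per second, expressed as per-frame render times.
    for fps in (75, 85):
        print(f"{fps} fps = {1000 / fps:.1f} ms per frame")

    # 75 fps = 13.3 ms, 85 fps = 11.8 ms -- a difference of roughly 1.6 ms per
    # frame, which is a hard thing to feel without a frame counter on screen.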

Whatever, there you have it. If you have a remotely recent quad-core Intel CPU, certainly within four years old and probably within six, do nothing. All, for now, is well.

168 Comments

  1. Wisq says:

    That said, going beyond four cores can potentially help if you’re doing various other game-related activities:

    * recording / streaming game video
    * editing / encoding video you’ve recorded afterwards
    * pointer scans in Cheat Engine
    * running multiple games at once — hey, some games have boring bits :)
    * … and other tools that support your gaming but do take a decent chunk of CPU

    It should also help more in the future, because we’ve mostly maxed out the raw CPU core speed, and the only place to go from here is more cores. Games are going to have to become more parallel (or just more efficient) if they’re going to continue to get more complicated.

    • AGTMADCAT says:

      Plus Chrome with 40+ tabs, media player classic playing some video or another, a few work programs of varying intensities, and suddenly nearly half of my 8-core is already in use without even launching a game.

      • Raoul Duke says:

        Someone always bobs up with this comment on articles like these.

        Seriously, who the hell is using 40 tabs at once?

        And who is simultaneously working and watching a high def movie AND using 40 tabs? What kind of job do I need to get where watching movies is a viable option while working?

        • TacticalNuclearPenguin says:

          Editing your photos while other crap is running on other screens, as a random example.

          Really, it’s possible, it’s just a very chaotic lifestyle that can absolutely be “fixed”. The fact that some people prefer to fix it with more horsepower rather than forcing themselves to operate differently is still legit though.

          • Huels says:

            For me it’s just about having the horsepower even if I never use it, and then that one time that I do use it, I’m thankful.

            In December 2009, I built my most expensive rig. I did not hold back on anything. Price was not going to be an issue. I picked up a dual-socket motherboard and had two Gulftowns, which were Intel’s very first six-core processors. Each cost $1,200 and I had 12 physical cores in my PC along with tons of other stuff. In 2011 I even tried to get the CPUs to rise above 50% workload and it took a whole lot of stuff running to do that.

            People like you and I can and will buy what we want and we appreciate being able to have the power we want.

          • Don Reba says:

            You could start a raytraced rendering, and it would use up however many cores you have.

        • Weedus says:

          “Seriously, who the hell is using 40 tabs at once?”

          I do. All the time.

          • Chalky says:

            Some people define “using” as more than just having the tab loaded sitting there not being looked at.

        • P.Funk says:

          *Raises hand*

          Sometimes more tabs. It’s like working at a desk in a library with all your books at hand. I just have tabs open, go do something else, come back, read them, use them, then eventually start closing down the ones I don’t need. Sleep my computer and all that stuff I was going to read is still there.

          Sometimes I have two browsers open with 80 combined tabs probably. It annoys the hell out of anyone I tell for some reason. Why is that?

          • Valkyr says:

            Probably because we can’t.

          • P.Funk says:

            Hey, I’m rocking a very old Core2Duo, so I dunno why anybody can’t. :P

          • kael13 says:

            Likewise. I’ll often have a good 40 tabs of personal stuff, and then another 20 of university-related, study tabs. But then I am running 6 gloriously overclocked cores and 16GB of RAM.

            Trust me, more power is better than less power. Yeah, some games are horrifically threaded, but that’s only set to improve. And I tend to keep my main board and CPU for 4 years before upgrading.

          • kael13 says:

            For some reason I can’t edit, but even now, I have more than 60 tabs open in a single window.

            … It doesn’t look like 60 tabs.

        • AGTMADCAT says:

          Well not actively using simultaneously, obviously. But 10-20 technet etc. tabs open all related to a thorny Active Directory problem, 5 more on car-buying support tabs, plus 10+ used car listings. And then a few RPS articles that either I want to finish reading or want to go look up the game more. Plus gmail and feedly always, of course. And then a few webcomics that I’m currently reading from the beginning, so once I’m caught up I can add them to RSS. Oh yeah, and then another client had a weird Exchange problem, so that’s a few more tabs. A couple of Amazon pages for stuff I’m comparing, a rail map of Europe, a rail pass website, and a couple of other vacation-related things for when my wife gets home and we can resume planning… it all adds up! mostly it gobbles up RAM, but some of these pages are also eating up more CPU than I would expect.

          And being an IT consultant means you’re always working, but can often work from home, and it’s nice to be able to consume media while fightpunching progress bars. =)

          • Baines says:

            RPS might be part of your CPU usage problem. I’ve posted about it in the forums, but RPS is currently running a background ad for Book of Unwritten Tales 2 that appears to eat a substantial chunk of the CPU when the animated bit is onscreen. And it eats another sizable chunk of the CPU whenever you are moving the mouse.

            (I believe it also can make Firefox become unstable over a long enough period of time.)

          • Shadowcat says:

            Baines: Sounds like you should be using link to noscript.net

            Don’t let web pages run stuff that you’ve not approved. It’ll do wonders for your CPU usage.

        • Person of Interest says:

          Sometimes when I restart Firefox after a crash, it asks if I want to re-open the 300 tabs I have spread across 20 spaces in 4 windows. I always say yes. It’s fun to switch to some old tabs and see what was interesting to me 6 months ago… or learn which sites have since gone offline.

          I also open another Chrome window whenever the tabs get too small to individually click on, rather than figure out which tabs I can close.

          If all this sounds like nails on a chalkboard to you… then my work here is done.

          • MattM says:

            I can’t leave Firefox open for more than a few days. It gobbles up memory and slows other applications down.

        • Synesthesia says:

          Me! I’m a tab monster. I currently can’t even see what each tab is on my home monitor, though after this reading session the icons will probably pop up. Work computer is even worse, with a multimonitor setup. There’s just too much interesting shit to read!

        • snowgim says:

          Graphics for game dev. :P
          I often have multiple graphics programs open at once (two game engines, photoshop, substance, 3dsmax), plus spotify, firefox with about 10-20 tabs (I try to keep tabs to a minimum), 1 separate firefox window with youtube playing in HD and occasionally a game running in the background that I can alt-tab to once in a while.
          My 6-core handles it pretty well, not sure how much of a difference over 4 cores it would be though.

        • Vin_Howard says:

          Well if he has Google+ in one of those tabs, that alone counts as 40+ tabs (although that’s mainly just a massive use of RAM)

        • Buuurr says:

          Ummmm… usually people with jobs use 40+ tabs at once… I am a dev… I have a minimum of 32 open at all times.

        • Shadowcat says:

          The question isn’t “who has as many as 40 browser tabs?” because that’s a silly question — who doesn’t accumulate dozens of browser tabs over time?

          The question is: who would be doing that with Chrome ?! Seriously, Chrome’s memory usage is nothing short of appalling, and its per-tab processes mean that your machine can easily start swapping when you attempt to flip through your tabs, if you’ve left them alone for a while, paging in one tab at a time.

          Using 40 (or hell, 400) tabs in a web browser is nothing special. Doing it in Chrome is mad.

        • TheSplund says:

          40 tabs isn’t unreasonable (though I do it in Firefox and not Chrome) – I have about a 20 page homepage collection (including RPS) that I like to think of as my ‘morning Newspaper’ (as I haven’t had a newspaper in 20+ years) – I’ve got a bunch of 15 tabs open from yesterday, and I just searched around for some stuff for my daughter and have almost hit 40 pages today before thinking about closing them.
          But 4 cores is enough for me (i5 2500K and a GTX 970)

        • tehfish says:

          I have a firefox bookmark folder containing ~65 different webcomics that all load all at once into tabs daily :P

        • childofthekorn says:

          Typically have about 20+ tabs working in at least 2 different browsers (some work related websites don’t scale well with certain browsers), music/movie for background noise or for that momentary escape. System administration, network administration, picture/video/music editing, content creation of other sorts among other things. It’s really not that rare in the scope of IT/IS-related positions.

        • DelrueOfDetroit says:

          I wouldn’t say 40 tabs (maybe 40 between my PC and my phone) but, yeah I often have quite a few open. It’s not as bad since I quit using Facebook. On RPS for example I will just scroll down the main page and open every interesting link until I get to where I was before. Couple days without visiting, maybe a Pipwick Papers and just from this site alone those tabs can build up.

        • MacTheGeek says:

          What kind of job do I need to get where watching movies is a viable option while working?

          The kind where your code has to compile.

      • Jade Raven says:

        When I first started using Tree-Style Tabs in Firefox I used to have about 200 tabs open. These days I usually have 50-100 tabs open. I can’t leave it open for more than about 3 days without it crashing now, but I don’t really blame Firefox for that.

        I also recently upgraded from 6GB of RAM to 12GB in order to play intensive games (Planetary Annihilation, et al) without having to close Firefox first – it works a treat.

        And fun fact, it turns out running Battlefield 3 will cause VLC to crash if it is playing a video (that doesn’t happen with GOM player though, which is weird).

        • AGTMADCAT says:

          I went from 6GB to 16GB for the same reason. Needed a motherboard swap, my creaky AM2+ motherboard didn’t have DDR3 capacity. And despite flooding it with my water cooling system quite thoroughly, it still insisted on working perfectly, so I just had to accept upgrading without a catastrophic failure!

          • Jade Raven says:

            I bought a fancy new i7 920 back in 2010 and have been unable to upgrade it since because it has yet to be substantially surpassed! But no matter I’ve upgraded to an SSD, a 27″ 1440p IPS display, new graphics and finally the RAM.

            There’s been lots of gaming hardware progress in the last 5 years – just not in CPUs.

        • overthere says:

          Exactly, with tree style tabs you can actually use and interact with that many tabs, collapse trees, group etc. I have a cull every once in a while but 100-150 seems to be pretty standard. It’s great for scientific literature searches!

    • TacticalNuclearPenguin says:

      True that, but then i guess the article was more intended to remove some stress from users, and to pinpoint the importance of fat cores rather than puny ones.

      Money notwithstanding i too would like to go with the enthusiast platform for all the extra crap and the warm and cuddly feeling of going the state-of-the-art route, just because. Haswell-E didn’t want to be bought though, as there have always been 3 price tiers from SB-E onwards that shared a common thing: only the cheap one would have fewer cores.

      So i kinda hoped for a 600 euro 8 core thingy with some unimportant cuts like it used to be, but it didn’t happen, though admittedly this time the cheapest entry had some strong points that were lacking in SB-E and IB-E.

      Maybe Broadwell-E or even Skylake-E will meet my tastes better, afterall it’s hard to justify such an expense with anything less than 110% satisfaction. Why i think the article is still spot on though? Because indeed, with my SB i still feel like i can wait another 3 years, and also because any next CPU upgrade i’ll make will be commanded more by the desire for a super shiny new thing, rather than real need. Actually, i think i’m more interested in the new features like PCI-3 ( yes, 2.0 here ), M.2 and so on and so forth.

    • TD23ASUS says:

      Hah, that being said, you would not believe what a terrible four core can do. I have at the moment a four core i7 (I believe all are four core) but I’ve had two computers past. See my very first one is 7 years old, and it works like a charm when others use it. See I’m a big believer in the idea that whatever can go wrong will, and that as long as something has worked before, there’s a big chance that it’ll work again. The old computer has been upgraded from 500 MB of RAM to 1.5 GB of RAM, but the CPU is the same. It’s got pretty sad stats (link to cnet.com) compared to nowadays, but as of a month ago I found that while running Chrome on it, it moved at a snail’s pace – yet as soon as my friend came in to help me, it worked fine.

      My second was a laptop with an i3 (dual-core) and 2 GB of RAM, which was mostly taken up by age and others using it to the point where it could barely function, and with its 500 GB hard drive with 30 MB left (yep, that’s right, the ‘others’ liked to download). Despite its cripplingly slow speed of use (sometimes it’d take 10 sec to open a double-clicked icon on the desktop) I was able to play Starcraft 2 on Low settings with stability until my friend swarmed me with 300 zerglings… now just to drive the point home: that laptop had frequent blue screens due to (at the time) my lack of knowledge of drivers and how they can sometimes cause blue screens if out of date.

      Now onto my current laptop. This is now my trusty one. Never had a bluescreen in its life, and it has 8 GB RAM and an i7. Now I’m capable of running Starcraft 2 on max settings, but I keep them at high so there is less of a strain on the system, and so I can survive the rushes… in fact the only thing that lags me out on Starcraft 2 is my poor internet.

      I was able to run games like Sims and Starcraft, Minecraft, even HearthStone with relative ease. It can’t keep a steady frame rate for most of the games, but for an i3 with minimum specs for all of those games, I call that a fair trade. Having too many cores may cause an explosion for all I care, because I’m okay with i7. I was fine with i3. I was fine with a Pentium 4. (Okay maybe not with the Pentium) Sometimes people get caught up in the best, and I may be ‘out of fashion’, but I usually watch Youtube videos at 240p. Why? Because it’s enough for me.

    • Shigawire says:

      Yeah, and also if you’re doing:
      * 3D graphics rendering
      * procedural generation and rendering – Terragen, World Machine etc.

  2. ansionnach says:

    640 cores ought to be enough…

    • ansionnach says:

      …or was that 640k cores?

      • Zanchito says:

        Exactly my thoughts. Cores are underused because neither simulation nor AI are a top priority. But they will be. As things are now, limited by console development, it will take a while, and that’s the point of the article, I hope, but broadly speaking, your pun is very appropriate.

        • Don Reba says:

          It is very possible that simulation and AI will be done on the GPU. These things are fast becoming capable general-purpose processors.

        • ansionnach says:

          “Back in the day” a lot of what was coming out on PC had no chance of running on consoles… and you’d have to take a deep breath before taking a look at the requirements of the next Origin Systems game. I’ve rarely been on the cutting edge, usually playing old games my machine can handle, but in spite of that I’ve probably had more respect than curses for people trying to push PC gaming forward. That’s not just about the graphics – it’s about what the interface offers – a mouse and whole keyboard full of keys. It’s something that goes back to some of the earliest home computers… but the fact is that you still couldn’t do a decent conversion of Ultima IV or V to a console because it’d be a pain to play without all those buttons and the ability to type in anything you like quickly.

          I appreciate console games too, but as someone who fell out of gaming around the early 2000s, I’m not bothering with all the watered-down, radio-friendly bollocks any more. If it isn’t trying to push PC gaming forward like Ultima or Planescape Torment I’d rather be out on the bike, wandering around the hills or just scratching my arse, however many cores it uses.

          I’ve got a cutting-edge machine right now but none of the games that use it are worth playing. Ultima Underworld is the best game I’ve played in the last year or so (last fifteen years, even?) and it’d run on a 486.

          This has probably been not entirely related to the discussion. I do agree with what’s being said about what’ll probably be needed to play the latest games over the next few years but it’s far from a blessing that low system requirements are because so many games will be designed for consoles too… and consequently, won’t be worth your time anyway…

          It’s not too different on the console side, where game design has shifted away from the hard core and away from challenge.

  3. darkshadow42 says:

    While 4 certainly looks plenty for most genres, I think the only games which are going to need more cores in the near future are games which are simulation heavy. At work I am currently using a 12-core machine which shows a big drop in run time up to 6 cores for simulating the traffic in a city (and that’s not even modelling car users as individual agents), and it still takes more than an hour for the assignment routine to path everything around each other. For a city builder game to reach anywhere close to a half decent level of simulation and scale, I think you are going to need quite a lot of cores.

    • waltC says:

      Yes, AMD processors are only “slow” when it comes to single-thread execution–change the game to multithreaded execution and suddenly the $100 AMD 6-core is keeping up with and/or besting a $200 i5. That’s a fact, jack…;) How about an AMD 8-core cpu running at greater than 4GHz for ~$150? Intel has nothing to match it at that price–but that’s beside the point. Also, if you are a computer user–(and of course who isn’t) who does more than just use his computer as a glorified super-uber games console, then you can play your four-core games @ full bore while doing something *else* in the background. I do that often with my six-core AMD. Last but not least, increasingly, games are GPU limited as opposed to cpu limited, which means that the higher the resolution and the more eye candy applied, the power of the GPU determines the frame rate whereas the power of the cpu becomes increasingly less of a major performance factor. That becomes dramatically clear @ 4k resolutions +, for instance. The difference between Intel & AMD cpus in terms of 4k game frame rates with lots of eye candy turned on is nearly nil–except the Intel cpus cost 2x as much. Heck you can even spend 10x as much for a $1k Intel cpu and come out approximately 2x as fast as the fastest AMD cpu; but think on the metric–spending 10x as much to get 2x the performance doesn’t seem like a bargain to me…;) It’s like thinking that 90 fps is so much better than 60 fps when you see a benchmark bar chart–but when you play the game you’re amazed by how little the experience differs simply by way of customer perception…!

      AMD cpus are simply the best *if* you are looking for bang for the buck–you cannot do any better for your money. If you want the fastest possible performance and you don’t care what you spend to get it–Intel is surely the way to go. But if you have a budget and are looking for best bang for the buck, AMD has no peer.

  4. Det. Bullock says:

    Having spent most of the budget for an i5-3470 when I upgraded two years ago I find this article strangely soothing.

  5. CookPassBabtridge says:

    This is true for “games”, but in sim world the more powerful chips will give you an advantage you will not see in Battlefield 4 et al. Start adding in Photoscenery to X-Plane, and under certain magical conditions (i.e. fart about with affinity masks) Prepar3D will load up all your cores, winning you some very helpful extra frame rates as you start adding in all those add-ons. FSX won’t give two shits, but then that’s like asking your grandad to care about setting up wi-fi. He’s just too old to make it work.

    Overclock that CPU and you will be sitting pretty. Weirdly, Eurotruck 2 really benefited from a nice overclocked multi-core chip, I guess because it’s a sim too. In fact to get it smooth in the DK2 the only way for me was to push my 5820K up to 4.5GHz. Niche products, sure, but it sets them apart from general gaming.

    • TacticalNuclearPenguin says:

      I’d wager the most important part of that equation was the overclock though, but yeah, i think this article mostly served the purpose of having people relax a little about CPUs, since worrying about the next generation of GPUs not happening before TW3 and GTA5 is already too much of a pain for the faint of heart, so you don’t want extra stress!

      • CookPassBabtridge says:

        I think I must just be a dinosaur then :( I am quite happy with the stress, and love tech and progress. I like learning how to get the best from my kit and push it.

        I think you are right about the OC with ETS2, though of course the Oculus runtime would have needed its own bit of breathing room. Don’t know enough about it TBH. I am definitely an advocate and lover of moving tech on though. Maybe it’s because at this stage in my life I finally have the income to let me do so, but I cannot help but wonder what we are losing whilst we are sitting comfortably.

        On more global stages, it frustrates me when conservatism holds back the “what might be”, whether it’s space programs or supersonic flight. I feel the same about pushing the boundaries of AI or simulation. It’s all very sensible, but oh dear I’ve gone cross-eyed.

        • TacticalNuclearPenguin says:

          The reason for my stress is that i’d buy the first decently cooled hypothetical 980ti ( Titan X with hopefully no shader cuts but half the VRAM and no double precision stuff ), and i’d want to do it pretty soon actually, but Nvidia is not letting me do that.

          Either way, to stay on topic, uses for more cores are surely there, especially with Intel’s big platform that offers other neat things like a truckload of PCI-E lanes and so on; it’s not just the CPU. I’ll probably buy such a thing when the midpriced offer is at 8 cores and i’d like to have the ability of running it comfortably at 5ghz or so.

          In that case i absolutely wouldn’t mind the high price, i feel it would still be worth it. And yes, stagnation and the “it’s enough” mentality is something i don’t like either, just like “you need X”.

          Since when has it been about need anyway? After all, we just need to eat, have some sort of roof above us, sleep and a few other things.

    • Zenicetus says:

      X-Plane also uses multiple cores to run the flight models for AI aircraft, up to a current limit of 20 I think. Every AI flight model is the same as the one your own sim plane is running, so if you’re stacked up on approach and dealing with windshear or other weather effects, all the AI aircraft are dealing with it too. The flight models are spread across as many cores as you have available.

      There is also a graphics improvement up to 4 cores (IIRC), but then it tops out. Right now, I think multicore past 4 in X-Plane is basically about running the AI aircraft.

      I agree with what the OP is saying, for general gaming. But things like this in flight simulator land are a different world.

    • Jason Moyer says:

      Yeah, I was going to say the same thing. High-end sims like the last 2 ArmA games or Assetto Corsa will eat 4-core AMD CPU’s for breakfast.

  6. The Sombrero Kid says:

    There are a lot of ways to look at it, but one way is that there’s simply no point in scaling the number of cores much higher because we already have processors in our machines that serve that purpose. Making CPUs behave like GPUs means making all workloads hard to program, since there is inherent difficulty in highly threaded programming. We prefer to use the fast individual cores for everything and then optimise intensive jobs onto the dedicated parallel chip we already have in our machine.

    In short we need both massively parallel and super fast chips and we already have both.

    • Raoul Duke says:

      Not only that, but the CPU/GPU ratio has been all out of whack for ages – up until recently I had an old Phenom II X4 which was quite happily feeding an R9 290.

      The GPU side of things is still the bottleneck in 90% of modern games, and the other 10% probably aren’t written very well.

      • P.Funk says:

        I’m still rocking an old Core2Duo E8400 OC’d to 3.6 and my main bottleneck is STILL the GPU.

        • skyturnedred says:

          I just went from Core2Duo 8500 to Core2Quad Q8200 and part of me feels things are slower.

      • FireStorm1010 says:

        Imho it depends on what games you like to play. Strategy games are often very CPU heavy. That said, most of them even these years are single threaded or single + a bit at best, so multiple cores still don’t help that much (the speed of a single core does though)

  7. DrManhatten says:

    Another example of why RPS should better stay away from any hardware recommendations, as they seem to have no clue what they are talking about. Here are my questions: Do you want better AI in games (yes/no)? Do you want better physics in games (yes/no)? Do you want better simulation in games (yes/no)? Do you want bigger worlds with more players (yes/no)?

    If you answered any of these questions with yes. Then MORE cores = MORE computational power = MORE of the above. But if you like to be stuck in 2009 fine go ahead.

    • James says:

      What qualifies you to say that more cores are better, more so than RPS?

      The main point raised in the article is that the present level of AI, physics etc. does not scale to the number of cores. In benchmark tests the difference is very small; with my quad core i7 it is very rare to see more than 60% of my processing power in use even playing games like Elite: Dangerous. The levels of existing power give room for AI and physics to develop, however as other RPS articles have covered, progress in the field of AI is very slow and physics are frankly fine as is – see BeamNG Drive for an example of how brilliantly accurate physics are achieved with very little processing power.

    • Shadow says:

      That’s all fine and theoretical, but the truth is games so far can’t properly utilize all that raw processing power efficiently. I don’t know if it’s due to consoles anchoring technical progress or whatever, but while PC hardware keeps advancing, the same can’t be said about the efficiency games utilize said hardware with.

      Hell, there’s still an alarming number of games/engines which aren’t 64bit nor properly harness four CPU cores as it is, and the technology has been out for several years.

    • Poppis says:

      I’ll gladly get a 8 core cpu if somebody would start making better AIs and physics in games.

      • Jeroen D Stout says:

        I’ll gladly get a 8 core cpu if somebody would start making better AIs and physics in games.

        This sentence summarises in so few words the immense gap between “games as they are now” and “games as they could be”.

      • CookPassBabtridge says:

        Yup. They won’t start making better AI’s and physics because they look at the hardware surveys and see that half their potential market is still running on dual core, and the other half on quads. They cater to the market they know will buy.

        It’s a marriage of convenience that’s comfy but dull, slippers and farts before bedtime but at least we know the gas bill won’t present a problem. Users won’t buy new kit, and by extension, won’t buy those new, more complex games. So developers don’t make those new, more complex games.

      • HidingCat says:

        It’s really a bit of a chicken-and-egg situation, isn’t it? Devs see most have 4 cores, they develop for that, hardware makers see most consumer-side software only uses four, and thus don’t bother making most CPUs more than 4 cores, then devs see…

    • Scandalon says:

      Well, yea, obviously the answer to all those questions is “Yes, we want those as options.”

      However, do you think telling folks to spend the extra money, now or in the immediate future, for more cores makes any sense? What games are/will make use of them? There are of course exceptions as Wisq enumerated, and a bit of a chicken-and-egg scenario for developers*, but for now I have to agree with Jeremy’s assessment.

      *It will be interesting if anything on the new consoles is ever very thread dependent, probably doing heavy work with both AI and Physics as you mentioned, and to see how it plays out on PC’s, assuming imagined game is ported.

    • Asurmen says:

      I answered yes to all those, but that answer is completely irrelevant if no developer is doing those things.

    • Baines says:

      The article kind of addresses that issue with two arguments. First, that games that can utilize more than four cores are most likely already GPU limited once you get to four cores. Second, that many programs don’t even bother to use four cores in the first place. Having additional cores becomes irrelevant when you have nothing that actually benefits from having those additional cores. (And the average gamer probably isn’t running anything that does.)

      Yes, it is nice to want things like better AI and better physics. But what games are actually doing it? Game AI has arguably regressed over the years. You also have that whole minimum spec issue with PC design, and the question of whether you want games actually playing differently based on the CPU core count.

    • po says:

      Why would game developers need to make use of the CPU for parallel processing, when most gamers already have a massively parallel processor in their computers, namely the GPU? It’s going to be years before CPUs come close to the number of ‘cores’ that are already present in a GPU, and it’s also going to take time to develop CPU APIs that are on the same level as e.g. CUDA.

      And meanwhile outside of gaming the two biggest reasons to have a multi core CPU are 3D rendering and video compression. Guess what is the most efficient way of doing these in both Adobe Premiere and 3DS Max? On the GPU.

      CPUs aren’t going to need more cores, because there is already a better solution available. The most that is going to happen (and which has already begun with AMD’s APUs) is a merging of the CPU and GPU, to provide a single chip that can provide a few cores for processing at maximum speed, and a few thousand for maximum parallelism.

      Right now that isn’t really needed, because without consumer level software to make use of a parallel processor, there’s no real market for a combined chip. It’s just like when PCs had a separate co-processor for floating point. Most people didn’t need a DX chip until there was software to make use of it.

      • Jeroen D Stout says:

        CUDA does not do too well on choice trees or systems that are not serial atomic steps, however. But you are right, a lot of tasks can be generalised for CUDA, and they are massively ahead of CPU, but that is also because they are less versatile.

      But I think this is a necessity argument. For quite a few tasks, game developers would prefer using the CPU for ease of use and broadness of applications. CUDA is more what is used in lieu of mass CPU cores, as far as I can see. Given more CPU cores, I would rather use them for more complex AI agents which cannot be atomised to CUDA.

  8. EsKa says:

    “For now” are the key words here.

    Except if there’s some unexpected major breakthrough in chipset architecture, either we start building bigger (space-wise) individual CPUs again or we start learning how to use multiple cores efficiently.

    True, some game genres like FPS aren’t likely to ever scale well with the amount of cores your CPU has because everything is so linked together that partitioning tasks is very difficult.

    However, ‘world’ simulations, turn based strategy games (time to wait between turns), and basically everything that can be cut into mostly independent chunks can benefit immensely from having multiple cores and they should scale very well, granted they are written properly (which can be quite a difficult job, sure).

    Not even mentioning the massive amount of progress that has been made toward simplifying multi-threading in most of the major programming languages in the last few years.

    • P.Funk says:

      Yes, its better to buy more than 4 cores just in case there’s an unexpected breakthrough.

      Remember children, when comparison shopping on a budget NEVER forget to calculate the need for a deus ex machina overhead.

      • EsKa says:

        Except that’s the exact opposite of what i just said… So i’ll clarify, i guess: if there’s no breakthrough in technology, 8-, 16-, 32-core will become the norm.

        Damn, I am, right now, writing a game that scales nearly linearly with the number of processors you have. 20k agents are running on my 4x CPU; 40k would run on an 8x.

        • EsKa says:

          grumbl, no edit button, so my apologies for the grammar.

          That said, next time, please read below my first sentence before writing an angry comment.. seriously.

          • P.Funk says:

            “Except if there’s some unexpected major breakthrough in chipset architecture”

            Is the key part of your comment. There’s no point in commenting anyway along your line of reasoning since short term future (like the next few years) sees no sign of needing more than 4, ergo the whole point of “need” versus “might like to have just in case” is a waste of time, but for some reason everyone seems to want to say “Hey RPS, wtf you doing making a carefully couched statement clearly intended as near term wisdom?”

          • EsKa says:

            Ok, point taken but you’re assuming an awful lot of things about my opinion on the subject, leading to the conclusion that I should have taken more time to write my initial post. Allow me to start anew, please:

            No, there’s zero point *at the moment* in buying an 8x (or over) core based CPU. Not because it’s more CPU power than anyone would need, but because most programmers can’t be arsed to write something that scales properly (and i don’t blame them, it’s bloody difficult). Not mentioning that some game categories will never scale well with multiple cores, as explained in my first comment. So, I totally agree with the article on that part, and never tried to say otherwise.

            My point, that I seemingly made unclear, is that IF there’s NO major technological break-through in cpu architecture (and it’s very unlikely, physics and all that), parallel programming is the future and most programming languages are already making important steps in making this fact a reality.

            Basically, our point of disagreement is the “when”. From what I see, that’s a year or two, max. before 8x core become the “recommended” setting for games. I may be wrong (and hope so, I can’t afford a new cpu), but I sincerely doubt it.

            Cheers,
            SK.

          • P.Funk says:

            Point taken, but for those who are still budget minded 8 cores probably won’t be a minimum for a very long time. 8 needs to become useful before it becomes mandatory and I don’t see that changing in 1-2 years.

            I’m still riding 2 cores barely and honestly it’s my motherboard that’s really holding me back more than my chip’s raw horsepower in terms of keeping up with newer bits and bobs.

          • EsKa says:

            hey, we just proved those who said that internet debates only lead to insults wrong, yay for us ! :)

            Honestly what rubbed me the wrong way was the tone of the article, not the point it tried to make which, as i said I do agree with.

          • Loratun says:

            Someone who is buying a gaming computer nowadays should expect to be able to keep it for roughly 5 years, since they probably won’t be breaking through the CPU plateau we’re at. Buying more than 4 cores now might mean being able to keep a gaming rig longer than that, since performance gains are likely to be gained from better multithreading in the next 5+ years. So investing more money now would mean saving money in the long run by being able to delay the next upgrade.

          • Loratun says:

            P.Funk, CPU DOES bottleneck frame generation if single thread speed is too slow, causing frame lag. A fast CPU is required to have a constant and smooth frame generation and to avoid micro-stuttering which breaks immersion.
            In 2012, TechReport tested 18 CPUs checking for frame time generation and found out that CPUs were not all created equal.
            link to techreport.com

  9. stinkytaco says:

    Now that you can stream to multiple devices, there is a definite need for symmetric multiprocessing (for the same reasons servers use it: to run multiple programs).

    If dad wants to play DOOM 3, and the kids want to play Minecraft, and mom wants to watch Netflix, you are not going to get very far with a quad-core rig.

    The home computer model is reverting back to the old mainframe days where there is only one device serving many smaller “dumb” devices.

    • Thankmar says:

      Now that’s an interesting thought. Streaming might not be the exact same thing, but still.

  10. Melonfodder says:

    Having just upgraded from an i5-750 that was overclocked stably to 3.8GHz from 2.6GHz to an i7-4790K, my framerate more than doubled in Crysis 3, Arma 3 and a bunch of others. It was a mindblowing difference, and beyond my expectations.

    “To be honest, most of the time I suspect a high-clocked quad-core Nehalem chip – i5 7xx //// would be hard to subjectively pick from the latest Intel Haswell chips, in-game”

    Anecdotally I can say that this is far from the truth!
    Then there’s the matter of how heavenly Photoshop/Maya are to use. So good.

    • Melonfodder says:

      I should’ve mentioned that picking the i5-xxxx series is probably good enough for gaming and that I picked the i7 not for gaming reasons, but for work-things. The article headline still stands true, that’s for sure -but the older quadcores do present a limitation

    • Jeremy Laird says:

      May have stretched my point a tiny bit with Nehalem. Think they’re still pretty good gaming CPUs, but perhaps some benefit to be had with newer chips. Sandy Bridge onward, though, the increases become very marginal.

    • Fitzmogwai says:

      I’m still running one of the glorious D0 i7-920s, OC’d to 3.6GHz and it’s managing to keep up beautifully.

  11. rexx.sabotage says:

    Daniel Sanchez & Co over at MIT published an interesting paper on alleviating the cache hierarchy bottleneck, seems relevant.

    Scaling Distributed Cache Hierarchies through Computation and Data Co-Scheduling

    • joa says:

      Unless I’m mistaken this is for distributed memory systems — big servers where you have multiple separate processors each with their own memory attached — not multicore systems.

  12. Jeroen D Stout says:

    As a developer working on a CPU-heavy game, I have to say that 4 cores are only arguably enough as a consumer-anno-2015 because most developers are not trying to use more than 4, because that is what consumers-anno-2015 are expected to have.

    Yes, it is right to say that rendering is often a huge bottleneck. However, when it comes to high-quality AI, high-quality physics with super-high frame rates (some significant resonances need hundreds of steps per second), volumetric effects or large areas, things like enormous amounts of agents with non-flock behaviour (like citizens as opposed to pedestrian concepts), arbitrary crunch calculations or even just high-duty ray-tracing… then being stuck at 4 cores really is a bottleneck for future games.

    Just to underline – yes, in a world where game development is mostly elaborately scripted things which need to run on outdated hardware with more GPU load than CPU, upgrading your PC to > 4 cores is probably a waste of money. I agree with this purely in the limited and depressing space of what the market is now. However, in my personal world in which games strive for interesting scenes of high computational complexity, this mindset is holding back the sort of games in which mass parallel computation could change the very boundaries of what we can do.

    I would type another paragraph but I think I am too frustrated that the 21st century focussed so much on GPUs that we neglect CPUs; and I would just keep repeating “you are right but I wish you were not.”

    • CookPassBabtridge says:

      Thank you so, so much for contributing this. The one thing that depresses me the most about the technologically stagnated marriage of convenience we all take part in is the “what if”. As long as no one shows what could be done, no one is aware of the need or has the desire for something more. We stagnate in blissful ignorance.

      Lately VR was a big wake up call for me in this regard. It demands huge (in comparison to what is mainstream) GPU power, but it’s power that could have been commonplace if economic pressures had not slowed the rate of movement. It was only once we had VR, and saw what VR MIGHT be, that anyone said “oh you know what, I wish graphics cards were cheaper”. Tech gets cheaper the longer it has been around, and the more people buy it. Intel have dragged their feet for the last two chip generations. NVidia have made respectable advances, but games have not kept pace.

      What we could do with AI and simulation with extra power just boggles the mind. I hope your project is successful.

      • TacticalNuclearPenguin says:

        It would seem that Intel dragged their feet, yes, but in reality they simply tried a different market. So yes, in a way you’re right, but it’s not so much about stagnation as the market simply shifting towards mobile crap.

        Intel had little success in that area, but their focus was clear. Coming from SB, the other generations weren’t that much of a step up for the stuff that we want, as in simple CPU power, but the real leaps were made in the integrated GPU and power consumption at the very same time, which more or less means that they did some pretty impressive stuff.

        It’s just too bad that we simply don’t care about that stuff. But then, with not only desktop but even consoles losing relevance and perceived “need” for more grunt, it’s hard to convince companies to work for us again.

        • DrManhatten says:

          Well, reality is that game devs have been lazy gits when it came down to CPUs: they relied for over 10 years on single clock speed / IPC increasing. And eye candy would be delivered by increasing GPU power. So game code was long stuck in the early 2000s and often still is nowadays. Outdated graphics APIs where draw submission was limited to one thread/core (yes I am looking at you OpenGL and Direct3D 9 & 10) made synchronization with the rest of the system a nightmare.

          Only on the PS3 did we see some amazing coding stuff going on in exploiting the Cell architecture, its rather weak GPU and its quirky way of parallelism. But Sony made a full roll/step back with the PS4 on that by going the lazy traditional way.

          So what was Intel going to do? Office software, internet browsing etc. was all running more than fast enough on 2006 tech. PC games, one of the last large consumer domains, didn’t push any envelope and didn’t look like they would in the near future, and then at the same time came Apple with their stupid ultra-mobility craze. Intel wanted a slice of the pie as the desktop PC/workstation market was saturated and there were no applications besides the real serious stuff that required more compute power. So that eventually got all shifted off into the server realm and now the new mantra was mobility at every price.

      • Jeroen D Stout says:

        Thank you! I agree with what you say – the worst that could come out of VR is faster GPUs, and that is good.

        My game (Cheongsam) simulates body language and facial expressions, with an AI to dynamically generate motions, which I found needs to be done at high frame rates (>250fps) to get small resonances and such correct. It is really computationally incredibly intensive to run characters at those frame rates… So it is all done in parallel on my 6 cores, but I will probably have to bring to back to 4: exactly because of the horrible realisation we are stuck at 4!

        I actually have a production joke that for future games every extra CPU core we add means I can run one-and-a-half character more.

        • CookPassBabtridge says:

          Well i really hope this takes off for you. Whether you are a fan of VR or not, one of the most stunning demos I tried in the DK2 was “Coffee Without Words” by Tore Knabe. It did something that I think you would like too – put you in front of another “person” and tried to have them react to you realistically. The sense of there being another person opposite is incredibly powerful as she makes eye contact, looks away, follows you as you move your head etc. I would love to see what your tech could do in this space.

          • CookPassBabtridge says:

            I feel compelled to ask, do you have a demo for download?

          • Jeroen D Stout says:

            I had no idea “Coffee Without Words” existed! That is remarkable, it does look a little bit similar to Cheongsam. Cheongsam is this, but without VR and with AI (I suppose).

            I do not have a demo public just yet, but if you send me an e-mail on jeroenstout@gmail.com I can put you on the list of interested just in case things progress :)

          • CookPassBabtridge says:

            I’m glad you liked it! Did you try it in VR? That ‘felt sense’ of another person when she looks at you is creepy yet incredible. It makes me feel amazingly awkward and a little maudlin, like I’ve been dumped Quantum Leap style into the middle of a horrible breakup. I also wanted a latte really badly :)

            I will definitely ping you an email, thanks!

  13. ZIGS says:

    I have an i5 760 Lynnfield (2.4GHz), currently OC’d @ 3.9. How good am I for the FUTURE?

    • Waltorious says:

      I have almost exactly the same question… my CPU is an i5 760, running at 2.8 GHz (not overclocked). Opinions on other parts of the internet seem to be that it’s the weakest point in my system now (upgraded to an R9 280x about 6 months ago). But will it last?

      Note that in terms of hardware-intensive games, I’m mostly interested in running The Witcher 3.

      • dangermouse76 says:

        My 760 is clocked @ 3.8, which most will do no problem at all. It’s easy to do (relatively) with a bit of reading. With 8 gigs of RAM and only a GTX660.
        Nowadays I go for a solid 60fps over graphics, which means for me the pretties have to come down to high and AA to x2 or x4. No depth of field or motion blur. Tessellation is game dependent. But that’s GPU dependent.

        I am going to keep it running for another year or 2 because I am happy with that. Your 760 is near the top of its upgrade path; the i7 in that range won’t make a difference. Basically it’s motherboard replacement time if you upgrade ( but you probably know that ).

        As for whether it will handle Witcher 3: yes, if you put a fast GPU in you should be fine. Or even a GTX660, but there’s, as always, some visual compromise with that route.
        The 760 is a good chip, it’s done me proud over the years.

        • Waltorious says:

          So you think with overclocking I’ll be OK running The Witcher 3? I’ve sort of intended to try overclocking at some point but never actually got around to it. And I do wonder if I’ll really gain that much. But I suppose it’s pretty cheap to try it, just need a new cooler.

          • dangermouse76 says:

            I think you will gain between 5-8 fps on an overclock. Your GPU is in the ballpark for recommended. So whereas your CPU is a generation ( ? ) behind the minimum spec, I don’t think that will hold you back too much. You have plenty of RAM also.

            I don’t think you will be running 60 fps ultra though; high with some pretties turned down or off. In short, for me I think it will be a playable, visually enjoyable experience. But I am definitely happy to turn stuff down to get a playable game.

            Like I said, in a year or two the i5 760 will be a bit long in the tooth for me. But you have it, and as you said, a decent cooler can be bought pretty cheap.

  14. Paul says:

    Yeah, with the console CPUs being as weak as they are and with DX12/Vulkan coming, it is pretty clear that my four-year-old 2500K will last this entire generation just fine.

    • khalilravanna says:

      “4 years”? You youngin’! I’m still rocking a Q6600 from Q1 2007. Jokes aside, I haven’t found this thing to be a bottleneck for any game yet. Though OC’d at 3.2GHz, four cores is, I imagine, more than enough even if it is a bit ancient in CPU-years. We’ll see if it doesn’t start showing some age when I finally have a righteous GPU in the form of my ever-delayed, in-transit GTX 970 to un-neck some of those bottle-wotsits.

      • LionsPhil says:

        Woo, Q6600 buddies!

        KSP murders it a bit, but that’s not a question of cores but sheer brute single-core grunt.

      • Player1 says:

        Q6600 too here, at stock clock. I never got around to OC because I fear the cooling in my machine is just not good enough. I can play BF3 just fine, but it definitely struggles with games like Arma or Arma II. Not sure, though, whether it’s my Radeon HD 6850 that’s the real bottleneck. Any thoughts?

    • caff says:

      I agree. I’m still running an i5-2500K too. Paired with 16GB of RAM and a GTX 970 it is capable of powering a 4K screen.

  15. paralipsis says:

    While I know I could get more from my PC with a CPU upgrade, I’m always glad to read things telling me I can wait. I’m okay with turning my PC into a mono-tasking device when I want to play a demanding game, which makes most of the high-powered CPU arguments fall away.

    The big deal for me in terms of timing my next CPU and motherboard upgrade is USB 3.1 with Type-C connectors. If I can hold out on my next upgrade until these are baked in to the chipsets of the day, then I feel that will increase the usability of my PC for the lifetime of that hardware.

  16. TheRaven says:

    Really disappointed with this article. It’s very short sighted. Just because current games primarily use 4 threads doesn’t mean that won’t go up the next year or the year after. If you want to build a computer that’s high end for the long run, getting the highest end CPU is the most important thing you can do, because once your CPU gets outdated you’ll need a new motherboard, which means you might as well build a new system.

    I always build on the edge of the next performance bubble. 8 cores will be the first major tech shift since the first quad core i7s.

    My computer is almost 6.5 years old now. Think about that, 6.5…….. and it can still crank everything out at max settings because I was able to plan long term. In 2008, the very WEEK Intel put out its first quad-core i7 models, I bought one and OC’d it to 4.0GHz. There were plenty of articles like this about “Why you only need 2 cores”. Even a game like StarCraft 2, which came out almost 2 years after I built my computer, only used 2 threads. But no, because I built ahead of the curve I haven’t had to replace my computer, because my CPU is solid. There have been architecture improvements, but for the most part, a quad core at 4.0 can handle anything. I’ve still had to replace my video card every 2 years with a new top-end one, but my system has been rock solid long term.

    Had I listened to the articles I would have had to replace my dual-core machine a long time ago, but even CPU-intensive games like Dragon Age Inquisition run smooth, maxed out, on my 6.5-year-old machine.

    So if you want a machine that will only last for a couple of years, get technology that’s been out since 2008. If you want something that will last over half a decade, buy ahead of the curve; the extra money will be less than the amount you’d have to pay upgrading the whole thing before you had to.

    • Thats no moon says:

      I think that’s what the article was getting at.

      It wasn’t a buyers guide for a new build, it was simply confirming what you have found to be true: even older quad core processors are fine and dandy for the time-being so we don’t need to rush out and buy new shinies just yet. In a year or two the story might be different but right now even an older processor runs most things well enough if hooked up to a decent GPU.

      Great news for me as I’m rocking an i7 860 on an old 1156 board. Was worried I would have to do a complete rebuild to play anything new but if a simple GPU update will do the trick I can spend all that money on loose women and fine champagne!*

      *nappies and washing machines.

    • pepperfez says:

      I don’t know that that math works out in the current market. To get more than four cores, you need to step up to E-series processors, which necessitate significantly more expensive motherboards and RAM. Taking present versus future value into account, I suspect it would be less expensive to get a good mainstream system and upgrade to another good mainstream system if and when you need to. (Math maybe forthcoming)

      • pepperfez says:

        The cheapest Intel 6-core system I can put together (8 cores requires the comically expensive X-TREME!!! edition) costs $690 for processor (5820k), heatsink, mobo and 16GB RAM; that’s with the cheapest options for all of them. A 4690k with 16GB RAM and an MSI gaming mobo comes in under $450. So, if you think you’ll get 2/3 the life out of a mainstream processor (e.g., 4 instead of 6 years), you’re breaking even. Anything more and you come out on top.
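
        As a rough illustration of that break-even sum (prices as quoted above; the four- and six-year lifetimes are assumptions, not measurements):

        ```cpp
        // Rough cost-per-year comparison using the prices quoted above.
        // The platform lifetimes are assumptions for illustration only.
        #include <cstdio>

        int main() {
            const double sixCoreCost   = 690.0;  // 5820K + heatsink + motherboard + 16GB RAM
            const double fourCoreCost  = 450.0;  // 4690K + motherboard + 16GB RAM
            const double sixCoreYears  = 6.0;    // assumed useful life of the six-core build
            const double fourCoreYears = 4.0;    // assumed useful life of the quad-core build

            std::printf("6-core platform: $%.2f per year\n", sixCoreCost / sixCoreYears);    // ~$115.00
            std::printf("4-core platform: $%.2f per year\n", fourCoreCost / fourCoreYears);  // ~$112.50
        }
        ```

        On those assumed lifetimes the two builds cost almost exactly the same per year, which is the break-even point described; anything beyond six years of service tips it in the six-core build’s favour.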

    • RegisteredUser says:

      Clearly you’ve forgotten the backdrop of “everything on Unreal Engine and sod optimising” that has become the mainstay of the gaming industry and already kept us locked firmly into DX9 graphics for about a decade.
      A similar thing will happen here, and I don’t see us making any huge, super-optimised, every-game-individually-hand-tailored leaps at all. It’s not in the budget, it’s not how game development works and it’s not efficient.
      So whatever the main development platforms end up doing will determine things, not paranoia.

      The statement “getting the highest end CPU is the most important thing you can do” is blatantly false; three-generation-old CPUs hold up fine with a 200-250 quid current-gen GPU, which, incidentally, has been and remains the single biggest factor for 3D gaming since the Voodoo.
      And before you leap to that conclusion in turn: it also doesn’t need the highest-end GPU either, just that sweet spot of a money-efficient midrange GPU that is both affordable and capable of running 95% of all current games at maximum settings for a year or two; then you do a sell-and-rebuy or trade-in and move on.

      Three motherboard generations (or 5+ year ownership cycles) are not a problem as long as the GPU interface holds up, and currently PCIe isn’t going anywhere (and AGP lasted a loooong time as well).

    • Person of Interest says:

      Congratulate yourself all you want on your brave purchase of a four-core processor in 2008, but the rest of your system is creaking with age. SATA 3Gbps, PCI-E 2.0, USB 2.0. As for the CPU: cores aren’t everything. You don’t have AES-NI instruction support, your idle power consumption and load power efficiency suck, you probably don’t have bluetooth support. I know all this because I have a similarly-aged system. Lots of people are in the same boat as us, because as the article pointed out, nothing interesting has happened to desktop processor performance for the better part of a decade. Anyone who spent the extra $50-100 for the four-core version of their mainstream desktop CPU can feel just as smug as you and I can.

      Were we really so smart to predict that Moore’s Law would falter right as we made our CPU purchases? Or did we just get lucky that the upgrade treadmill slowed to a halt soon after we made our purchases?

      Four-core was an easy choice when we purchased our systems. Intel had been making Quads for the desktop for nearly two years, and the chips worked with the same mass-market motherboards as the dual-cores. Review sites talked about the improved responsiveness of desktop applications: the benefits of going from 2 to 4 cores were immediate. None of that is true today if you go over four cores.

      This article gives exactly the right advice.

    • Jeremy Laird says:

      The piece is about the realities of what you need for a gaming PC. You can take issue with the specifics if your experience differs. But the issue of industry stagnation is somewhat separate.

      I tried to address this to a degree with reference to console tech providing the development target.

      But moreover, the fact today is that the installed base of powerful 6+ core PC processors in gaming machines is tiny. It would take a truly profound innovation in gaming that absolutely required six-plus cores to motivate a developer to create a game which only a tiny proportion of gamers could play.

      That said, I’ve lamented time and again over Intel sandbagging with its CPUs. I’ve also clearly indicated that I think a six-core 5820K is a very sweet all-round proposition: link to rockpapershotgun.com

      But I maintain that, right now and for the foreseeable future, there’s relatively little benefit beyond a decent Sandy-Bridge quad.

      • CookPassBabtridge says:

        I am very fond indeed of my 5820K. After a duff x99 deluxe AND duff original 5820, the replacements cruise me easily and stably up to 4.5GHz on multiplier and auto voltage alone. I don’t run that 24/7, just when simming or video converting. H105 on cooling duty.

        Sometimes I consider hugging my computer because this is normal behaviour.

  17. LazyAssMF says:

    I have an i5-750 OC’ed to 3.8GHz and it runs all games great. If there’s a bottleneck in games it’s always my GPU.

  18. RegisteredUser says:

    It is worth adding that, just as four actual cores are enough, the whole concept of using hyperthreading to optimise/maximise core use has fallen flat. FPS results vary so near-randomly across benchmarked games with HT on/off that it basically boils down to “don’t bother with it”.

    I bring this up because people WILL pay premiums for i7s, for example, even when they are clocked identically to i5s with the same core count at 30-50% more price.

    A lot of people eat up marketing like crazy, and if you bring up one point of nonsense, like no use in gaming, they’ll counter with “but this one single app that I use once every 50 moons gets +13% performance when I use that one plugin for one process once, so nyer” and feel they made a good investment on the CPU premium, when simply chucking another 50-100 quid into GPU power would have put them infinitely better off for what they do as their main pastime.

  19. airmikee says:

    I think my Thuban hexcore is why I’ve been able to run Cities XXL so well. Right after release the Steam forums were littered with people complaining that their i5 and i7 quad-core systems ground to a halt before their cities reached 1mil population, whereas anyone with six or eight cores couldn’t understand the complaints because the game was running so well. My XXL city of 10mil only gets 1fps, but that’s still more than SC4 gets with a 1mil city, and when I’ve got that 10mil city running it uses 100% of core 1 and 85%+ of the other five cores.

    Four cores might be the minimum standard now, but having more cores isn’t going to hurt.

  20. Gap Gen says:

    I have an 8 year-old Q6600 and it’s fiiiine. Well, I hope it is, I don’t want to buy a new CPU and motherboard any time soon.

    • TemplarGR says:

      I have a Q6600 OC’d @ 3.0GHz as well. Used to have 4GB of RAM and an ATI HD 3870 512MB GPU. I added 2 more gigs of RAM and an overclocked R260X 2GB GPU last October. Total cost about 150 euros.

      I can play everything really well on my 1050p monitor. I need to sacrifice a few settings on newer games, but otherwise, this will be able to play any console port in the short term.

      I am thinking I won’t need to buy anything else until AMD releases a killer 16nm APU. I got bored with dGPUs and their power consumption and noise.

  21. racccoon says:

    My less-than-a-month-old i7 quad 4.4GHz CPU has made my life like a dream. Before, I was struggling to operate on the old 3.3GHz Intel: Windows took ages to load, like over 2-3 minutes or so, and it was generally a tired-out effort to do tasks and load games. Now with the i7 in, everything works so much faster; the comp loads up at incredible speeds, in less than 40 seconds Windows is open and everything runs real smooth! I love it! :) So I really don’t see your argument at all.

  22. tormeh says:

    You can’t benchmark CPUs based on frames per second! The rendering may happen on one thread with priority, and all you’re really measuring is how fast that thread gets executed, with total disregard for everything else. I had an AMD FX-4100 (the cheapest quad-core of its time) and, to top it off, an ASUS AI Suite program I had foolishly installed underclocked it by half without properly telling me. Ran all games great, until Dragon Age: Inquisition showed up. The frame rates were totally fine, mind you, but the game world wouldn’t load fast enough. NPCs would talk back to me a minute after I clicked on them, the in-game time would completely pause from time to time (I could rotate the camera but nothing else) and sometimes the game would straight-up give me a loading screen in the middle of a level because the processor couldn’t keep up. But the FPS was smooth at all times.

    • TacticalNuclearPenguin says:

      Ah yes, gotta love that sneaky bloatware.

      Thankfully on desktop it’s more or less a “fool me once…” kind of deal if you’re installing stuff yourself, laptop/OEM users have it far rougher.

  23. montorsi says:

    Why you do: because you never want to close anything ever.

    • letoeb says:

      Isn’t that more a “get a metric fuckton of RAM” kind of problem, though?

  24. FecesOfDeath says:

    It’s not that more cores slow down computing. It’s that memory bandwidth and latency have not improved at the same rate as other PC components, so they can’t make up for the higher chance of instruction or data cache misses as the number of cores increases. When there is a cache miss, the CPU must retrieve the freshest data or instruction from RAM, and as some of you may know, retrieving data from RAM is much, much slower than retrieving it from any level of cache (L1, L2, or L3). Basically, it’s the penalty of maintaining cache coherence that’s slowing down the game performance of 6+ core CPUs.
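
    To make that coherence penalty a bit more concrete, here is a minimal, hypothetical C++ sketch (not from the comment above) of one facet of it: two threads hammering counters that sit on the same cache line versus counters padded onto separate lines. The struct names and iteration count are arbitrary.

    ```cpp
    // False-sharing demo: both variants do the same work, but in Unpadded the two
    // counters share a 64-byte cache line, so the cores keep invalidating each
    // other's copy of that line; in Padded each counter owns its own line.
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>

    struct Unpadded { std::atomic<long> a{0}, b{0}; };        // a and b share a cache line
    struct Padded   { alignas(64) std::atomic<long> a{0};
                      alignas(64) std::atomic<long> b{0}; };  // one 64-byte line each

    template <typename Counters>
    static double run_ms() {
        Counters c;
        auto work = [](std::atomic<long>& n) {
            for (long i = 0; i < 10000000; ++i) n.fetch_add(1, std::memory_order_relaxed);
        };
        auto t0 = std::chrono::steady_clock::now();
        std::thread t1(work, std::ref(c.a)), t2(work, std::ref(c.b));
        t1.join(); t2.join();
        return std::chrono::duration<double, std::milli>(std::chrono::steady_clock::now() - t0).count();
    }

    int main() {
        std::printf("shared cache line : %.1f ms\n", run_ms<Unpadded>());
        std::printf("padded counters   : %.1f ms\n", run_ms<Padded>());
    }
    ```

    It’s an illustration of one mechanism rather than a game benchmark, but the padded version typically runs several times faster on a multi-core machine, which shows why extra cores aren’t free once they start contending for the same memory.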

    • Asurmen says:

      Then why does actually getting faster memory change nothing? There have been tests showing that the difference between ‘slow’ memory and the current fastest DDR3 gives very little gain in games.

  25. aliasi says:

    It also depends what you favour, game-wise. The person who likes AAA titles and/or sims needs the muscle; someone like me who is into retro and indie stuff, or even MOBAs, is sitting pretty for a while. My Shovel Knight experience is not improved all that much by 16 cores.

  26. zehooo says:

    While you can make do with 4 cores I disagree that there is no reason for more than 4 currently. Quite a few new games make proper use of my 6 core Xeon, and this trend will only continue since multiplatform games will need to be heavily multithreaded to take proper advantage of the weaker 8 core cpu in the consoles.

  27. KeyboardGato says:

    Does this mean my i7 2600 will still hold up?

  28. Ejia says:

    What am I looking at in the third picture? It looks a bit blurry. Of course it could also be my eyes, as I can’t find my glasses.

  29. petrucio says:

    I don’t think you’re really getting the picture of the future that Intel was trying to paint here. Either one or 16 cores might as well be the very same thing when we are talking massively parallel processing. Raising the number of cores is not about racing against Moore’s law – it’s about changing the paradigm and reaching something that more closely resembles human processing, which is massively parallel. This will require millions of cores, and this talk about a few cores or a few dozen cores makes no sense in that regard.

    I’m not saying that such a massively multi-core vision is attainable (although I certainly think it is, just not in a way that makes sense for us to talk about purchasing options now), just that your article is not compatible with the premise it set out to discuss.

    • Don Reba says:

      We don’t need millions of cores to simulate human intelligence. Any parallel process can be performed on serial hardware.

      • petrucio says:

        Of course it can, but that’s what Intel meant by ‘massive multi-core’.

        Eventually we’ll get to several dozens and then hundreds of cores, and game engines will shift to take greater advantage of that. But talking about 4 cores and 8 cores as if there is or should be a difference between them is a bit silly.

        Though it’s not entirely impractical advice: if you are currently thinking about spending extra money to get an 8-core processor, then by all means, don’t. Just don’t expect to be stuck with 4 cores forever because there’s no point in going further. There will be a point eventually, just not necessarily at 8, 16, or even 32.

  30. Jiskra says:

    How does your recommendation not to upgrade to the new 6/8-core socket 2011-v3 chips square with your enthusiasm about the number of PCI-E lanes these chips have for SSD usage? I am confused now.

    Last year I bought a new i7 5820K after your article here, so apparently I am worse off than with the usual quad-core socket 1150 chips?

  31. Lachlan1 says:

    “Check out these numbers on Bit-Tech for Battlefield 4, a game supposedly renown for scaling beyond four cores. Yup, thoroughly GPU limited.”

    Why oh why can’t people understand that it’s the MULTIPLAYER component of the BF games that is CPU intensive? The hardware sites never test it, as they can’t get a reproducible run the way they can in single-player. I don’t know whether it scales well in multiplayer beyond 4 cores or not, but linking single-player benchmarks (and I assume these ones are single-player benchmarks) is misleading.

    • Person of Interest says:

      Can you share some evidence for this? Something like a screenshot of a CPU utilization graph during a multiplayer session?

      • Lachlan1 says:

        link to tomshardware.com

        This is from the beta so may not be representative. However, just overclock your CPU and use the drawfps and perfoverlay options and you’ll most likely see the minimum fps go way higher. If you think about it, it makes sense… bullets are affected by gravity and a lot can be going on at once. On my 2500K at stock there is a significant difference in minimum fps vs 4.3GHz, as there was with BF3.

        • quidnunc says:

          Yeah, I’ve got an old Phenom II X4 in one computer and it crushes BF4 single-player, but I go into 64-player multi and it’s barely playable even when I turn down the graphics options.

          • quidnunc says:

            Meanwhile the other computer, with an i5 and an old 6950 video card, runs much better. That’s not extra cores, just much faster per-core performance. But their engine supposedly scales well, so you could get by with a weaker CPU in Frostbite games if you have 6+ cores.

  32. czerro says:

    4+ cores is kind of a niche… but not really.

    Lots of gamers nowadays have an interest in streaming/recording, and related cross-interests in digital video editing/encoding. These things benefit hugely from more cores. It’s one of the reasons the 8320 was popular in its generation, and still is to this day. Cheap, highly overclockable on air with a decent heatsink, and 8 cores make it a swiss-army knife of multi-tasking.

    I know that’s a real niche choice, and an 8-core 8320 is not ‘truly’ 8 cores, but the benefits are quite obvious. This is a weird article choice when DX12 is right around the corner, and OpenGL/Mantle have, or are expected to have, similar methods to make management of multiple CPU/GPU cores less problematic. So… yes… in the near future, having more than even 4 cores should have an advantage, even if you aren’t doing anything especially tasky.

  33. OmNomNom says:

    As much as I love RPS, I have been finding these ‘need’ articles miss the mark a little. They seem to mostly be written to reassure budget and casual gamers that their existing systems are fine.

    There is nothing wrong with playing older games with your few years old 4 core, dated 60hz IPS screen and battered old (non) gaming devices but as an ENTHUSIAST site RPS also caters to the people who want to spend a little more and go the extra mile for the best experience with new titles.

    It’d be nice to see some benchmarks linked on other sites, or better still for RPS to conduct their own benchmarks… Unless they don’t have any of the hardware they are describing, in which case why are they writing these articles anyway?

    • OmNomNom says:

      Okay so they did link to articles this time, my bad. But as someone noted the only game here known to have decent scaling (because it wasn’t originally only designed with consoles in mind) is BF4. But they don’t include the multiplayer where the extra cores and HT are more noticeable.

    • amateurviking says:

      This article is absolutely for people worried that they might need an upgrade that they demonstrably don’t (for now: like it or not the prevailing console architecture will define the target specs for the majority of games for the next 4-5 years).

      Jeremy’s done plenty where he’s gotten overexcited about sexy DDR4, octocore, balls-out hotness bits too. And there’s certainly no lack of ‘why you simply must have an i7 5820K, 32GB RAM and 4 GTX 980s or you’re not a true gamer’ articles out there.

    • Jeremy Laird says:

      I’ve done five of these stories. Three of them have encouraged readers to buy stuff / told them why they need stuff, so I am unclear how you are drawing your conclusions.

  34. SuicideKing says:

    Only cases I see at the moment for >4 cores: gaming + workstation work, recording gameplay or streaming.

  35. mikmanner says:

    I got a 6 core for work but I noticed a definite improvement in AI heavy missions in Arma and also FSX. But yeah it’s overkill for most games. That’s why they call me Captain Overkill.

  36. CookPassBabtridge says:

    There is one gaming engine that scales very well over 4 cores, and that’s CryEngine. When buying my 5820 I remember a benchmark showing a 20-30fps jump using 6 cores over 4. There is stuff out there that will do it.

    I’ve not yet seen any VR in CryEngine. That could be interesting with a capable rig.

  37. big boy barry says:

    RPS tech reports are increasingly becoming “don’t buy this” or “you don’t need that”. Half the fun of PC gaming is the tech.

    • CookPassBabtridge says:

      I think broadly speaking there are two varieties of PC gamer. Or at least the two ends of the spectrum: Gamers who want to game and their home PC happens to be a conveniently available platform (50% of users in the latest Steam survey are on dual core and onboard graphics, which could suggest ”it’s the PC in the corner we usually use to go on the internet”) and those at the other end for whom the tech itself is a hobby too.

      I would count myself at the latter end, I like finding out about shiny new tech and seeing what I can do with it. Whilst not an extreme overclocker, it was great fun trying to get the most of my new system and playing with lots of lovely coolers and fans and things, and putting together an aesthetically pleasing build. The project of building it and making it work was as big a part of the fun as using it, especially as it gave me some nice performance in my flight sims and VR (which is what I built it for). It was hard work and at times infuriating, but I love ‘my’ rig that I put my blood, sweat and stress hormones into.

      I think RPS probably has a good cross section of both types, but they will probably irritate each other like an elderly couple given that their aims and desires are different.

    • Jeremy Laird says:

      I’ve done five why you need / why you don’t need posts. Three out of five have been calls to action to buy stuff. But never mind the facts!

      Couple of posts ago there were people complaining the new series was only about telling people to go and spend their money. Can’t win!

      • big boy barry says:

        Yep, my bad, sorry, I just felt a little persecuted. I just built an X99 system having had a 2500K for 4 years. The last 2 tech reports have been “no need for DDR4” and now “no need for an i7”. Haha, just reminding me what I already know really. My Mrs bought me the RAM for Xmas. Selling the old parts bought the 5820K, so it only cost £300 out of my own pocket for the build; gotta be happy with that.

  38. Agricola says:

    Built a Nehalem rig in summer 2009: i7 920 with 6 gigs of RAM. Apart from two GPU changes and new SSDs, it’s going strong. Love to see articles like this which vindicate my reluctance to upgrade!

    I was more into the tech side of things 5/6 years ago than I am now. I suppose the dramatic slowdown in the PC upgrade race has made me lazy. I game less than I did then too. The last title I put serious hours into was Alien Isolation late last year. Playing at max res with everything (bar, I think, shadows) maxed out, the system performed flawlessly. Very slick framerate, always in the low to mid 40s at least. And all this from a 6-year-old CPU which I haven’t even bothered overclocking beyond 3GHz yet.

    I’ll hold off another while longer.

  39. drewski says:

    I’m having trouble aligning the author’s “most systems are GPU limited” attitude with his “AMD are rubbish apart from for totally budget boxes” position.

    If I can save $200 on an AMD CPU that’s still pretty beefy, and put that on a 970, isn’t that a better trade off than an i5 and a lesser videocard? Won’t it *still* be the videocard that is limiting my tasty tasty frames?

    • Jeremy Laird says:

      Apologies, possibly not totally clear what I was saying. AMD’s current (relative) strength with Bulldozer chips is multithreading, which is the opposite of what you currently need – which is a few strong threads, not lots of weak ones.

      • drewski says:

        So… I would be better off still saving the money but going for the closest Intel CPU, even if that has far fewer cores?

        For example, I’d be better off with an i3-4160 versus the FX-6300 if they were roughly the same price. (And a GTX970).

  40. b0rsuk says:

    Jeremy Laird, I’ll remind you of this article when raytraced games become commonplace.

    Raytracing makes light look amazing, and lets you do away with the complicated hacks used to make many visual effects. In the words of John Carmack, these things become “embarrassingly easy”. And the kicker is, raytracing is trivial to calculate in parallel (there’s a rough sketch of this below). There’s almost no upper limit on how many cores you can add.

    The last estimate I’ve seen is that you could get 18fps real-time raytracing at 800×600 resolution if you have 8 cores. That’s within reach.

    Raytracing is not a silver bullet, it has a few difficulties, but it comes with upsides too. For example, raytracing doesn’t much care how many objects there are, whereas with rasterised 3D performance drops quickly. With raytracing and associated techniques, things in front naturally obscure those behind, which then don’t get calculated at all.

    Global Illumination is another thing worth typing into the YouTube search box.
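
    As for the “trivial to calculate in parallel” point, here is a minimal, hypothetical C++ sketch (an illustration, not taken from the comment or the article) of why that is: every pixel’s primary ray is independent, so rows of the image can simply be split across however many hardware threads exist, with no locks or shared state. The single hard-coded sphere and the 800×600 resolution are arbitrary choices echoing the figure quoted above.

    ```cpp
    // Minimal parallel ray tracer: one sphere, simple diffuse shading, greyscale
    // PGM output. Each thread renders an interleaved set of rows; pixels are
    // fully independent, which is what makes ray tracing scale across cores.
    #include <cmath>
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct Vec { double x, y, z; };
    static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Shade one primary ray from the origin against a unit sphere at (0, 0, -3).
    static double trace(double px, double py) {
        Vec dir{px, py, -1};
        double len = std::sqrt(dot(dir, dir));
        dir = {dir.x / len, dir.y / len, dir.z / len};
        Vec oc{0, 0, 3};                                   // origin minus sphere centre
        double b = 2 * dot(oc, dir);
        double c = dot(oc, oc) - 1.0;
        double disc = b * b - 4 * c;
        if (disc < 0) return 0.1;                          // ray misses: dim background
        double t = (-b - std::sqrt(disc)) / 2;
        Vec n{t * dir.x, t * dir.y, t * dir.z + 3};        // surface normal (unit length)
        Vec light{0.577, 0.577, 0.577};
        double d = dot(n, light);
        return d > 0 ? d : 0.0;                            // simple diffuse term
    }

    int main() {
        const int W = 800, H = 600;
        std::vector<double> image(W * H);
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;

        // Each thread takes every Nth row; no pixel depends on any other.
        auto worker = [&](unsigned id) {
            for (int y = static_cast<int>(id); y < H; y += static_cast<int>(cores))
                for (int x = 0; x < W; ++x) {
                    double px = (2.0 * x / W - 1.0) * W / H;
                    double py = 1.0 - 2.0 * y / H;
                    image[y * W + x] = trace(px, py);
                }
        };
        std::vector<std::thread> pool;
        for (unsigned i = 0; i < cores; ++i) pool.emplace_back(worker, i);
        for (auto& t : pool) t.join();

        std::printf("P2\n%d %d\n255\n", W, H);             // plain-text greyscale image
        for (double v : image) std::printf("%d\n", static_cast<int>(v * 255));
    }
    ```

    Doubling the thread count roughly halves the render time here precisely because nothing is shared between pixels; that independence is what the comment is pointing at, and it is also why, as the replies below note, real-time ray tracing tends to end up on the GPU, which offers thousands of such independent lanes.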

    • Jeroen D Stout says:

      When raytracing comes around we shall look at screenspace reflections and laugh and weep uncontrollably, while chanting “never again”.

      (I am already doing that right now, actually.)

    • joa says:

      I think if ray tracing ever becomes commonplace (and that’s a big if — even non-realtime applications, e.g. CG for movies, don’t even bother with ray tracing) then it will undoubtedly be executed on the GPU.

    • Jeremy Laird says:

      Ray tracing will surely be done on the GPU, not the CPU. Global illumination already is.

  41. TemplarGR says:

    This article is wrong. While it makes some correct observations, the conclusions it reaches are false.

    There is a reason games don’t show a real improvement after 4 cores, and the reason is that they are not written for more cores… 6+ cores are not mainstream. Intel doesn’t have any aside from expensive enthusiast/server chips, and AMD has only a few that haven’t sold that well. It is obvious, then, that developers won’t really exploit more cores.

    For developers to fully utilise more cores, they need to become mainstream first. It’s the same situation as when dual cores and quad cores first appeared.

    Actually, gaming could really benefit from more cores, with richer AI, bigger and more destructible environments, better physics etc.

  42. Rack says:

    No, you’re going to need 8 cores. The consoles have 8 cores and if you think just because an i5 is twice as fast it can manage on half as many cores you obviously haven’t been gaming very long.

    Console ports will be designed around 8 cores, and if they can’t find 8 cores they’ll split the tasks designed for cores 1-4 across cores 1-4 then put the workload designed for cores 5-8 onto core 1. Which is also where windows is going to be throwing tons of irrelevant and unnecessary crap. They’ll also be horrendously poorly optimised in terms of how individual tasks are managed. The better ports will let hyperthreading work but otherwise that i5 4690k with a titan x is going to chug like crazy on low settings.

  43. luke_osullivan says:

    I have to say my own experience largely reflects the article. I have the i7 920, which was the first generation of the modern quad-core CPUs if I’m not wrong, and with a motherboard preset overclock to 3.2GHz it still seems to run most games very well. I also have a GeForce 680 and 12 gigs of RAM, so it’s a reasonably well-specced system. But I am thinking of upgrading because, as other people have pointed out, although newer CPUs don’t necessarily offer massive improvements they are much more power efficient, and newer motherboards come with features like USB 3 that my old Asus Rampage II just doesn’t have.

    The only game I’ve found in 2014 that lagged was Watch Dogs and I think by common consent that wasn’t a good port. I had to turn down the graphics so much to get it to run smoothly that I didn’t enjoy it. Dragon Age Origins still works nicely for me with most of the settings turned up. So I’m not feeling great immediate pressure to change anything. But then I am wondering whether this will change soon because I don’t know how well this system will cope with other things I’d like to play this year, especially GTA V and the Witcher 3. I actually built this current machine because my old dual-core AMD cpu couldn’t cope with GTA IV and I’m half-expecting that GTA V is going to have the same effect.

    So my question is, putting on one side issues of power consumption and other features, if I just put in another high-end video card (e.g. a 980) would I see a major jump in graphics performance which will allow me to run games with the graphics turned most of the way up and keep this current system for another 1-2 years, or will it be bottlenecked by the cpu? Advice from those who understand these things better than myself would be welcome.

    • Initialised says:

      Hey other Luke!

      I also have an i7 920 (clocked at 3.8GHz, released Nov 2008) and the AMD 7970 (clocked at 1GHz, released December 2011). My system should feel ancient and slow but it doesn’t, thanks to diversification of the gaming market. I’m probably going to upgrade the card to AMD’s second round of DX12 cards. 680 to 980 might be a decent jump in GPU-limited games, maybe 60% better. But I can’t see a CPU upgrade as being worthwhile for maybe 3 years. That said, going from an E4500 to a Q6600 was quite a jump, but I was running Crossfire at the time and I remember similar anti-quad-core naysaying back then. They were wrong then, and eventually the anti-eight-core naysaying of this article will sound short-sighted.

      • luke_osullivan says:

        Thanks :-) Exactly what settings are you using for the CPU to get it to 3.8, can I ask? It’s good to hear that a 980 would be a 50%+ gain on the current i7 920. I agree that at some point we’ll need 8 cores, but if that’s up to 3 years away, maybe I should really hold off on the whole new system for a while longer. In another couple of months I’ll know exactly how GTAV and the Witcher 3 will play anyway…

  44. letoeb says:

    Hard to disagree with this, but from recent memory: wasn’t AssCreed: Unity severely CPU-limited? I seem to remember that it wasn’t possible to reliably push the framerate beyond 30 no matter which GPU you threw at it. Kind of makes sense, too, given how the obscene number of NPCs (and thus AI agents, apparently) would be pretty CPU-heavy. Also seems like the kind of thing that would benefit from heavy multithreading.

    But then again – it’s just Unity, so who gives a damn.

  45. Mint says:

    All these people arguing that they have 3 million tabs open and watch movies while playing and even have 2 games running at the same time because “some games have boring bits” are really pushing the argument as far as they can beyond any reasonable capacity.

    The point made in the piece here was and is “do you really need more than 4 cores for GAMES”.

    answer: No

    If one chooses to have 4 billion tabs open with an HD movie running, a second game in the background, multiple office programs and a Justin Bieber mp3 playing then, sure, you might need two 8-core CPUs. But that was not the point of the article now, was it?