  1. #1

    Will 8-thread CPUs become relevant for gaming?

    *Disclaimer* I may have gotten some terminology wrong.

    Will CPUs such as the Core i7 become more relevant for gaming now that the PS4 and Xbone have 8-threaded CPUs? Will this provide motivation to develop game software that can take advantage of this feature?

  2. #2
    Network Hub
    Join Date
    Feb 2013
    Location
    Wales
    Posts
    200
    Hopefully. I'm expecting this to take at least 2-3 years to become reasonably common though.

  3. #3
    Secondary Hivemind Nexus Boris's Avatar
    Join Date
    Apr 2012
    Location
    Netherlands
    Posts
    1,392
    I doubt it for now. The XBone and PS4 CPUs might be highly threaded, but they're not all that fast.

    However, when developers figure out threading more (it's really hard) and it gets implemented in more engines, it'll become more and more needed. I would also say 2-3 years.

    A lot of stuff is just annoyingly hard to split up, and threading itself is not without pitfalls.
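
    To give an idea of the classic pitfall, here's a quick toy C++ snippet (a made-up example, not from any real engine): four threads bump a plain counter and an atomic one, and the plain counter usually comes up short because increments get lost in the race.

    Code:
    #include <atomic>
    #include <iostream>
    #include <thread>
    #include <vector>

    int main() {
        long plain = 0;               // unsynchronized counter (racy on purpose)
        std::atomic<long> safe{0};    // atomic counter (race-free)

        auto work = [&] {
            for (int i = 0; i < 1000000; ++i) {
                ++plain;                                      // data race: increments get lost
                safe.fetch_add(1, std::memory_order_relaxed); // atomic: never loses an increment
            }
        };

        std::vector<std::thread> pool;
        for (int t = 0; t < 4; ++t) pool.emplace_back(work);
        for (auto& th : pool) th.join();

        std::cout << "plain:  " << plain << "\n";        // usually well under 4000000
        std::cout << "atomic: " << safe.load() << "\n";  // always exactly 4000000
    }

    Now imagine that sort of bug hiding somewhere in a physics or AI system and you can see why engines are slow to go wide.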

  4. #4
    Secondary Hivemind Nexus Grizzly's Avatar
    Join Date
    Jun 2011
    Location
    The Archbishopric of Utrecht
    Posts
    1,615
    They already are. Games like Crysis 3 and Battlefield 4 are already capable of utilizing 8 threads. Take a look at these benchmarks, for example. Or, more specifically, this image.

    Considering that BF4 uses the "Frostbite 3" engine, you can most likely expect similar multi-core support showing up in other games (as, apparently, EA now has "Frostbite for everyone!" as its motto). It's already here, but not everywhere.

    Do note that the PS4 and Xbox One use 8 actual cores, which is different from 8 threads (for example, an i7 has 4 cores and 8 threads; the FX-8350 has 8 cores and 8 threads).
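
    If you want to see what your own machine reports, here's a tiny C++ sketch (just an illustration; the standard library only exposes logical threads, so an i7 and an FX-8350 can both print 8):

    Code:
    #include <iostream>
    #include <thread>

    int main() {
        // Logical threads, i.e. hardware threads the OS can schedule on.
        // May return 0 if the value can't be determined.
        unsigned logical = std::thread::hardware_concurrency();
        std::cout << "Logical threads reported: " << logical << "\n";
        // Physical core count needs a platform-specific API (e.g.
        // GetLogicalProcessorInformation on Windows or /proc/cpuinfo on Linux);
        // the C++ standard library doesn't distinguish cores from threads.
    }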
    Last edited by Grizzly; 16-12-2013 at 12:16 PM.

  5. #5
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Quote Originally Posted by Grizzly View Post
    They already are. Games like Crysis 3 and Battlefield 4 are already capable of utilizing 8 threads. Take a look at these benchmarks, for example. Or, more specifically, this image.
    That's highly unreliable information.

    And here's a newer game:

    [benchmark chart: Call of Duty: Ghosts CPU scaling; image not shown]

    No real benefit from having 8 cores.

  6. #6
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Quote Originally Posted by Lone Gunman View Post
    *Disclaimer* I may have gotten some terminology wrong.

    Will CPUs such as the Core i7 become more relevant for gaming now that the PS4 and Xbone have 8-threaded CPUs? Will this provide motivation to develop game software that can take advantage of this feature?
    Not really, because 4 Haswell cores can run circles around the 8 Jaguar cores of the PS4 and Xbone. Those are low-performance cores. So there's not a huge incentive for devs to suddenly become serious about multithreading. It'll probably continue to improve at a steady (slow) pace.

  7. #7
    Secondary Hivemind Nexus rockman29's Avatar
    Join Date
    Jul 2013
    Posts
    1,491
    Doesn't that make them MORE inclined to use more threads?

    To extract more performance out of the machines they make most money on?

    Developers and publishers don't compete on consoles now?

    Making even more highly threaded applications is the future; how does that exclude consoles when they both use 8-thread x86_64 CPUs?

    The console is a fixed platform hardware-wise, and a good testbed to keep trying out more multithreaded coding, no?
    Last edited by rockman29; 16-12-2013 at 02:00 PM.

  8. #8
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Quote Originally Posted by rockman29 View Post
    Doesn't that make them MORE inclined to use more threads?

    To extract more performance out of the machines they make most money on?

    Developers and publishers don't compete on consoles now???

    Making even more highly threaded applications is the future, how does that exclude consoles when they both use 8 threaded x86_64 CPUs???

    The console is a fixed platform hardware wise, and a good testbed to keep trying out more multithreaded coding, no???
    It makes them more inclined to push multithreading on the consoles, but not on the PC where they can just brute-force it with 3-4 much faster cores. But multithreading was never a problem on consoles - the PS3 had an exotic processor cluster with 7 wannabe cores, and they squeezed out every little bit of performance it was capable of.
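
    To put rough numbers on the brute-force point, here's a quick Amdahl's-law toy calculation (the 70% parallel fraction and the "half speed" Jaguar cores are made-up illustration figures, not measurements): 4 fast cores still come out ahead of 8 slow ones, because the serial part of the frame has to run on a single slow core.

    Code:
    #include <cstdio>

    // Time is measured relative to one "fast" core doing the whole job alone.
    double speedup(double parallel_fraction, int cores, double per_core_speed) {
        double serial   = (1.0 - parallel_fraction) / per_core_speed;
        double parallel = parallel_fraction / (per_core_speed * cores);
        return 1.0 / (serial + parallel);
    }

    int main() {
        double p = 0.70;  // assumed fraction of the work that threads nicely
        std::printf("4 fast cores (1.0x each): %.2fx\n", speedup(p, 4, 1.0));  // ~2.1x
        std::printf("8 slow cores (0.5x each): %.2fx\n", speedup(p, 8, 0.5));  // ~1.3x
    }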

    Consoles use x86-64 CPUs now, but the games will still be coded differently from the ground up. If they can get their PC port done faster by not bothering to optimise multithreading, chances are they will.

  9. #9
    Secondary Hivemind Nexus Grizzly's Avatar
    Join Date
    Jun 2011
    Location
    The Archbishopric of Utrecht
    Posts
    1,615
    Quote Originally Posted by Sakkura View Post
    That's highly unreliable information.
    Wait what? Why?
    How about this, then? The article specifically mentions the game utilizing 8 cores and everything.

    Or perhaps it's just on my end; to be frank, CryEngine 3 and Frostbite 3 are currently the only engines I know of that utilize 8 threads, and their example should not be taken as representative of the whole industry?

    And here's a newer game:

    No real benefit from having 8 cores.
    Hmm. I don't think one should be looking at CoD: Ghosts for this, since the game has a reputation for being extremely poorly optimized for the PC.

  10. #10
    Secondary Hivemind Nexus rockman29's Avatar
    Join Date
    Jul 2013
    Posts
    1,491
    As far as I know, KZ: SF from Guerrilla Games already uses 6 cores... and I really don't see how DICE would have got 64 players on consoles without a highly multithreaded engine running on the weak CPUs of the consoles...

    I thought one of the biggest selling points of DICE's Frostbite 3 engine was that it's scalable... to me that indicates they want to be as efficient as possible, and that sounds a lot like wanting to do lots of multithreaded stuff.

    Quote Originally Posted by Sakkura View Post
    It makes them more inclined to push multithreading on the consoles, but not on the PC where they can just brute-force it with 3-4 much faster cores. But multithreading was never a problem on consoles - the PS3 had an exotic processor cluster with 7 wannabe cores, and they squeezed out every little bit of performance it was capable of.

    Consoles use x86-64 CPUs now, but the games will still be coded differently from the ground up. If they can get their PC port done faster by not bothering to optimise multithreading, chances are they will.
    But the PS3 used PowerPC and in-order architecture, and it was 32 bit, and technically it was really only doing 1 to 2 threads...

    Wouldn't the same architecture on PS4 and PC mean there is more translatable stuff going on here?

    That's highly unreliable information.

    And here's a newer game:
    But Call of Duty: Ghosts is the same PC game that requires a minimum of 6 GB of RAM... that doesn't sound like a very representative case... Activision is famous for crappy PC ports of CoD...
    Last edited by rockman29; 16-12-2013 at 03:05 PM.

  11. #11
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Quote Originally Posted by Grizzly View Post
    Wait what? Why?
    How about this, then? The article specifically mentions the game utilizing 8 cores and everything.

    Or perhaps it's just on my end; to be frank, CryEngine 3 and Frostbite 3 are currently the only engines I know of that utilize 8 threads, and their example should not be taken as representative of the whole industry?



    Hmm. I don't think one should be looking at CoD: Ghosts for this, since the game has a reputation for being extremely poorly optimized for the PC.
    Are you kidding me? Look at that graph. The Core i3-3220 gets 95 FPS, the Core i7-4960X gets 98 FPS. The FX-8350 and FX-6350 both land on 96 FPS. And so on... there's far too much of a GPU bottleneck to properly evaluate CPU performance, and there's certainly no evidence that the game benefits from 8 cores.
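
    A quick way to see why a GPU bottleneck flattens those numbers (toy model, made-up millisecond figures): the frame takes as long as whichever side finishes last, so past a certain point a faster CPU doesn't move the FPS at all.

    Code:
    #include <algorithm>
    #include <cstdio>

    int main() {
        double gpu_ms = 10.0;                      // assumed GPU time per frame
        double cpu_ms[] = {9.5, 6.0, 3.0};         // slow, mid and fast CPUs (assumed)
        for (double c : cpu_ms) {
            double frame_ms = std::max(c, gpu_ms); // whichever finishes last sets the pace
            std::printf("CPU %.1f ms -> %.0f FPS\n", c, 1000.0 / frame_ms);
        }
        // All three print 100 FPS: the CPU differences are invisible behind the GPU.
    }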

    Crysis 3 is one of the more technically advanced games around. It shows that games definitely can be developed to take advantage of 8 cores. But plenty of other games don't.

    CoD: Ghosts is very poorly optimised, yes. But that's par for the course for console ports. What it does show is that you can't just expect the new consoles to result in perfect 8-core CPU support.

  12. #12
    Secondary Hivemind Nexus rockman29's Avatar
    Join Date
    Jul 2013
    Posts
    1,491
    Does that mean it's a major GPU bottleneck... or more that games generally do not need that much CPU power... and maybe why the consoles opted to shift more budget to GPU rather than CPU?

    As far as I understand, consumer CPU performance has really outpaced software development lately, in games, entertainment and office programs.

    Splitting work into threads seems to be both about making code really scalable over time and about running programs with less need for brute CPU power.

    I still see it as an advantage that the PS4 has eight x86 cores, benefit for my PS4 and benefit for my PC.
    Last edited by rockman29; 17-12-2013 at 12:14 AM.

  13. #13
    Secondary Hivemind Nexus Zephro's Avatar
    Join Date
    Aug 2011
    Location
    London
    Posts
    1,620
    Yeah, CPU power has clearly outstripped actual software development for a few years now. My CPU is barely ever at anything more than 40% utilisation in game. The bottleneck is all GPU- and memory-based.

    Also, it should be noted that caching becomes less efficient the more cores you have on the bus; the MESI protocol starts to really saturate your bus when you hit 16 cores, and after that it actually starts to degrade performance. If you have slower/simpler cores (like the Jaguar or an ARM) you will use up relatively less bus bandwidth, as you're clocked slower. So there's a trade-off of 4 fast cores against more, slower cores.
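
    It's not quite the bus-saturation case described above, but the same coherence machinery is easy to see from code. Here's a toy false-sharing snippet (illustration only; timings depend entirely on the machine) where two threads hammering counters in the same cache line force constant MESI invalidations, while padding them onto separate lines removes the ping-pong.

    Code:
    #include <chrono>
    #include <cstdio>
    #include <thread>

    struct Shared {                     // both counters typically land in one cache line
        volatile long a = 0;
        volatile long b = 0;
    };
    struct Padded {                     // 64-byte alignment puts them on separate lines
        alignas(64) volatile long a = 0;
        alignas(64) volatile long b = 0;
    };

    template <class T>
    double run() {
        T s;
        auto t0 = std::chrono::steady_clock::now();
        std::thread t1([&] { for (long i = 0; i < 50000000; ++i) s.a = s.a + 1; });
        std::thread t2([&] { for (long i = 0; i < 50000000; ++i) s.b = s.b + 1; });
        t1.join();
        t2.join();
        return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
    }

    int main() {
        std::printf("same cache line: %.2f s\n", run<Shared>());   // slow: constant invalidations
        std::printf("padded apart:    %.2f s\n", run<Padded>());   // fast: each core keeps its line
    }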

    It'll probably become really common when Intel sorts out a micro-architecture that suits it better. That, and when programming tools catch up; writing threaded code is a doddle, but debugging it... ick.

  14. #14
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Depends entirely on the type of software and games. Console ports have naturally been held back by the old console hardware, but many newer games can certainly still be very CPU-demanding. Memory, on the other hand, has become largely irrelevant. Everybody's buying DDR3-1600 even though DDR3-3000 has been around for a while.

    The consoles tend to have restrictions on gameplay, so you won't need a super-powerful CPU. The 8-core Jaguar SoC is an excellent solution, especially when it allows them to just set aside entire cores for their OS tricks. And the fact that everyone is on x86 certainly doesn't hurt; it just doesn't necessarily automagically lead to better console ports or better coding of games on PC.

  15. #15
    Secondary Hivemind Nexus
    Join Date
    May 2012
    Posts
    1,438
    Quote Originally Posted by rockman29 View Post
    Does that mean it's a major GPU bottleneck... or more that games generally do not need that much CPU power... and maybe why the consoles opted to shift more budget to GPU rather than CPU?

    As far as I understand, consumer CPU performance has really outpaced software development lately, in games, entertainment and office programs.

    Splitting work into threads seems to be both about making code really scalable over time and about running programs with less need for brute CPU power.

    I still see it as an advantage that the PS4 has eight x86 cores, benefit for my PS4 and benefit for my PC.
    Some games max out 36-core servers, in single-player mode:
    http://forums.keenswh.com/post/why-6...ighlight=cores

  16. #16
    Lesser Hivemind Node
    Join Date
    Jun 2011
    Posts
    586
    As CPU power increases, more tasks will be dedicated to it, I'm sure. One excellent example is how graphics developers have been implementing simplified software rasterizers or raytracers on the CPU to supplement the GPU. Basically, run a very, very low-resolution, depth-only render (so the render is just grayscale, with 0 indicating very close and 1 indicating very far) and use this data to optimize rendering. Many cool things can be done with that, but it's quite expensive on the CPU, which was never designed for this sort of thing.
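
    For anyone curious what that looks like in practice, here's a very stripped-down sketch of the idea (hypothetical code; real engines rasterize actual occluder triangles, whereas here the occluder is just a screen-space rectangle): keep a tiny CPU-side depth buffer and skip any object whose screen rect is already covered by something closer.

    Code:
    #include <algorithm>
    #include <array>
    #include <cstdio>

    constexpr int W = 16, H = 16;                 // very low resolution on purpose
    std::array<float, W * H> depth;               // 0 = near, 1 = far

    void clear_depth() { depth.fill(1.0f); }

    // "Rasterize" an occluder: write its depth into the pixels it covers.
    void draw_occluder(int x0, int y0, int x1, int y1, float z) {
        for (int y = y0; y < y1; ++y)
            for (int x = x0; x < x1; ++x)
                depth[y * W + x] = std::min(depth[y * W + x], z);
    }

    // An object is occluded if every pixel of its screen rect already has
    // something closer in the depth buffer.
    bool is_occluded(int x0, int y0, int x1, int y1, float z) {
        for (int y = y0; y < y1; ++y)
            for (int x = x0; x < x1; ++x)
                if (depth[y * W + x] > z) return false;  // some pixel is still open
        return true;
    }

    int main() {
        clear_depth();
        draw_occluder(2, 2, 14, 14, 0.3f);                 // a big wall near the camera
        std::printf("crate behind wall: %s\n", is_occluded(4, 4, 8, 8, 0.6f) ? "skip" : "draw");
        std::printf("crate in front:    %s\n", is_occluded(4, 4, 8, 8, 0.1f) ? "skip" : "draw");
    }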

  17. #17
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    Quote Originally Posted by FriendlyFire View Post
    As CPU power increases, more tasks will be dedicated to it, I'm sure. One excellent example is how graphics developers have been implementing simplified software rasterizers or raytracers on the CPU to supplement the GPU. Basically, run a very, very low-resolution, depth-only render (so the render is just grayscale, with 0 indicating very close and 1 indicating very far) and use this data to optimize rendering. Many cool things can be done with that, but it's quite expensive on the CPU, which was never designed for this sort of thing.
    I think they're putting little coprocessors on the GPU to handle those sorts of things. Like the Maxwell GPUs will have ARM cores in there.

  18. #18
    Lesser Hivemind Node
    Join Date
    Jun 2011
    Posts
    586
    Quote Originally Posted by Sakkura View Post
    I think they're putting little coprocessors on the GPU to handle those sorts of things. Like the Maxwell GPUs will have ARM cores in there.
    Yes and no. I don't think ARM coprocessors will be able to efficiently do things like this for a little while more, and this counts double when you consider that there currently is no standard for this. You'll have more power and wider support by just using a CPU core, on top of pleasing the geeks by being more multithreaded :P

    Also, am I alone in finding the idea of a coprocessor getting its own coprocessor funny? All we need now is for the coprocessor to be a full SoC with its own GPU to come full circle.

  19. #19
    Secondary Hivemind Nexus Sakkura's Avatar
    Join Date
    Jul 2012
    Location
    Denmark
    Posts
    1,330
    You're not.

    Also interesting to see both AMD and Nvidia betting on integration of CPU and GPU cores, just in these different ways. So you can effectively build a system with an AMD APU and an Nvidia APU. ¯\_(ツ)_/¯

  20. #20
    Obscure Node
    Join Date
    Jun 2011
    Posts
    13
    From a developer's point of view, properly multithreading a game is no easy task, and I think it will be something we will only see (if we see it at all in the near future) from the so-called "AAA" developers/publishers.
