Week In Tech: Hands On With Those New Games Consoles

By Jeremy Laird on May 27th, 2013 at 1:00 pm.

Ha, sorry. Not really. But it got your attention. And there’s a thin tendril of truth in it. It’s been a busy week in hardware and in my mortal hands I hold a laptop containing AMD’s Jaguar cores. The very same cores as found in the freshly minted games consoles from Microsoft and Sony. So what are they like and what does it mean for PC gaming?

Meanwhile, Nvidia drops a price bomb of the bad kind and Intel has some new chips on the way. Read on for the gruesome details.

AMD Jaguar, then. The little lappie AMD sent out for evaluation has four Jaguar cores running at 1.5GHz. Both of the consoles are rolling eight Jaguar cores at 1.6GHz.

On a core-for-core basis, then, the AMD A4-5000 chip I have in hand (also known by its codename Kabini, if you care about that kind of thing) is extremely close to the consoles. What’s more, I’ve an inkling it might actually be close in terms of the core count available for games.

Dunno about you, but when I watched that quick-switching, multi-tasking demo for the new Xbox One, I distinctly got the impression that the trick involves keeping everything running all the time.

And that makes me think the CPU cores are going to be partitioned. In other words, games are likely only going to have access to a limited number of cores. On that note, the Jaguar CPU architecture essentially groups cores in modules of four.


The new Xbox One / PS4. Well, bloody nearly…

Communications between those modules are sub-optimal compared with core-to-core comms within a module. That only lends weight to the idea that both consoles will use one quad-core module for games and the other for everything else.
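
For the terminally curious, here’s roughly what that partitioning might look like from a developer’s chair. This is a purely illustrative sketch – Python on Linux, nothing to do with whatever APIs the consoles actually expose, and the four-core split is still my speculation – but the principle of pinning your game to a single module is the same:

    import os

    # Hypothetical split: cores 0-3 are one Jaguar module, reserved for the game;
    # cores 4-7 are the other module, left to the OS and apps.
    # Linux-only, and assumes a machine with at least four cores.
    GAME_MODULE = {0, 1, 2, 3}

    # Pin the current process (and any threads it spawns) to the game module,
    # so nothing migrates across the slower module-to-module link.
    os.sched_setaffinity(0, GAME_MODULE)

    print("Game confined to cores:", sorted(os.sched_getaffinity(0)))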

If that’s true, then the thing lying on the floor next to me right now is almost bang-on in terms of the gaming CPU in the Xbox One and PS4. Is it any good?

In a word, no. Not in a gaming context. To be fair to AMD, the A4-5000 is a nice chip for its intended market – mobile devices at the cheaper end of the scale. It’s got miles more horsepower than existing Atom chips (though Atom is due a major overhaul of its cores very soon).

But as a gaming CPU? Let me give you some numbers. In raw processing terms, these four Jaguar cores have slightly less than a quarter the grunt of a Core i5-3570K. It’s the same story on a core-by-core basis. Less than one quarter of the performance.

Really, this is no surprise. The Jaguar core is a dual-issue item running at roughly half the speed of Intel’s quad-issue desktop cores. It’s a competitor for Intel’s Atom core, not the full-fat Core, er, core. It was always going to be this way.
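
For the napkin-maths fans, that ‘less than a quarter’ figure falls out of issue width times clock speed. It’s a crude measure that ignores caches, turbo and real-world IPC, so treat it as a ballpark rather than gospel:

    # Back-of-envelope throughput per core: issue width x clock speed.
    jaguar = 2 * 1.5e9   # Jaguar: dual-issue at 1.5GHz (the A4-5000 here)
    ivy    = 4 * 3.4e9   # Ivy Bridge: quad-issue at 3.4GHz (Core i5-3570K base clock)

    print(jaguar / ivy)  # ~0.22, i.e. a bit under a quarter per core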

On the graphics side, the Kabini chip shares 3D tech with consoles, too. But the gap is pretty big in terms of functional units – just two of AMD’s GCN units to 12 for the Xbox One and 18 for the PS4. The PS4 in particular has a very different memory architecture, too, so the raw 3D rendering comparison isn’t hugely revealing.


Does make you wonder why the Xbox One is as thick as a whale omelette…

As for what it all means for the PC, well, we’ve touched on this before, but you can make the argument pretty much any which way you fancy. On the one hand, it does rather look like you’ll pretty much never have to upgrade your CPU to cope with the next decade of console ports. Almost any half decent CPU you currently have will be game enough.

It might also encourage anyone who’s thinking about really pushing the envelope of gaming to focus on the PC. I just can’t see how developers are going to really ramp up game engine technology with these CPU cores. Next-gen AI, fancy physics, don’t see how it’s possible.

At this point, somebody will pitch up and opine that they’ll shove some of that onto the GPU. But the graphics grunt in the new consoles is merely OK, so where’s the spare headroom? Pinching GPU resources will limit graphical fidelity.

Then again, it might just mean nobody bothers at all and we’re doomed to suffer a future largely populated with console-compromised ports.

Nvidia and other stuff

Meanwhile, Nvidia dished the dirt on the first part of the new GeForce 700 family, the GTX 780. It’s pretty much as we discussed in posts passim, save for two details. Firstly, pricing. It’s £550 minimum, which I find pretty objectionable and which robs it of much of its appeal.


Too expensive. Next!

Secondly, they’ve knobbled double precision processing. This isn’t hugely relevant to gaming. It’s about preventing people from bagging 780s to use for professional number crunching. As for the rest of the 700 series, let’s assume the 780 is a harbinger of things to come and say that the technical details we’ve already been through are correct but the pricing is a lot higher.

Given that it was pricing that was the main attraction (the 700 series are not new GPUs), I’ve just gone off Nvidia’s new graphics cards in a big way. Oh well.

Finally, Intel’s new Haswell CPUs are imminent. I’ll cover them in more detail when I’ve had a proper play and the NDA has lifted. But I’ll spoil everything right now by saying that for desktop gaming it’s a case of jog on, nothing to see here.

That said, Haswell might just have something to offer in the cheap gaming laptop department. Watch this space.

__________________


134 Comments »

  1. ResonanceSD says:

    Registered just to reply to this article! I’m totally with you on the pricing of the new GTX 780 as the previous x80 chips were ~$500. Bring on the HD 9000 series of Volcanic Islands!!

    However, won’t the x86 nature of the XBone and the PS4 result in better ports for PC games?

    • Grey Poupon says:

      In theory yes, in practice… well, it’s always up to the developer. I doubt either console manufacturer will provide tools for easy porting. While the devs will have fewer problems with the porting, they might just end up spending less effort and money on the ports to maximize profits.

      Often the quality of the port hasn’t been about the difference in architecture anyway, but about the difference in power between the platforms, and the fact that PCs have a huge number of different configurations while the consoles have one. They sometimes end up hardcoding things for the consoles, which then cripples PC ports unless they’re rewritten.

      • ResonanceSD says:

        Well, I thought Microsoft *might*, given that XBone ports to PC would still result in their system being used over the Penguin Brigade or the Starbucks Squadron.

        • Spakkenkhrist says:

          They want us to move over to their console, not to continue using our PCs.

          • Apocalypse says:

            No they don’t. And yes they do.
            Different branches in Microsoft want different things. They fight each other quite often.

        • Grey Poupon says:

          If XBox exclusives get good PC ports, why would anyone with a decent PC buy an XBox over a PS4 (apart from all the TV crap, that is)?

          • Cinek says:

            Why would anyone with a decent PC buy a console? That’s a real question. ’Cause I see no reason whatsoever (especially when Kinect – the only appealing part of consoles – will be available on PCs).

          • InternetBatman says:

            Because Nintendo manages to publish several good and unique games every generation.

          • Cloudiest Nights says:

            And don’t forget Sony! They have the second best exclusives available next to Nintendo.

          • blackmyron says:

            If you’re really a person that will buy a console because of an exclusive… I don’t even know what to say to that.

          • ResonanceCascade says:

            Just an exclusive? No. Multiple good exclusives and some other good features? Sure. I dropped 300 bones on a PS3 and never regretted it for a second. It has some great games, it’s the best Blu-ray player on the market, and it’s a very good way to consolidate the various streaming video services I use.

            That said, with the PS3 being a thing, it makes the PS4 a tougher sell. It has to convince me on games alone. That’s going to take a while.

          • andytt66 says:

            Only reason I own a console-box is for a genre that you simply can’t get on the PC.. namely rhythm games. But I’ve got my copy of Rock Band 3, and my DLC songs, and my plastic instruments.. and the ex-bone isn’t backward compatible. So there is zero reason to upgrade.

            A situation I am perfectly happy with!

          • mickygor says:

            Eh, there are quite a few decent rhythm games on PC.

        • Tacroy says:

          Microsoft has been intent on screwing over the PC gaming market as much as possible ever since Halo 2.

          They’re not going to make porting easy. I think that’s why they intend on having developers use the Kinect as much as possible.

          That being said, PC / XBone / PS4 sharing a common architecture means that if a game comes out for more than one of them it’ll come out for all three (the marginal cost of porting to a third platform is going to be negligible compared to porting to two of them, so why not?), and honestly given the power disparity the PC version is probably going to be the best version.

          Essentially, the only reason to buy a next generation console is either because you don’t want to spend an equivalent amount of money on your PC, or because you want the exclusives.

      • Archonsod says:

        It wouldn’t really matter if they brought the consoles up to the same spec and made porting as easy as simply selecting whether you want a PC binary or an Xbox one. The main problem with the ports is that developers seem to be hell bent on flooding that particular market with tepid, shouty man games possessed of features slightly less interesting than watching paint dry.

        The only decent stuff to come from console land over the past few years has generally come from the Live Marketplace, and has been fairly undemanding on hardware in the first place. Judging by the Xbone trailers I doubt that’s going to change this time around either.

      • Kinth says:

        Well so far most “next gen” third party titles are actually being developed on PC and then ported to consoles.

        Watch Dogs (and by extension we can assume any Ubi third party next gen title) and even Dark Souls 2 (sequel to the worst port of the last decade) have been confirmed to have PC as the lead platform.

        With the architecture of the new consoles being so similar to the PC, it is just easier for developers to stick with what they know and learn the quirks of the consoles as they go.

    • stahlwerk says:

      The fact that Microsoft resorted to running the mini-Windows kernel for Skype etc. as a VM alongside the “Gaming OS” speaks volumes about how much they know performance suffers when running intensive tasks from inside Windows*.

      *) in comparison to the famed all-access programming model of console hardware.

      • xavdeman says:

        Seems inefficient to me to run three different operating systems next to each other just to support some “apps” that nobody will use.

        • iniudan says:

          Actually the 3 OSes are quite simple: the first one is the Hyper-V hypervisor, the second is the Windows-kernel-based OS VM for everything that isn’t gaming, and the last is the fixed-spec gaming VM.

          The reason for that division is so they can update everything but the gaming side of the system down the line. It also gives them backward compatibility on virtually any future generation of gaming system, as they just need to make the VM available on it.

          But that type of architecture also makes it very easy to emulate, as all you need to do is replicate the VM, so it could end up the most pirated gaming console. Or, if Microsoft aren’t stupid, they are actually planning to license the VM on Windows in the future, once Hyper-V performance becomes viable when not running on a dedicated machine.

          It’s actually one of the few good decisions they presented about the Xbone during that conference (too bad they spent almost no time on that technical point). Let’s just hope, for their sake, they aren’t clueless about how to use it (which I suspect they will be, if you consider the rest of the presentation).

      • LionsPhil says:

        Hunh, got anything technical on this?

        I know at one point Microsoft Research were playing about with a minimal NT environment, with no Win32 on top, but disappointingly played down it ever being used for anything. NT already has API personality support, so I’m wondering if they’ve built a game-oriented one on top to sit alongside something next-gen-Win32-ish for the applications.

      • Grey Poupon says:

        My guess would be that the third “backbone OS” is there so that the two others are isolated from each other and can be patched independently. Rebooting one wouldn’t influence the other. When the mini-Windows viruses start spreading from console to console, the gaming side of the OS is likely to remain uninfected.

        If both sides get their dedicated core cluster from the CPU, that’s already 50% of processor performance down the drain. You really have to cripple your Windows to get anywhere near that level of overhead.

        • LionsPhil says:

          I am not sure why you would hard-allocate a proportion of cores to anything unless there are serious hardware issues behind it. (Game developers may want to assert certain core affinities depending on memory capability, but making it a fundamental constraint of the console seems an utterly barmy expectation. Full-blooded desktop Windows running a heap of background services and general idling apps doesn’t need four cores to itself, for pity’s sake.)

          • Grey Poupon says:

            It’s not something I’d imagine them doing either, just went from what I read in the article. Was just trying to make a point that if such were the case, it’d be obvious the “mini-windows” isn’t running on a VM to save resources.

          • Archonsod says:

            HyperV does tend to suck when it comes to dynamic resource allocation.

            I suspect the decision was more to keep it in console territory rather than pseudo-PC though. If you were going to let it shunt resources back and forth then you’re going to get variable performance depending on what each VM is doing. Not only could that affect development, but it’s going to encourage people to start trying to tweak it; and for many people spending ten minutes tweaking Windows to play a game is why they went to consoles in the first place.

  2. Lord Custard Smingleigh says:

    You really are pushing the boundaries of cutting edge photomanipulation with that title picture. It’s like I’m really seeing those hands.

    • ResonanceSD says:

      Dude you can’t just up and say those hands aren’t real, I thought they were real! What’s next on the Lord Custard Smingleigh agenda, telling kids that Santa’s not real either? What a cad!

    • analydilatedcorporatestyle says:

      I get the feeling a lot of hard work went into getting the end result. Firstly Jeremy had to position Cara, then she had to remain motionless while Jeremy ignited the flash powder from under his cover, times have moved on from using a camera obscura to paint a lifelike picture.

      • Sleepymatt says:

        So… the question is, are Cara’s souperpowers more like those of Popeye, or those of Blade – does the pea-infusion cause or prevent the transformation?? Either way, just think how many glass-plates poor Jeremy must have wasted trying to time the soup-ingestion to perfection!

    • guygodbois00 says:

      This whole site is cutting edge, don’t ya know?

    • stahlwerk says:

      I wonder who at Microsoft had the smashing idea of adding a “hovering ca. 10cm above the table” shadow to those XBone press photos.

      • LionsPhil says:

        Afeared of a resurgence of the red-ring overheat woes, the Xbone now has enough cooling fans to function as a crude hovercraft.

        This also allows it to carry its kinect sensor into the bathroom to watch you shower.

  3. Jerion says:

    The Nvidia lineup is shifting upwards. I haven’t got any idea what they’ve opted to do with the rest of it, but I suspect they’ve chosen to simplify the lineup a bit and push performance as far as it can go within the constraints of the Kepler design. That way they can increase emphasis on the mid-range and high end while reducing the number of models in the lower brackets (since Intel can supposedly match them there with Iris Pro). Aside from the price hikes, it seems like exactly what was predicted when Intel started making significant generational strides in IGP performance.

  4. karaluh says:

    How about “Hard Choices: keyboard/mouse/gamepad?”

    • Kobest says:

      That would be nice! :)

    • Widthwood says:

      I know it highly depends on the country you’re in, but imho the best deal out there for a keyboard is the Logitech G710+. It’s one of the cheapest mechanical keyboards in itself, but also a great gaming keyboard with the usual Logitech profiles, macros, etc. Been using it for a couple of months now and I can hardly find any fault with it at all.

      Oh, and if you don’t need a numpad, extra buttons or in-key backlighting, and are OK with no-name Cherry Red-like switches, there’s the Zalman ZM-K500 / Rapoo V-7 for half the price.

      • godofdefeat says:

        Logitech’s gamepads are just horrible.
        Srsly though: the joysticks are terrible compared to the Xbox 360 ones.
        The buttons are somewhat on the same level.
        The only great thing about those Logitech gamepads is the possibility to switch between Direct mode (so that the gamepad is usable in older games) and X mode (for the newer games).
        And my ye olde Saitek gamepad works almost fine; it’s just that the second gamepad is sadly broken :(.

        • Widthwood says:

          I was talking about keyboards, not gamepads :) Btw their other gaming keyboards (rubber dome ones) are also kind of meh.

          • Wedge says:

            Buying a gaming keyboard that is not mechanical is the height of stupidity now anyways, considering the popularity of mechanical has brought them into practical price parity.

    • povu says:

      If you don’t need any fancy features and want something cheap but reliable this regular 360 pad is probably pretty good for playing stuff like Dark Souls on PC: http://www.amazon.co.uk/Microsoft-Xbox-Common-Controller-Windows/dp/B004JU0JSK/ref=cm_cr_pr_product_top

      There’s a wireless one for 10 pounds more.

    • Didden says:

      If you have one, you can also make your PlayStation 3 controller work perfectly with Windows and games: you need to plug in the USB cord and use the MotioninJoy drivers to make it all work. I prefer the layout of the PS3 controller to the Xbox one, though. Hate the fact the analog sticks are asymmetric.

      • iucounu says:

        You can, but my experience with the MotioninJoy drivers is that every now and again they spawn a browser window that opens on some Chinese site – which may or may not be dodgy, but is breathtakingly rude and unauthorised behaviour. I junked it as soon as I realised what was going on.

        • I Am Thermite says:

          Look up BetterDS3, it is an offline no-ads no-nonsense replacement for the DS3Tool management window thing. You still have to activate the motioninjoy drivers through the DS3Tool’s shady interface, but after that you can just use BetterDS3 to manage your settings.

    • Vorphalack says:

      The Roccat Isku is a good choice for a non-mechanical gaming keyboard. Feature-rich, good-looking, and it has the half-press laptop key feel, making it a breeze to type with. As a non-mechanical board it’s also significantly cheaper than most of its competition. Supposedly the trade-off is durability, but as someone who has been gaming for 17 years and still never worn out a keyboard, that hardly seems to matter.

      For mice, I like anything by Logitech. G400 or G500 are great for their price, and Logitech customer service is actually pretty good.

    • Chorltonwheelie says:

      Steelseries. Fifty squid in asda. Daft not to.

  5. Duke of Chutney says:

    i would like a look at mice, as i’m thinking of picking one up soonish.

    on the subject of gamepads, my experience is that you need one that is either an Xbox controller, or that your PC thinks is one. I have a Trust controller, which is a fake PS2/3 one. Games like Super Meat Boy and most other indies with controller support don’t recognise it, so i have to use software to fool my PC into thinking it’s a keyboard and mapping the buttons.

    • grundus says:

      I would agree, the 360 controller is basically all you need as far as gamepads go, but I would like to know if there are any particularly good controllers that aren’t seen by the OS as 360 controllers. I’d love to swap the guts of my 360 pad into something with better sticks (mine are all loose and shitty) and triggers that don’t pinch me. I’m fragile.

      • MasterDex says:

        If you like the general shape of the 360 controller, you’ll be pleased to know that you can get new analog sticks and triggers, etc and replace what you want. If you do a search on Ebay or Amazon, you should find what you’re looking for.

    • Kobest says:

      I second that. I’m using a Logitech F310 for my gamepad games (Super Meat Boy, Trials Evolution, Castle Crashers, etc.) and I find it good enough really. (On the back of the controller, you can switch between X360 or DirectInput)

      If I have a quick look at prices, a new one costs around 25 USD, but a X360 controller for Windows is around 30 bucks, so going with the latter might be a better option (better positioning of analog sticks, etc.).

      On the mouse side, I’m using a Gigabyte GM6880 for around 20 bucks nowadays. With a mousepad, it does the trick for me for many of the twitch shooters.

    • xavdeman says:

      Just get a Logitech G700S (http://gaming.logitech.com/en-us/product/g700s-rechargable-wireless-gaming-mouse). It’s the latest update of the Logitech G700. You can’t go wrong: you can use it in both wireless and wired mode, the dpi can be set to ludicrous heights, the software is adequate and profiles can be saved.
      They have other mice in the same series. You can also look at Razer mice. Be sure to get a good, large mousepad.
      As for a gamepad, I use the XBOX 360 Wired Controller, you can get PC drivers from Microsoft’s website. Almost all games (especially multiplatform ones) will support it, and change the button indicators (ABXY, triggers etc.) to the correct images during tutorials, or god forbid: Quick Time Events. No other controller can claim to do the same.

      • Widthwood says:

        Razer tends to be very overpriced for what they offer. A couple of years ago their sensors were better and they were the only ones with in-mouse profiles, but now imho the only reason to choose them is if you absolutely adore the shape of some particular model…

      • VelvetFistIronGlove says:

        Especially good is that the mouse itself acts as a USB keyboard, so any keybinds you’ve assigned to buttons and saved to the mouse will work on any operating system and in any game.

        I also really like that the mouse switches to wired signalling as soon as you plug it in. And since I’ve got two PCs on my desk, I have two of these mice: one mouse I use only wired, and the other I use only wirelessly. I have both wireless receivers in the wireless-mouse PC. And when the battery runs out, I unplug the one mouse and plug the other in instead, for just a few seconds of downtime.

      • cunningmunki says:

        I second the Logitech G700 (the ‘s’ is exactly the same, I’ve read, but with go-faster stripes). I’ve been through two of them and still haven’t seen anything I’d even consider as a replacement. They’re far from perfect; the battery life is poor and some of the keys aren’t very ergonomic; but it’s totally customisable and in terms of its price-to-buttons ratio, there’s none better.

        And don’t believe any of that ‘wired mice and keyboards are better for gaming’ nonsense. That hasn’t been true since the 90s.

    • Jupiah says:

      I’m a big fan of the Cyborg R.A.T 5. Ludicrously high DPI, 4 custom DPI settings, 6 customizable buttons, 3 different “modes” each of which can have their DPI and buttons customized separately which basically gives you 18 custom buttons, you can adjust the length and weight of the mouse to fit any hand comfortably, a cool “sniper button” that lowers DPI as long as it’s held, and it comes with some really good software. I bought mine 2 years ago for $50, and if you are willing to spend some more cash you can get the upgraded model that lets you adjust the shape of the mouse and swap out different grips, or get a wireless model.

      • boyspud says:

        I’ve had the R.A.T. 3 for about a year now. Even though it does have less functionality than the R.A.T. 5, I must admit I am quite pleased with it. Though to be fair, I can’t tell how much of that is actual appreciation of the quality of the mouse or the fact that I feel like I’m using a decepticon to interact with my rig.

  6. Krazy Gus says:

    Now Nintendo! Now! I know the wii u was just a smokescreen. Now drop that new super secret system and show them what console wars are all about. I know that’s not going to happen, but it would be cool.

  7. golem09 says:

    ok, so the prices suck. I planned to get a used 660 on ebay (and sell my 560) in about 2 months, and then a 760 next year on oculus launch (and selling the 660).
    Does that still sound like a good plan now?

    • fish99 says:

      The Oculus Rift is only 1280*800 (640*800 per eye), so it doesn’t need much GPU horsepower at all. Bear in mind my 660 can handle pretty much everything in 1920*1080 in stereo 3D (i.e. 1920*1080 per eye, four times as many pixels as the Rift). TBH I would get a 660 and skip the 760, since it’s the same chip as a 660, just presumably with a bit higher clocks. Or just stick with the 560 until you actually find something it can’t handle, it’s still a good card.
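
      Quick sums, if you want them (taking the dev kit’s 1280*800 panel and counting pixels per eye):

          rift_per_eye = 640 * 800             # 1280x800 panel split across two eyes
          stereo_1080p_per_eye = 1920 * 1080   # 3D Vision renders a full 1080p frame per eye

          print(stereo_1080p_per_eye / rift_per_eye)   # ~4.05, four times the pixels per eye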

      • MattM says:

        Aren’t production Oculuses going to have higher res screens than the prototypes?

        • fish99 says:

          I’ve read lots of people say it needs a higher res screen, but I haven’t read anywhere that it’s actually going to get one. Also whatever panel they put in there is still split between two eyes, so it’s still not a lot of pixels compared to, say, 3D Vision on a 1080p screen.

          I don’t know how demanding the other stuff lucian mentions is like the distortion effect.

      • lucian says:

        The Oculus is not that light on resources: it has to render everything twice, with at least one extra shader pass (distortion), slightly more intense scenes (it must use 3rd person models) and it is less tolerant of latency.

        • golem09 says:

          The vorpX driver will probably use a less performance-demanding way of generating the second image, which supposedly looks fairly good (that remains to be seen). According to them, the usual performance drop is something like 5-10%. So I hope this will be supported by Oculus games as well, maybe even as an option.
          What I’m concerned about is not running these games. I can run all games fine on my 560.
          But what I want to do is run them at a steady 60fps. I can’t do that with Skyrim + mods, and many other games also have more or less frequent dips in framerate. That is something I want to avoid. Which is why I planned on a 660 for the 720p Oculus, and a 760 for the 1080p Oculus.

        • fish99 says:

          When you say render everything twice, you mean once for each eye? Because I’d already taken that into account. Like I said, 3D Vision on a 1080p screen is 1920*1080 per eye, whereas the Rift is 640*800 per eye. I saw a video where they were testing the Rift and they definitely gave the impression it wasn’t taxing on your GPU.

  8. Wolle says:

    WTF’s a whale omelette and why would it be thick?

  9. jimmydean239 says:

    I was really excited for the 780, thought I would upgrade from my two 570s with this generation. Then I saw it was basically a cut-down version of the frankly ridiculous titan, with a price tag to boot. Enraged, I headed to my favourite hardware retailer with the intention of spending my hard-earned on AMD’s 7970 Ghz Edition (or whatever it’s called) in defiance of those green-tinged moneygrabbers. Then my credit card had an accident and I ended up buying a Titan and a 30 inch monitor. Bugger. This will be a difficult one to explain to the missus.

  10. Low Life says:

    And that makes me think the CPU cores are going to be partitioned. In other words, games are likely only going to have access to a limited number of cores.

    That does seem to be the case: http://kotaku.com/the-five-possible-states-of-xbox-one-games-are-strangel-509597078

    “six cpu cores [of eight], 90 percent of GPU processing power, and 5 GB [of eight] of memory.”

    If I’m not mistaken, the PS4 has an additional processor to handle at least some of the background stuff, so it might provide an even larger gap between their performance (Xbone already has a slightly less capable GPU).

    • Wisq says:

      “Save points can be a thing of the past on these consoles.”

      Hah. Hah hah. Hahhh.

      Are crashes also going to be a thing of the past? Or unwinnable game states (possibly due to bugs) that require you reset the game and go back to the previous save?

      Oh boy. I hope for Kotaku’s sake they were just paraphrasing the kind of PR crap that Sony and Microsoft had already been saying, rather than coming to that conclusion themselves.

    • marach says:

      Both systems have a whole host of custom chips in them. In the XBone we know of the ESRAM and the custom display chips for the HDMI-in line, and custom sound and video DSPs are in there too… tbh we don’t even know what can be offloaded from the APU in either system.

  11. Arkh says:

    Great article, thanks Mr. Laird!

    Currently looking forward to an upgrade, but I’d probably do better by waiting.

  12. Bremze says:

    I’m trying to put this nicely, but the console/Jaguar part of this article was just abysmal. No, you don’t dedicate a set amount of hardware to the VMs (why the hell even have the hypervisor if you do that); no, an idling stripped-down Win8 instance is not going to hog four cores; and no, communication between the modules is not going to take ungodly amounts of time, it’s not even an MCM configuration.

    The consoles won’t threaten fragile egos built upon overkill hardware, but then again they never have. They might do some cool things that general purpose PC hardware might pick up later down the road, and generally punch above their specs, but that’s pretty much it.

    • FlowState says:

      I’m sorry, but I disagree. First off: unless there’s evidence that MS have started using a hypervisor worth a damn, their load balancing is going to be crap. Secondly, they’ve already stated that the RAM is going to be split, with 5GB to games and 3GB to the OS processing. If their horrid threading from Windows is any evidence, this split processing from the CPU side should be anything but performant.

      Your statement that consoles have never been about awesome hardware is true. This has been traditionally true because the ‘OS’ had far fewer maintenance tasks than a PC of the same era. Given what we’ve seen from the XBone, I’m not sure this is the case anymore.

      Everyone keeps talking about how there are going to be all these advancements in AI and whatever. Any programmer knows that’s a bunch of bollocks. As resource gets cheaper, most programming just gets lazier.

  13. Ein0r says:

    “If that’s true…”
    Let’s find out when more details come to light and compare to even more advanced hardware available by then.

  14. Wisq says:

    Seems weird that nVidia would want to avoid people using their card for number crunching. Surely that would just mean more money for them?

    And more money means they offset their R&D costs faster, they reduce manufacturing costs by making more cards, they can either pass on the savings and sell even more cards or just take it all for themselves, etc.

    • Lord Custard Smingleigh says:

      Because I suspect any day now they will release an ungimped £5,000 version that will be bought by, say, university bioinformatics departments.

      • Wisq says:

        Ah, okay. So I guess our days of cheap bitcoin mining are going away, then. Not that it was particularly profitable any more anyway, compared to electricity costs.

      • Morzak says:

        Uhm, because the ungimped version already exists and is called Titan? I don’t like their pricing, but c’mon.

    • bsplines says:

      Nvidia have released the Tesla series for that particular reason and it costs at least 3 times as much as the equivalent GeForce card. So they want you (well, companies and universities) buying these cards and not the cheapest consumer versions.

  15. mad monkey says:

    Well, according to the DF analysis (Killzone postmortem), the PS4 will be able to use at least 6 CPU cores for gaming. Very interesting read: http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall. RAM is another interesting thing: we know that the Xbox is capped at 5 gigs of RAM for actual game usage, but the PS4 might not be. What the DF article shows is that even a first gen game utilises ~4.3 gigs of vram (which is ridiculously massive), so if we’re going to make any predictions, I’d guess it’s a safe bet that the available vram on GPUs is going to incrementally increase to 4 to 8 gigs pretty soon.

  16. Colej_uk says:

    I think the ‘not having to upgrade PCs for the duration of this console cycle’ thing is a bit optimistic.

    If you try to run modern current gen console ports on a PC with hardware comparable to what’s in the 360 then you’re not gonna have a fun time. I bought a gaming laptop in 2005 with comparable specs to the 360 in terms of performance (GeForce 7900); by 2008 it was struggling to run most modern games. By 2010 there were only a handful of new releases it could run properly.

    I get that they are using x86 now, which may help a little, but it’s still the case that the main advantage of a console’s hardware is that games designed for them can be optimized to oblivion to squeeze performance out of their hardware. You can’t achieve that level of optimization on PC, even with really decent ports. We’re still gonna have to upgrade a couple of times this console cycle I reckon.

    • TormDK says:

      I doubt that very much.

      If you have any Ivy Bridge CPU currently, you won’t be needing an upgrade anytime soon. We (the glorious PC master race) have not been CPU-capped for quite some time, so I very much doubt we will be because the next gen consoles went with some crap AMD solution.

      Hopefully, with it being an x86 platform, we will be seeing better and faster ports to the PC. It’ll be way easier for the devs to manage the ports, and we’ll be seeing DX11.1 ports because the consoles run that as standard anyway.

      So lesson learned – if you’re not using a DX11.1 card yet, consider starting to save up for a new rig so you can be ready once the next gen consoles land.

      • Colej_uk says:

        I agree, I don’t think those with i5s or 7s will have to upgrade for a long time. But to say we won’t have to upgrade for the duration of the entire console cycle? I think those with powerful systems today may make it half way (5 years) at a push.

        Us PC gamers were marvelling at our powerful new 64-bit AMDs and P4s with DirectX 9 cards back in 2005, but no serious PC gamer is still wielding one of those.

    • Christo4 says:

      That is because they aren’t optimised for the PC. For GTA 4 at launch you could have had a Core i7 and it still stuttered; only after several patches did it work properly.

    • blackmyron says:

      You have to wonder if the awful port was almost deliberate on Rockstar’s part, or if it was a blatant attempt to release an unfinished version early, as suggested by the ridiculously sized “patch” they released.

      And I’m sure GTA5 will be the exact same thing. There’s no way Rockstar is not going to try to get some PC money for that game, but that they’re still acting like they won’t is absurd.

    • fish99 says:

      What resolution is your laptop? Bear in mind some games on 360/PS3 don’t even run at 720p. Halo 3 (on 360) and GTA4 (on PS3) render at a lower res and upscale. Also your laptop has a 7900*M*. And what CPU does it have?

      • Colej_uk says:

        It may have had an M GPU, but the 360’s GPU was closer to a previous gen 7800 anyway. The CPU was a (high end at the time) Intel Core Duo. It also had 1GB of RAM, twice the amount of the 360. The resolution was 1440×900. Even dropping down the resolution to 720p or so, it was struggling with modern games by 2010.

        • Vorphalack says:

          ”It also had 1Gb of ram”

          Vista and Windows 7 use about that much RAM just for the OS. No wonder that old thing was struggling, 4 gigs has been considered a baseline for gaming PCs for years now.

          • Colej_uk says:

            I was running XP on it back then as it was before Vista, but I should say it came with 1GB, which was still considered fine for gaming back in 2005. I did later upgrade to 2GB, but it just couldn’t handle UE3 games very well at all, and UE3 became one of the staples of the current gen.

            But this kinda demonstrates my point really, we can’t properly compare PC specs to a console. The 360 has 512MB of RAM, and that amount was pretty much unthinkable on a gaming PC even by 2006.

          • fish99 says:

            That’s surprising because the 8600M GT in my Vostro 1500 (with T8300 and 3GB ram) can very nearly handle UE3 games, and a 7900 Go GTX is about twice as quick.

  17. Morzak says:

    The first part was just bad… Keep ignoring that the main problem of the current gen consoles is the RAM, which both new consoles have addressed; even 5GB should be enough to push 1080p textures and allow for way bigger areas and more interactivity. And as long as console games go in that direction, it will help a lot.

    • LionsPhil says:

      Yeah, I am VERY happy to see 8GB on both. That half-gig limit has done horrible things to the size of areas and range of interactive simulation around the protagonist in cross-platform games.

    • andytt66 says:

      Waitaminute, hold on. I haven’t been following much about the latest consoles, but surely it’s going to be running at a higher res than 1080p?

      *sound effect of furious googling*

      Oh for god’s sake. THAT is the roadmap for the next generation of consoles?

      • fish99 says:

        Well the TVs they’re going to run on are all 1080p (or lower). I know there’s higher res screens on the way, but I suspect uptake will be very slow with no media at that res to take advantage of them.

  18. f69 says:

    You should have talked about what GDDR5 and fast access shared CPU/GPU memory means for PC gaming. I’ve seen it said that this is where PCs are at a disadvantage.

    This is what I want to know. We already know the CPU and GPU, viewed on their own, are better on PCs.

    • Rufustan says:

      Not fully up on this, but the memory on both consoles is a bit of a compromise as everything else is. DDR3 has the low latency that the CPU needs, and DDR5 has the bandwidth needed in graphical stuff. A top end PC is going to have 8 Gig+ of DDR3 dedicated to the CPU, and 2-3 Gig of DDR5 built into the card.

      As I read it, the new Xbox is going to suffer because it only has DDR3, which is going to hurt it for gaming. The PS4 could get an advantage over PCs because it can assign more DDR5 to graphics than any current card has (unless you are using a Titan). How much of an impact in either case; who knows?

      If it is a major issue, whats the odds that the next generation of Both AMD and Nvidia cards suddenly have 5-6 gig (or more) of Ram built in?

      • Vandelay says:

        I’m not really into this tech talk, nor do I fully understand it all, but remember that it is “G”DDR5 that is in the PS4. As I understand it, this is basically DDR3 optimised for graphics. There is no such thing as DDR5 yet, although there will soon be PCs with DDR4.

        Where I start losing understanding is what the actual difference between DDR3 and GDDR5 is. The other thing I don’t get is what is the PS4 using all of it for? At the moment, we are playing games on our PCs and barely filling up 2GB of GDDR RAM. I would have thought, even when you are mainly focusing on games, the majority of RAM usage would be better suited to regular DDR3. Is GDDR5 primarily used for storing textures?

        • Rufustan says:

          As far as I can tell, GDDR5 is DDR3 optimised for what graphics cards need, which is Bandwidth — pushing huge amounts of data in and out of RAM as fast as possible.

          The graphics RAM is used for, amongst other things, the frame buffer – how much data you can put into a single image, which limits resolution and textures. If the PS4 can use 5 gig, then it could use better textures than a PC card can handle. (As opposed to the reverse right now.)

          The price of GDDR5’s setup is that it has really high latency, which the graphics card doesn’t much care about. Standard DDR3 needs low latency, so that the CPU can find and access each memory location as fast as possible, which is what the processor needs.
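
          If it helps, peak bandwidth is just bus width times transfer rate. Plugging in the widely reported (not yet fully confirmed) figures of 256-bit buses on both, GDDR5 at 5.5GT/s for the PS4 and DDR3-2133 for the Xbox:

              def peak_gb_per_sec(bus_bits, transfers_per_sec):
                  return bus_bits / 8 * transfers_per_sec / 1e9  # bytes per transfer x rate

              print(peak_gb_per_sec(256, 5.5e9))    # PS4 GDDR5: 176 GB/s
              print(peak_gb_per_sec(256, 2.133e9))  # Xbox One DDR3-2133: ~68 GB/s (plus its ESRAM)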

  19. Iskariot says:

    I have a hard time believing the AMD A4-5000 chip, with its mentioned shortcomings, will be the one in the next gen consoles. It would be such a stupefyingly stupid move that I won’t even try to comprehend it.
    That means my current PC already outclasses it, and my new PC, which I will buy this year, will annihilate it. Incomprehensible if this were true. I have no words…

  20. ScubaMonster says:

    Nobody even needs a 700 series GPU for gaming anyway, it’s overkill. It’s just for elite power users who want the biggest and best of everything whether it has a practical use or not.

    • Iskariot says:

      That is what some said about the 200, 300, 400, 500 and 600 series too. Nevertheless we already see games that can even tax a Titan, like Metro Last Light.
      I own a GTX295 now and my next card will probably be a GTX780. I think it will be a great upgrade that will secure me good gaming at high settings and resolutions for the next 5 years.

      • Vorphalack says:

      I think it’s important to remember that what you “need” is not what is most optimal. A relatively cheap pre-overclocked 460 will game just about everything on the market right now. You might have to compromise quality on the most demanding games, but not by much. Staying behind the curve is quite viable, and with what they are charging for the 700 series cards it will also be cheaper than trying to future-proof. The “early adoption tax” for that generation is just too high.

        • MattM says:

          I think you are underestimating the difference between a 460 and a 780.
          http://www.guru3d.com/articles_pages/geforce_gtx_780_review,19.html
          The 780 is 3.5x faster than the 460. It’s a luxury, but so is any graphics card.

          • Vorphalack says:

            That really isn’t the point at all. Retailing at over £500, the 780 will be about 5 times as expensive as a pre-overclocked 460. By the time it is released, you can substitute 460 with 560, and the performance gap will be even smaller. An overclocked 560 will probably be fine for at least 2 or 3 years, by which point the 600 series cards will be down in price.

            When people talked about future proofing in the past, it meant buying a more powerful, expensive component now to save money on more frequent, less powerful upgrades over time. Buying a 780 now will future proof you on performance, but you will lose out financially compared to someone who stays behind the curve. As no one is going to need a 780 for gaming for years to come, I don’t think it’s a particularly sound investment at this price.
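
            To put rough numbers on it, using MattM’s 3.5x figure and guessing a pre-overclocked 460 at around £110 against the 780’s £550:

                speedup   = 3.5   # MattM's figure, 780 vs 460
                price_460 = 110   # rough price of a pre-overclocked 460 in GBP (my guess)
                price_780 = 550   # the 780's launch price in GBP

                # performance per pound, relative to the 460
                print(speedup / (price_780 / price_460))  # ~0.7, i.e. less frame-rate per pound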

          • MattM says:

            I think you can get a significantly better experience in games already out (Far Cry 3, Battlefield 3, Witcher 2, even old Crysis) by choosing a more expensive card. If you want to get 1080p @ 60fps with the nice lighting and shader effects then a 460 isn’t going to be enough in many games. I agree that buying for the future usually doesn’t pay off; for instance you really won’t get better gaming performance from a $600 CPU vs the $220 i5-3570K. Even if games take advantage of 6 or 8 cores in the future, it’s probably a good idea to wait to get an 8-core until it’s a bit better supported.
            There is nothing wrong with choosing a budget card, but more expensive cards have real benefits.

    • Heighnub says:

      I take it you don’t have a 30″ display.

    • Audiocide says:

      You only need about 2000 calories in a day, and you don’t ever need any vodka or chocolate.

      The good thing is that we can actually choose to enjoy a couple of treats every now and then.

  21. Rufustan says:

    The thing that really worries me about the consoles, as we get more details about their hardware, is what is this going to do (or not do) for PC hardware development?

    Intel has been giving out small incremental increases in CPU power for a few years, largely because they don’t need to do anything more to stay ahead of the game.

    Have AMD or Nvidia made any big changes for a while in graphics cards? AMD simply seem to be re-branding old chips, and as highlighted above Nvidia are just modifying old tech.

    It’s nice that PC tech is stable, but you get the feeling that the new consoles would drive some innovation so that the PC market would have to catch up. From the details released so far, I guess I will probably have to finally upgrade my graphics card, but to something off the shelf now (GTX 660/70 or HD 7870/7950). I’d guess an i5 2500K will probably cope with anything designed for these bits of hardware.

    If all the hype – that these consoles can far outgun any high-end PC (which looks increasingly unlikely) – turns out to be untrue, what happens with PC hardware for the next few years?

  22. fish99 says:

    I wonder how Sony are going to bring Planetside 2 to the PS4 with that CPU when the game needs a small number of very very fast cores and can barely take advantage of 3 cores, let alone 8. They haven’t shown any evidence they can optimize PS2 on PC for more cores. It’s gonna run 15 fps in big battles with that CPU (since it runs 30fps in big battles on a 3570K).

  23. Kinth says:

    Think I’m just going to get a 680 and be done with it. The price will drop soon with the 700 series coming out and it runs those snazzy Unreal 4 demos well enough. Plus the new consoles look to be pretty weak, so it doesn’t look like we will be needing a whole lot of extra power for the “next gen” games.

    It looks like both Nvidia and AMD have hit the wall in terms of power leaps for a while. Kind of like when the 8000 series came out and everything after it for a while was just a rebrand with a tiny bit more kick.

  24. honky mcgee says:

    I’ll just leave this here…

    http://www.youtube.com/watch?v=SyD1VlQdj2s

  25. MattM says:

    Every GTX 780 review ended with a huge wink about another impending GPU release. It is almost certainly the GTX 770 and the sites probably already have the review samples in hand. They indicated that the embargo lifts less than a week from now. Given Nvidia’s pricing for the 780 it doesn’t seem likely that the price/performance of the 770 will be amazing, but it’s probably worth waiting a week to see where it falls if you are considering an upper-end GPU purchase.
    The 780’s price is disappointing. The 670 and 680 were good values on release but didn’t fall much from their launch prices in the following year. AMD’s offerings dropped prices severely when the 600s launched and driver updates substantially improved their performance. This time I was hoping for more: if you bought a year ago you got the same price/performance as today. Hopefully the 770 and 780 will receive some rapid price drops.

  27. Liudeius says:

    I…
    I don’t know what to think.

    On one hand, it’s looking like my current laptop will be able to play next gen console games fine, but on the other, my year-old LAPTOP is better than the next decade’s console.

  28. aircool says:

    Console games haven’t really progressed since the PS1. The graphics are better, but the games are still the same, car games, sports games and F/TPS’s.

    How old is WoW now? Almost ten years? Very popular, made tons of money and still represents the MMO genre to those outside of gaming, yet nothing like this has ever appeared on a console.

    What am I playing at the moment? Planetside 2. How much does it cost? Nothing. How awesome is it? Awesome +1. I even get to play alongside RPS members, several hundred at a time on occasion.

    And why would I want to play videogames on my TV? How can I watch the football/cricket at the same time as playing games with a console?

    I’ve bought and tried every Playstation and XBOX. Not worth the money. My PS3 is used as a DVD player for the TV, and was taken with me on holiday last year so I could play XCOM.

    Now I feel like doing something outrageous that only a PC gamer can do… go and play FTL for a few hours :)

    • Liudeius says:

      I disagree. I’ve only bought the Sony consoles, so I can’t comment on the others, but the PS1 and on into the PS2 era had many JRPGs. The late PS1 and all of the PS2 era had lots of third person platformers. The PS3 has mostly first or third person shooters as its big titles.
      The games have also gotten far more automated, not that it’s a good thing, just a change. (Compare AssCreed with Sly, or Uncharted with Jak.)

      Gaming has changed; it’s more about what the industry thinks people want than about consoles being restrictive.
      If anything, I would say console gaming has seen a significantly expanded breadth of games.
      We have the Wii/U with plenty of casual and party games.
      The One as an exclusively bro console.
      and the PS4 as core console.
      Some big name indies were even on console before PC. (Bastion and Super Meat Boy, unless I’m mistaken.)

      True, most big titles follow similar roots, but you’re not about to spend 60 million dollars developing FTL, instead you would develop something big and 3D.
      (I don’t know if PlanetSide2 is the best choice for displaying PC innovation. It’s just another FPS, consoles simply aren’t capable of running such a huge open world.)

      • MattM says:

        I would be down for a $20 million version of FTL.

        • Liudeius says:

          But my point is that for $20 million it would cease to be FTL. It may possess some core mechanics of FTL, but it would have much better graphics, much more content, and probably wouldn’t be focused on replaying (I refuse to use that compound word).

          Can you really think of a way you could invest $20 million more in development of FTL without significantly changing it? Even if you did just add more ships, races, events, and weapons, with $20 million worth, it would probably be too bloated to be as enjoyable as it can be.

          There are certain things which you work towards when you have more money (orchestral music, cutting edge graphics, massive scale) and certain things you work towards when you don’t have lots of money (memorable music, unique graphics, short-but-entertaining or replayable).

          And really, if you look at many indie titles, they copy off of each other as much as big names, they just have a different goal since they have less money. Look at all the indie puzzle-platformers.

  29. Shadowcat says:

    So the new consoles are all powered by Atari Jaguars? Who knew.

  30. waltC says:

    I liked the picture accompanying this article a lot–because I’ve got hands just like that and I wanted to ask you (presumably the hands belong to someone @RPS) what you do about those unsightly callouses on your knuckles? I use lentil seed wax and camphor oil, followed by a vigorous application of fire and number #3 sandpaper–but I still can’t get the smooth, creamy skin I want there. I’m at my wit’s end. Can you help me?

    Thanks.

  31. SuicideKing says:

    Epic! I was thinking the exact same thing about the cores being divided b/w games and all the social stuff! Especially the Xbox, running two operating systems and a hypervisor.

    It’s actually pretty brilliant that this is one of the first points you made since no hardware site seems to be talking about that possibility.

    I started thinking along those lines primarily because of the performance hit that’s incurred b/w modules.

  32. SuicideKing says:

    Two things: the rest of the 700 series are simply rebrands, with the 770=680 (tweaked a bit), 760 Ti=670, and so on.

    Also about Haswell: for the desktop, while it’ll be useless as an upgrade for Sandy/Ivy owners, for everyone still using a Core 2 Quad/Duo or Phenom II x6/x4 (or weaker than these), it’ll be a pretty big jump (100% or more in synthetics).

    I still play FreeSpace 2 (with the open source improvements) and for a mission in which I get 30 to 50 fps (because of a CPU bottleneck, FS2 Open is single threaded), my friend’s 3570K gets a solid 120 fps (it’s capped at that).

  33. HisDivineOrder says:

    When AMD was first to launch, they launched at $550 and the performance at the time was barely 20% ahead of what the Geforce 580 was putting out, but with better performance per watt. For that, AMD decided a $50 premium over the 580 pricing of $500 was warranted.

    Gleeful, nVidia found that AMD’s performance part was only equal to their mainstream part, their 560 Ti replacement. So nVidia did one of the two things nVidia does best, quickly changed the up-till-then rumored 660 Ti into a 680 and focused all their resources on making “Big Kepler” (aka GK110) into a part for the Titan cluster only for the whole of last year. This let them charge $500 for a part designed and fabbed to be sold for $300, which means every chip sold was a huge profit. Plus, we welcomed it because it was a price drop from what AMD wanted for their higher heat, higher noise part. And it took AMD six months to release a decent driver. Hell, their launch driver didn’t even support Crossfire.

    Fast forward to today and AMD isn’t showing up at all with a new part. nVidia gets to basically win by default. Sure, there are some outstanding bundles out there, but AMD’s given up on the high end performance race, so nVidia gets to again do that other thing they do best. Charge whatever they like.

    Not since the days of the Geforce 280 and the 8800 Ultra have nVidia been free to price their GPU’s as they see fit and at last AMD has fallen behind again. But nVidia doesn’t want the bad press from the 7970 launch where people complained bitterly about the price almost universally, so what to do?

    Release a $1k part that isn’t part of the main line. Wait a month-ish, then release a “value” version of that not-numbered part, but number this one and mark it down a few hundred from the ridiculous one. Even though it’s a solid $150 over the last high end part’s pricing, it’s still hundreds less than that other part people were lusting over for just over a month.

    Bam, nVidia’s made the price point $150 higher than before and SOME people are actually calling THAT a deal.

    nVidia’s not a GPU company. They’re a marketing company, through and through.

  34. Artamentix says:

    You are automatically making the assumption that your laptop’s chip is very, very similar in power to the custom chips in the PS4 or Xbox One, which it isn’t if you go by the raw FLOPS they quote: “But here’s the thing: AMD’s current top-end APU only delivers around 700 GFLOPs of compute power from its CPU and GPU combined. We’re told the PS4’s processor delivers nearly 2 TFLOPs from its GPU alone.”

  35. Statix says:

    This article is making the assumption that HALF of the 8 Jaguar cores in the PS4 will be devoted purely to the OS, which is a monumentally stupid assumption. At most, it will be 1 or 2 cores that won’t be used for games; most likely, developers will be able to use all 8 cores, but simply have to allow the OS to take over a core or 2 only when it needs to.

    This article is also ignoring the fact that the PS4’s CPU and GPU are on a single die, with 8GB of GDDR5 unified memory, which eliminates a ton of bottlenecks that are inherent in standard PC systems.

    Let’s not also mention the fact that consoles are closed-architecture platforms, which allows for incredible amounts of optimization to be done, rendering it basically impossible to compare two given components on a raw-power basis alone. The 8-core APU present in the PS4 is going to be able to produce far greater performance in actual games dedicated for the console than an equivalent PC part (although an 8-core APU with unified 8GB of GDDR5 doesn’t actually exist for PC, and will likely never exist). Just as an example, take a look at the absolutely ANCIENT PS3 hardware, which was released back in 2006. The PS3 is only equipped with an ancient Nvidia 7800GTX and 512MB of TOTAL DDR3 RAM. An abominably outdated GPU, right? Yet through extreme optimization and hardware utilization, bypassing any OS or DirectX overheads and coding directly to the metal, this old-as-dirt PS3 can achieve graphics fidelity that you wouldn’t even dare to think of–just take a look at beautiful games like The Last of Us, Uncharted 3, Killzone 3, and Infamous 2. Try to see if you can get Battlefield 3 or Metro: Last Light to look that good–let alone run–on a 7800GTX and 512MB of total memory.
