Week in Tech: Will It Play Crysis, & More On PS4

By Jeremy Laird on March 13th, 2013 at 9:00 pm.

Ahhh, the quest for PC games with graphics so good, they look pre-rendered. When will it end? Actually, it’ll end when PCs are capable of graphics that look pre-rendered. It’s going to happen. And say what you want about the gameplay or narrative, but Crysis 3 is a reminder that we’re getting ever closer. Think it’s also worth a recap of the Sony PS4 launch now that its beating, PC-derived heart has been officially revealed, so I’ll tell you why I’m increasingly convinced it’s good news for the PC.

Right, will it play Crysis? The mere mention has my heart sinking. When Crysis first came out, it was breathtaking to look at. No question. I played it for at least two hours, just for the visuals.

Doesn’t sound like much, but it’s pretty much a record in terms of my tolerance for tedious shooting galleries. So, Crysis wasn’t fun. It wasn’t even very well coded. And the more you looked at it, the more the visuals felt a bit unfinished.

Ultimately, it was a pants metric of PC performance. But it became the gold standard and for years it was unrivalled for both absolute visual fidelity and homicidal levels of GPU abuse. When you think about how fast PC tech can change, it was a deeply impressive run in many ways.

Crysis 2 was a much more consoley affair. Way more polished in every regard, but dumbed down and by at least some measures, actually less detailed and less capable in graphics terms.

Now there’s Crysis 3 and, dare I say, brief acquaintance suggests it might just be the best of both worlds. And like the eye-candy crack whore I so surely am, I find myself sucked into playing it just for the goggle factor.

I’m bored already, obviously. Don’t ask me what I think of Crysis 3’s gameplay. I’m barely aware of it and definitely don’t care. And it doesn’t actually pass the pre-rendered test overall. Not even close, to be honest.


Dead behind the eyes: Pixar would be proud

But, my science, it looks good. And, critically, there are elements that for fleeting moments look like the real Pixar deal. You catch a glimpse of something, maybe a face on a character, and for an instant it really does feel pre-rendered.

Needless to say, I can hardly ask you to unload £40 just to look at the pretty shaders for a few hours. But I’m a hardware guy at heart and I love seeing games moving on in terms of graphics quality. If you can find a way of seeing it run on a powerful PC, do. Those video caps just aren’t the same.

And so the PS4

As for the Sony PS4, the official launch has confirmed that the leaked specs we discussed yestermonth turned out to be bang on the money. We’re talking eight AMD Jaguar cores, 1,152 GCN-style AMD graphics shaders and 8GB of actually pretty impressively quick memory.

Technically speaking, two new interesting details emerge. For starters, they’ve stuffed the whole shebang (well, not the memory, but the CPU and graphics) into a single chip.

I’m not sure if it fully qualifies as a system on a chip. But it’s one hell of a CPU-GPU fusion processor and far more powerful in terms of graphics than any combined processor you can buy for the PC.


PlayStation 4: It’s a Radeon HD 7850, but not as we know it

Then there’s the full shared memory space with 176GB/s of bandwidth shared between the two processor groups, so to speak.
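(For the curious, that 176GB/s figure drops straight out of the memory interface arithmetic, assuming the widely reported 256-bit GDDR5 interface running at an effective 5.5GT/s. If either of those assumed numbers turns out to be off, so is the sum:

    256 bits / 8 = 32 bytes per transfer
    32 bytes x 5.5 billion transfers per second = 176GB/s

For comparison, a dual-channel DDR3-1600 PC setup manages about 25.6GB/s.)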

In many regards, it’s a glimpse at what PC processors will look like in a few years time. Fusion chips like this will eventually replace the existing discrete CPU-and-GPU model, even in high performance PCs.

The shared memory space answers one of the big questions that design raises, which is the problem of bandwidth on fusion chips. GPUs need tonnes of it. CPU memory controllers aren’t nearly good enough. It’s also a reminder that heterogeneous computing is on the way to the PC.

Sounds like bullshit, but it really just means having a big chip subdivided into areas of specialisation in terms of workload but at the same time integrated in terms of programming and memory space.

Ideally, what you want is to make some code, shove it on the chip and have it automatically run on the most effective and efficient part or parts. You don’t want to muck about making special code paths, which is effectively what game devs do today for GPUs.
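To make that concrete, here’s a deliberately silly C++ sketch of the difference. The fake_gpu calls are entirely made up and stubbed out so the thing actually compiles; they stand in for the real OpenCL/CUDA/Direct3D plumbing a dev has to write today:

    // A rough sketch, not anyone's real engine or SDK. The fake_gpu calls are
    // hypothetical stand-ins for today's explicit GPU APIs, stubbed out so
    // this compiles and runs as plain C++.
    #include <cstddef>
    #include <cstring>
    #include <vector>

    namespace fake_gpu {
        // Pretend "device memory": on a discrete card this sits on the far
        // side of the PCI Express bus, so every copy costs time.
        std::vector<unsigned char> device_pool;

        float* alloc(std::size_t bytes) {
            device_pool.resize(bytes);
            return reinterpret_cast<float*>(device_pool.data());
        }
        void copy_in(float* dst, const float* src, std::size_t bytes)  { std::memcpy(dst, src, bytes); }
        void copy_out(float* dst, const float* src, std::size_t bytes) { std::memcpy(dst, src, bytes); }
        void run_kernel(float* data, std::size_t n) {
            for (std::size_t i = 0; i < n; ++i) data[i] *= 2.0f; // stand-in for "GPU work"
        }
    }

    // Today's special GPU code path: allocate, copy over, run, copy back.
    void process_discrete(std::vector<float>& particles) {
        const std::size_t bytes = particles.size() * sizeof(float);
        float* dev = fake_gpu::alloc(bytes);
        fake_gpu::copy_in(dev, particles.data(), bytes);   // CPU -> GPU
        fake_gpu::run_kernel(dev, particles.size());
        fake_gpu::copy_out(particles.data(), dev, bytes);  // GPU -> CPU
    }

    // The shared-memory ideal: both processors see the same buffer, so the
    // "kernel" just runs on the data where it already lives.
    void process_unified(std::vector<float>& particles) {
        fake_gpu::run_kernel(particles.data(), particles.size());
    }

    int main() {
        std::vector<float> particles(1024, 1.0f);
        process_discrete(particles);
        process_unified(particles);
        return 0;
    }

The point isn’t the syntax. It’s that the first function has to budget for shunting data back and forth over PCI Express, while the second just points whichever processor is best suited at memory that’s already shared.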

It should make for more efficient and ultimately even more powerful and capable systems. I think we’re a long way off true heterogeneous computing, but the PS4’s design is a reminder that it’s on the way.

The other part of the equation is the impact the PS4 will have on PC gaming in the shorter term. From what I understand, the fact that both PS4 and the next MS Xbox are PC based in terms of chips means that from here on in, most of the big games devs will work on PCs and then hit recompile buttons for the two consoles.


Fusion in a box

It’s more complex than that, but the point is that the PC makes a very natural focal point for games development now that the consoles are based on pure PC hardware. That has to be a good thing for PC gaming.
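Purely to illustrate the ‘recompile button’ idea (the platform defines below are invented for this example, not anything from a real console SDK), the shared C++ stays identical and only thin platform layers switch over:

    // Illustrative only: TARGET_CONSOLE_A / TARGET_CONSOLE_B are made-up
    // build flags, not real SDK defines.
    #include <cstdio>

    #if defined(TARGET_CONSOLE_A)
    static const char* kRenderBackend = "console A graphics API";
    #elif defined(TARGET_CONSOLE_B)
    static const char* kRenderBackend = "console B graphics API";
    #else
    static const char* kRenderBackend = "PC (Direct3D / OpenGL)";
    #endif

    // Simulation, AI, scripting and so on are the same x86 C++ everywhere;
    // only thin layers like the renderer binding change per target.
    int main() {
        std::printf("Building against: %s\n", kRenderBackend);
        return 0;
    }

In reality those platform layers are a lot thicker than one string, which is the ‘more complex than that’ bit.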

On a final note, I wanted to revisit those AMD Jaguar cores in the PS4. Like I said last time, they’re horrible. Clock for clock, they probably do, I dunno, less than a third the work of Intel’s Ivy Bridge CPU cores. And at 1.6GHz, they’re clocked at roughly half the speed, too.

Not to put too fine a point on it, but do the maths and single thread performance is going to blow goats. That upset me before. But I’m over it now. There are eight of them. And if devs pull their fingers out, that’s probably enough to get some decent work done.
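To put rough numbers on that, and assuming my one-third-per-clock guess is even in the right postcode:

    ~1/3 the per-clock work x ~1/2 the clockspeed ≈ 1/6 of an Ivy Bridge core per Jaguar core
    8 Jaguar cores x ~1/6 ≈ 1.3 Ivy Bridge cores’ worth of total grunt

And that total only turns up if the work actually spreads across all eight cores, which brings us neatly to the multi-threading problem.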

What’s more, it should help even things out for AMD. Its Bulldozer CPU design sucks at single-thread but is much more competitive at multi-thread. So if game devs really do crack the multi-thread problem, it will make AMD processors much more attractive for gaming PCs. In a word, yay.
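For the non-coders, ‘cracking the multi-thread problem’ boils down to structuring game work so it can be carved up across every core available. A toy C++ sketch, nothing to do with any real engine:

    #include <algorithm>
    #include <cstddef>
    #include <thread>
    #include <vector>

    // Toy example: update a big batch of entity positions across all
    // available cores rather than hammering a single thread.
    void update_entities(std::vector<float>& positions, float dt) {
        const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
        const std::size_t chunk = (positions.size() + cores - 1) / cores;
        std::vector<std::thread> workers;

        for (unsigned c = 0; c < cores; ++c) {
            const std::size_t begin = c * chunk;
            const std::size_t end = std::min(positions.size(), begin + chunk);
            workers.emplace_back([&positions, dt, begin, end] {
                for (std::size_t i = begin; i < end; ++i)
                    positions[i] += dt; // stand-in for real per-entity work
            });
        }
        for (auto& w : workers) w.join();
    }

    int main() {
        std::vector<float> positions(100000, 0.0f);
        update_entities(positions, 0.016f); // one 60fps-ish frame's worth
        return 0;
    }

Real engines juggle far messier dependencies between tasks than this, which is exactly why it’s still a ‘problem’.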

Whether all this will happen in time to help AMD survive, I don’t know. But it certainly won’t hurt.


96 Comments »

  1. rockman29 says:

    Yay PlayStation 4 :):):)

    • int says:

      Just you wait for PC2!

      • Cinek says:

        PC2 is what we have now. Wait for PC3! It’ll blow PS4 out of the water, not just defeat it like the PC2.
        ;)

    • Torn says:

      If anyone hasn’t read the following breakdown of the PS4 architecture, now would be a good time: http://www.bradfordtaylor.com/insert-blank-press-start/ps4-vs-the-great-discord/

      The APU combined with that amount of super fast RAM is going to be a big deal ™, and it’s interesting to see people comparing things like clock speeds to their own PCs (I mean, GHz as a metric of how strong a CPU is hasn’t been relevant since Pentium 4).

      Comparing clock speeds or amount of RAM (hey, I have 8GB too!) is pretty useless when the architecture in the ps4 is different enough from current PCs, especially as it’s doing away with slow buses like PCI-E.

      We’ll have to see how it turns out of course, but I think the PS4 is going to give even high-end gaming PCs built in 2013 a run for their money in terms of output devs will be able to squeeze out of the system.

      • frightlever says:

        That’s nothing new. You start out with the consoles having games as good or better-looking than top-end PC and then over time the consoles start to lose their shine. Same old same old.

        EDIT: just finished the article. Definitely worth a read, but I still don’t see anything game-changing in the PS4.

        • rockman29 says:

          The most significant stuff in the PS4 presentation was not the hardware itself, I thought (except the 8 GB GDDR5, which is apparently a big deal, even just from an engineering standpoint, like no one thought it was possible quite yet).

          What was probably most “game-changing” (not that I think there is any certainty yet to its success) is the Gaikai streaming of games.

          Also the other neato feature was the instant resume (I think games saved into RAM in a low power state).

          I think the key advantages for consoles as time goes forward are not hardware, but software implementation.

          Like how phones are so rapid to access content, compared to a computer. While I love my PC for being modular, being PC, and letting me play Red Alert in HD whenever the heck I want… the closed aspect of consoles seems to be drawing on some unconventional advantages that are broader than the literal hardware you purchase in stores.

          In other words…. I am rather excited for PS4 :)

        • Sparkasaurusmex says:

          I wonder how long until PS4 is emulated in my computer

      • Solidstate89 says:

        There’s really nothing remotely special about the CPU architecture. It’s bog-standard x86. Sony has just decided to use 8 mildly powerful cores instead of 4 or 6 high performance cores to get the kind of performance they need.

        The only thing special about the PlayStation is that it’s using GDDR5 for shared memory between both the CPU and GPU. However, while GDDR5 has insane bandwidth compared to its more contemporary, lower clocked DDR3 cousin, it also suffers from very high latency compared to DDR3. It is, after all, just a higher clocked, slightly modified DDR3, and if you’ll notice when shopping for RAM, the higher the frequency, the higher the CAS latency.


  2. ResonanceCascade says:

    “Doesn’t sound like much, but it’s pretty much a record in terms of my tolerance for tedious shooting galleries. So, Crysis wasn’t fun.”

    This is a pretty inaccurate characterization of the Crysis 1 that I played. The slow burning one with huge open levels and tons of gameplay options. Didn’t remind me of a shooting gallery AT ALL. It was pretty much at the opposite end of the FPS spectrum.

    • SuperNashwanPower says:

      I too had a fair-to-medium sized grump over this statement, but then remembered that Crysis was a game where you made your own fun most of the time. The suit functions, the tools and the changeable weapons all made for great experimentation. Personally I liked trying to do ghost runs, or combining suit powers in fun ways. If you treat it like a straightforward shooter, it IS dull. To me Crysis was a directed, semi-linear sandbox. I’ve replayed it many times, and STILL don’t have a rig that can 1080p it on highest settings with the HD texture pack.

    • Captchist says:

      Gameplay options without a lot of integration.

      You could be stealthy. Or you could be strong. Or you could be fast.
      But you couldn’t chain them together much without running out of energy, and the clumsy control system didn’t help.

      So for me it just became a forgettable shooter where you could run faster than normal. I liked the bit with the jets at the end, but I may be getting confused with the bit with the jets from Battlefield.

      • ResonanceCascade says:

        I wasn’t just referring to the suit powers. The various vehicles and explosives coupled with the destruction system created a lot of opportunities for creative mayhem. I’ve played the game a half-dozen times and I still find insane new ways to take out enemy bases. I can understand not liking the game, I just can’t agree that it was a “tedious shooting gallery.” And I see it said so much that I feel like I have to stick up for the game a bit.

      • SuperNashwanPower says:

        @captchrist – YOU couldn’t chain them. That’s not the same thing as it not being possible. Sorry to sound elitist, but the suit powers were easy to learn but difficult to master. Nonetheless, you could pull off amazing combos if you liked the game enough to be patient and learn energy management.

        • Koozer says:

          Double tap crouch; sneaky sneaky, double tap punch; watch Korean sail over a building, double tap spacebar; follow him over the rooftop.

          • SuperNashwanPower says:

            I actually didn’t know about them. I practiced loads with the middle mouse button method :) May have to go back and try them, cool thx

          • particlese says:

            Wait, seriously? I was going to make this a Skyrim night, but I have to try that now, and I’ll probably get sucked in as a result. Fun game, that. And Warfacehead, too. Darn it, RPS, I had to look up the name of the expansion. >:(

          • Koozer says:

            Be warned, I seem to remember the shortcut keys being off by default in the options.

          • particlese says:

            Yup, there it is in the “look” section, of all places. And it’s a beautiful thing. Thanks!

        • Snargelfargen says:

          Yeah, the controls weren’t the issue so much as the difficulty and level design. I really wanted to be a stealth-puncher/demolitionist but every fight deteriorated into a long-range shooting gallery (sniping gallery?) the moment guards halfway across the map heard a shot. It was frustrating to be given all those awesome powers and then be punished for playing with them in silly ways.

          Should really finish the game at some point, but I’ll have to either lower the AI’s alertness or give myself more health somehow. Or just be a sniper, but why not play Far Cry then?

          • malkav11 says:

            That was my problem with Crysis, yeah. You’re so fragile and the AI is so numerous and accurate and aggressive that the only way I’ve been able to succeed at playing either Crysis or Crysis 2 (which at least makes using the suit powers more approachable) is to be slow sneaky cloaked snipetyman. And while that’s not a gameplay style I have anything against (I adore quite a few stealth games and sniping), I don’t feel like it does those things as well as games that specialize in them, and the lure of superjumps and superpunches and hurling large objects around and being superfast and all that is ultimately a lie.

          • Droopy The Dog says:

            I played it like Hotline Miami, every encounter was repeated countless times until I finally perfected the graceful flowing ballet of death and Korean choking.

          • particlese says:

            …which reminds me of:
            http://www.youtube.com/watch?v=LJkCWQcZYKo
            …which requires a link to its companion:

    • Grape Flavor says:

      It’s a “mainstream” game, a mainstream shooter at that – RPS is practically obliged to dislike it.

      If they didn’t from time to time play a AAA game for 5 minutes so they could come online and sneer about how much they hated it, how else would they show us how much more sophisticated their tastes are than those lesser gaming websites?

      • Toberoth_Syn says:

        Hahaha oh dear.

      • scatterbrainless says:

        Far Cry 3? Spec Ops? Max Payne 3? They do alright by mainstream shooters, provided they’re not CODBLOPS the 3rd: Lord of War of Fighterface.

  3. Brun says:

    Whether all this will happen in time to help AMD survive, I don’t know.

    The 8-ball says, “Outlook not good”. There are a few reasons for this:

    1) The PS4 and Xbox 720 chips will not be high profit, high margin products for AMD. MS and Sony will have beaten them up pretty bad on the price because they need to keep the price of the console down.

    2) The next-gen consoles are expected to sell on the order of 70-90 million units over the course of their lifetimes. That may seem like enough to keep AMD afloat, until you realize that some smartphones are shipping 30-50 million units per month. AMD failing to make inroads in the mobile space is what’s ultimately going to be their undoing.

    • joel4565 says:

      Yeah unfortunately I agree with you. I don’t think it is near enough volume to save AMD. The only part of AMD that is really making money is the graphics card division. Most of the company is bleeding money.

      According to Anandtech’s podcast there is some cool stuff coming down the pipeline from AMD in the next year or two, but nobody knows if it will get here in time to save the company. According to Anandtech, AMD lost $473 million in Q4 2012 and $157 million in Q3 2012. $630 million is a lot to bleed in 2 quarters.

      I myself was a huge AMD fan for CPU’s but switched over to Intel when I got my 2500K as there was nothing from AMD even close in performance, especially in lightly threaded programs like some games.

      • biggergun says:

        Not looking to start a CPU flame war here, but from what I’ve heard recent AMD chips really shine in multithreaded workloads. I think I’ve seen a benchmark not long ago where the 8350 slightly outperforms a $350 i7 in Crysis 3 and Far Cry 3. If the trend for more multithreaded games continues, and it obviously will now that the new consoles have an AMD octocore, it might be way too early to bury AMD.

    • Moraven says:

      The last Apple quarterly report stated they sold 47.8 million iPhones and 22.9 million iPads.

      So they are at roughly 15.9 million phones a month. And they do not state how many are older models.

    • BULArmy says:

      I really can’t imagine what will be the market without AMD. Intel and nVidia monopoly, this can’t lead to anything good.

      • Brun says:

        Intel has ARM to keep them competitive. I’m sure AMD’s graphics division will be spun off or bought, as that actually does fairly well compared to their CPU business.

    • SuicideKing says:

      Smartphone/tablet chips have tiny margins and high volume, x86 stuff has much higher margins, so even though volumes are less, profit should be comparable.

      Don’t forget that the PS4/720 won’t be containing the only chips AMD’s selling. They have OEMs and retail with them too, plus they’re looking to enter the smartphone tablet area.

      Consoles at least assure a steady income.

    • analydilatedcorporatestyle says:

      It will be interesting to see if they go down the same route as Intel in soldering chips to motherboards in the future (this info may well already be out there).

      If anything can save them it’s having the proprietary gaming platform for the front room. Even if they are not making money on the console hardware I’m sure they will use the production facilities and adapt the products for the consumer PC market and make money.

      Personally I think they will be around for a good while longer. Big corps like Sony, Nintendo, Microsoft etc don’t link up with a company that will be going tits up anytime soon! They just wouldn’t take the risk of problems within the supply chain. They will have looked at the books and been happy. Profits/losses are cyclical in this sort of consumer product.

    • Tams80 says:

      They do also have a finger in the Wii U pie. The Piston also uses an AMD APU and may hopefully help AMD chips become the de facto HTPC choice.

      They’re also working on an ARM design. Whether this will do well is questionable, but at least they are doing it. It’s a shame past AMD executives were shortsighted enough to sell ATI mobile graphics to Qualcomm.

  4. mehteh says:

    “most of the big games devs will work on PCs and then hit recompile buttons for the two consoles.”

    But if it’s made for consoles and their audience (like the majority of multiplat games are) then it won’t be any change at all that I care about. They’ll still be shooters meant for slower controls and audiences, aka boring and unchallenging to long-time true (PC) FPS fans. I haven’t been excited for many FPS games in the last few years and BF3 was a huge letdown. Hopefully Rekoil is good since it’s made by a PC dev and its focus is back on the core elements of a shooter (movement, skill, etc).

    • Giuseppe says:

      I don’t want to diss your comment, but I always chuckle when I see an argument that contains “true <insert social group here>”.

    • Skofnung says:

      Can we find some true Scotsman among the TRUE ELITE xX_H@rdk0re_Xx PC gamers?

  5. Lugg says:

    As a tangent on the teaser: More than once already, video game screenshots have been used as illustrations for news reports. Because they looked even better than real, apparently! Or maybe because they looked realistic, but also artistic. Reality is boring. Hyperreality is in!

    Also, stupid news services need to check their sources, and not just google image search.

    Here, Assassin’s Creed’s Damascus in coverage of the Syrian civil war: http://metro.co.uk/2013/03/11/danish-tv-sorry-for-using-assassins-creed-screenshot-in-report-on-syria-war-3535916/

    And here someone mistook ArmA footage for real war footage: http://metro.co.uk/2011/09/27/itv-mistakes-video-game-clip-for-footage-of-ira-taking-out-helicopter-165023/

    Journalism these days…

  6. JoeGuy says:

    So do weak CPUs mean that GPU performance will be more important next gen for multiplats on PC?

    • amateurviking says:

      It most likely means that physics and AI will still be pish, but the shiny will be shinier. All that RAM means *much* bigger playing space though.

      • Asurmen says:

        Surely with that many threads to play with, if computer devs finally get their arses into gear and fully support multi threads, the exact opposite is true and physics/AI will flourish?

        • Strabo says:

          Both XBox and PS3 are already multi-core systems, where you can’t get anything near full performance if you don’t utilize multi-threading really well. Didn’t lead to the magical multi-thread/multi-core world in 8 years, won’t probably with a new generation either.

      • Deadly Sinner says:

        Don’t know about AI, but the GPU can be used for physics these days. They were using the GPU for that physics demo during the PS4 reveal. Then there’s the TressFX thing that also uses the GPU to calculate physics.

        Hopefully, PS4 and Durango development will lead to fleshed out physics systems that run on both nVidia and AMD cards, rendering PhysX obsolete.

    • Demontain says:

      Jaguar may be weak compared to higher end desktop parts but it does deliver a nice amount of performance for its size and TDP. It will definitely be a nice step up from Cell. People like to dismiss Jaguar because Cell does more FLOPS on paper; Jaguar does however run circles around it when it comes to general purpose tasks and is overall more efficient.

  7. Kyber says:

    Is the individual weakness of the Jaguar cores gonna make multi-threaded PC games more common and better done? I reckon so.

  8. Pliqu3011 says:

    A bit off-topic, but why does everybody keep saying that Crysis 1 was a bad game?
    The first half had some of the finest moments I’ve ever had playing FPS’s. The creativity and freedom of the suit and the maps still boggles my mind. I still frequently replay those first few levels – and every time it feels different. You set your own goals, create your own experience.
    I know it’s a terribly clichéd thing to say, but if you think Crysis was a _bad_ game, you simply haven’t played it right.

    • vandinz says:

      Not a clue. I loved C1. Like C2 and love C3. Each to their own I suppose.

    • Kilometrik says:

      Because of the concept of Design. It comes from the Latin word “designare”, which means “to assign a function”. When I buy a game I am providing game designers with money, so I’m implicitly telling them “I like what you designed”. In Crysis there isn’t much design to speak of. Sure, you’ve got all these cool interactions and stuff, but you are never actually compelled to engage in them unless you were previously interested. Your options are never limited in such a way that you have to plan around those limits within the sandbox. In Deus Ex there were never enough multitools to hack every terminal, and you didn’t even have enough skill even if you did have enough multitools, so you were compelled to use the other game features to solve the problem. That situation never presents itself in Crysis. It’s lacking design. And that’s what I’m paying for.

      By providing your own limitations, objectives and means, you are the one doing the “designing”, not the game’s creator. And I’d rather just design my own game altogether (which is something I do sometimes) than pay for a half-designed, empty, toy sandbox.

    • Vorphalack says:

      From the tone of the article it just sounds like Jeremy doesn’t like shooters much. I suppose if you don’t particularly like the genre then you won’t find much to praise even in the better examples.

    • Grape Flavor says:

      Yeah, I was kind of taken aback at that, because as far as shooters go it doesn’t really get much better than Crysis. The “shooting gallery” quip particularly rings hollow, as while not truly being open-world, Crysis is much less linear or scripted than most games of the genre. Maybe he just hates shooters in general? Who knows.

      I don’t actually begrudge people their own difference of experience but people are awfully fond around here of taking “X” critically acclaimed, 90+ Metacritic game, declaring it to be utter rubbish, and not so subtly implying that all the people who liked it are deficient somehow, and it gets old.

      I hope for the day where people can enjoy different games without denigrating others who don’t share the exact same experiences, and if I’ve unfairly been guilty of that myself here, then I apologize.

  9. webwielder says:

    >Actually, it’ll end when PCs are capable of graphics that look pre-rendered. It’s going to happen.

    Except that what “pre-rendered” means to people keeps changing. Today’s machines can easily produce prerendered-quality graphics…from 1993: http://www.downloadcheapapp.com/iappimg/23400/myst-screenshot-3.jpg

    • Shadowcat says:

      Quite. Every computer in history capable of producing graphics has been capable of producing graphics that “look pre-rendered”. It just depends on what it is that you are rendering.

      Of course, what people actually mean when they talk about “pre-rendered graphics” are visuals which needed to be rendered in advance, because the hardware of the day couldn’t render it in real-time; and by that definition no PC will ever be capable of producing real-time graphics equivalent to pre-rendered versions, unless there was never a need to pre-render them in the first place.

      So there’s your answer. Either it’s “all computers in history” or “no computers, now or ever”. Anything inbetween doesn’t make any sense.

      • webwielder says:

        Well said.

      • jonfitt says:

        Exactly.

      • spindaden says:

        I disagree. There will come a point when a singularity is achieved and pre-rendered versus real-time rendered will be indistinguishable.

        It likely is donkeys years away though.

      • Jeremy Laird says:

        Think there is actually an end game. It’s when CGI is indistinguishable from real, live video or film footage. When that happens, there’s nowhere else to go.

        You could argue we’ll never get there 100% in all regards, but as we approach that target, at the very least the progress becomes increasingly incremental and real-time rendering closes the gap.

        Also, I think there’s still a certain overall look and feel to pre-rendered even if it’s not the best quality possible. You spot it in cut scenes and promotional footage that you instantly know aren’t in-game but also aren’t threatening to set any new highs for realism. That’s at least partly based on things like zero jaggies and no visible “geometry” if that makes sense. Curved things looking perfectly curved, not made up of polygons. Etc.

        I think PCs that achieve that level of pre-rendered feel aren’t too many years away…

  10. Hahaha says:

    Still worth upgrading to a Rampage IV Extreme and an i7-3930K?

    • El_MUERkO says:

      I’ve been thinking that myself, my Q9550 is starting to show its age.

      The 3930K is actually an 8-core chip with two cores deactivated by Intel, due to not wanting their retail CPUs to interfere with Xeon sales, and Intel have promised support for X79 through to 2016, so ‘yes’, you’ll have the upgrade option of Ivy Bridge-E and/or Haswell-E before you’re looking at a new mobo and maybe new RAM.

      AMD are focused on APUs at the moment and are unlikely to suddenly release a CPU that competes with Intel’s enthusiast range; their roadmap puts their next revision in 2014.

      If developers start coding with more cores in mind and if AMD release a chip that takes the performance crown from Intel, then Intel can respond by selling you a chip you already own with its two cores reactivated, but that’s what their market dominance gets them.

  11. vandinz says:

    “In many regards, it’s a glimpse at what PC processors will look like in a few years time. Fusion chips like this will eventually replace the existing discrete CPU-and-GPU model, even in high performance PCs.”

    Nope.

    • jonfitt says:

      Based on?

      • BulletSpongeRTR says:

        Heat

        • Cognitect says:

          Exactly. Two separate chips are much easier to cool than a single large one. A high-end CPU/GPU hybrid would also be very difficult and expensive to manufacture due to its enormous size. The PS4 avoids these problems by using a midrange GPU and relatively low clock speeds.

      • Mad Hamish says:

        based on the novel push by sapphire

      • NeuralNet says:

        Based on simple common sense.

        For starters there is no single company that can produce a chip that has a fast CPU and a fast GPU. APUs as AMD calls them should replace low end machines because they will be fast enough to fulfill basic tasks which aren’t demanding faster hardware year on year. The enthusiast and PC gamer market are completely different beasts.

        It would also make upgrading a gaming rig very inconvenient – buying a new APU for a faster GPU would not only be inefficient but would also most likely cripple the GPU straight out of the gate due to older memory architecture starving it for bandwidth.

        It’s a great solution for a console, but not for the PC market.

      • Cinek says:

        Specialized units are always better than generic :) ;)

    • Jeremy Laird says:

      I should probably have said “eventually” rather than “a few years”. But it almost definitely will happen.

  12. El Stevo says:

    More RAM! The end of tiny levels!

    • -Spooky- says:

      [insert random alert sound here] Nvidia Titan anyone? VRAM is more important.

      • theleif says:

        And with 8 gigs of shared GDDR5 the PS4’s got lots of it.

      • Strabo says:

        The 8 GB GDDR5 of the PS4 is VRAM and system RAM at the same time (although nobody knows how flexible the allocation will be). Similar to how your smartphone or tablet has 1 or 2 GB RAM, which is then split into 768/1536 MB for system and 256/512 MB for VRAM.

  13. Alextended says:

    A PS4 advert on RPS?

    Of course the as-yet-unreleased hardware isn’t too shabby compared to current PCs (although the 8 cores really aren’t better than a current high end 4 core PC CPU and the GPU is worse than the high end models too, though the fact it’s a closed system will help developers get more mileage out of it and the RAM and unified architecture will help with that). The PS3 and 360 weren’t too bad either at launch. But it’s another system intended to last several years, so it will of course be outdated soon enough. Probably a year past its launch, therefore before the bulk of its install base even materializes. Which won’t matter if it’s eventually successful because hey, lots of publishers love them consoles. Although its power won’t matter that much either if Microsoft’s system isn’t up to snuff since everyone’s making their games multi-platform anyway, outside first party studios and the odd not-merely-timed exclusive that’s becoming more rare with every generation thanks to development costs.

    That’s still got nothing to do with PC gaming though. We’ll keep getting multi-platform console games with, on the higher end systems, better visuals and frame rates, and we’ll keep getting our exclusives and we’ll love it and console gamers will love their exclusives and multi-platform games and continue ignoring the greatness of PC gaming while they talk about how awesome their graphics are compared to that other console that has 2fps less in this scene and a worse texture in that or whatever, etc. There’s no paradigm shift to see here or anything. Just the usual bump up for the lowest common denominator, which is nice to finally see on the horizon, but it was obviously bound to happen anyway.

    I suppose it does hint at where the hardware race is going these days but as said, it’s merely the reminder, if people who really care could even forget that, not the pioneer that everyone else has to follow. Although I’m not sure I’ll be happy with buying systems on a chip for things other than tablets/laptops and the like so I hope they find a way to bring this efficiency while still allowing people to purchase and upgrade different parts separately for desktops. I’d hate to be locked out of getting a more powerful CPU or GPU based on my needs both for gaming and outside that because the manufacturers only offer certain combos and only pair high end parts together to sell them at a high cost rather than mix it up. At least as long as the specs still matter, which they will for a long time.

    • Strabo says:

      The Jaguar CPU should be about half as powerful as an Ivy Bridge i3 (maybe even as fast in heavy multi-core/multi-thread situations), so nowhere near high end 4-core CPUs. The GPU on the other hand is about as powerful as a downclocked AMD7870, so very competitive with today’s upper mid-range/lower high-end GPUs.

      With lower overhead, better use of hardware I can see the PS4 rival high-end PCs in November for a short time. Of course, end of 2013/early 2014 we get a whole new GPU-generation (also Haswell, but that’s just 10 % faster per clock than IB CPU-wise, so a moot point), which should open up the gap considerably again.

  14. Arkh says:

    I do hope they get their ass around using multi core support more. I have a fucking AMD Phenom II x6 1090T and I barely get the four cores to work. Shit.

    • Snargelfargen says:

      “I have a fucking AMD Phenom II x6 1090T”

      I’m so sorry.

      • Arkh says:

        Me too, my friend. Me too. Maybe I will buy an AMD FX-8350 Eight-Core. My old AM3 socket supports AM3+ processors thanks to a BIOS upgrade. And Intel processors are expensive as fuck and they change CPU socket every other day. I’m still running an old GeForce GTS 450. Being poor (or living in a country where this stuff is expensive as fuck) is suffering.

        • biggergun says:

          If I were you, I’d look at FX-4300 or FX-6300. A lot cheaper than 8350 and still more than enough to run almost anything.

        • cjlr says:

          I went Intel on my last build. I still have occasional regrets.

          AMD has ALWAYS had better sockets. Always. LGA1156 is an abomination.

      • IgnitingIcarus says:

        I agree with the FX-6300. I’ve been running on one for about a month so far and it’s a smooth ride.

  15. 1Life0Continues says:

    And so the e-peen grows…

  16. Jraptor59 says:

    Wow, using AMD/ATI for processor and video. What a bummer. Apple tried the cheap way, using this, in their iMacs. They learned the lesson and changed back to Nvidia. Nvidia has always put out a better looking, better running game experience. Sony is making a mistake.

    • Ignorant Texan says:

      Apple was/is competing with other micro-computer manufacturers. SONY will be competing with MicroSoft (if rumors are to be believed, they will be using the same architecture in the next XBOX) and Nintendo. Since AMD needs the business much more than either Intel or nVidia, they will be much more responsive (price, service, prioritizing of manufacturing schedule, etc) to SONY’s demands/desires.

    • theleif says:

      “Nvidia has always put out a better looking, better running game experience.”
      Eh, no? Nvidia and ATI/AMD have been trading pretty much even blows for the last decade, and the AMD 7000 series is at least as capable as the NVIDIA 600 series.
      “Sony is making a mistake.”
      Like the mistake Microsoft made when they used an ATI card in their 360? Yeah, that console turned out to be a big flop.

    • Strabo says:

      Nvidia doesn’t have anything in the APU market (they do have SoCs with the Tegra, but those are far, far too slow for what Sony and MS want here), not to mention they don’t even have an x86 licence to start making anything in this area.
      You would need a dedicated CPU (Intel) and a dedicated GPU (Nvidia) in this case. Intel wants a lot of money for their stuff without it being worth the money in the low-end CPU segment (where the Jaguar cores are operating) – Intel of course rules supreme in the high-end segment.
      Intel does have an integrated GPU in their CPUs, but even ignoring the higher price, these are not in any way competitive with what AMD has. So the choice of AMD was a no-brainer for MS and Sony alike, especially since AMD is surely far more interested in giving discounts for getting this big order than Intel or Nvidia is.

    • Tams80 says:

      Well I guess someone had to make that asinine comment. ¬.¬

  17. gravity_spoon says:

    I’ll be honest here. I game on a laptop right now (due to job and moving around) which is an AMD A63400 APU, and tbh, I am satisfied with the performance. Sure I can’t play BF3 or Crysis 2 at max settings (don’t play either of those at all). But what I do play (mainly TBS and ARPGs), I am happy with the performance. This machine is also slightly OC’d and with 8GB RAM so it fits all my needs. Also, looking at the releases for the next 2 years (what I am interested in), an A10 5800K APU or its future variant might be just what I need. No need to pay an atrocious amount for a graphics card when an APU might do the trick. Going for an APU-like architecture by AMD will only help the low budget market and maybe they will achieve some breakthrough for the gaming market too in coming years.

  18. SuicideKing says:

    “So, Crysis wasn’t fun. It wasn’t even very well coded. And the more you looked at it, the more the visuals felt a bit unfinished.”
    My thoughts exactly. I’ve always felt it was not very well optimized.

    Do you know which game really looks like Crysis, yet better, more optimized and without aliens?

    Yes, Far Cry 3.

    On the PS4:
    1. It’s strange that no one realises that all current APUs have a shared address space (they use system RAM), difference being that they use DDR3 instead of GDDR5.

    2. Also, mainstream Sandy/Ivy Bridge processors and AMD’s Llano/Trinity CPUs are all APUs, and all of them have integrated memory controllers and a few other things (PCIe controller? can’t remember right now). So in that respect they’re almost SoCs, though since the South Bridge, network and legacy PCI controller is off-chip, they’re not SoCs yet.

    3. PC APUs have lower bandwidth because the address lines are 64-bit or less for DDR3 RAM, whereas GDDR5 uses 128-bit lines or above along with higher frequencies. I think i may be slightly inaccurate with this description though. But yeah, unless Intel/AMD decide to use a different bus architecture, i think we’re sort of limited here. Maybe DDR4 changes things? From what i’m hearing, it wont.

    The rest of what you’ve said is more or less what my analysis has been too…nothing to add.

    • TheManko says:

      Far Cry 3 isn’t well optimized compared to Crysis. Because it’s so old now, Crysis runs much faster than Far Cry 3, it has destructible trees and buildings, and doesn’t have the same obvious console memory limitations as Far Cry 3, ie the horrendous bit rate on enemy voice samples etc. They’re not directly comparable I guess since Far Cry 3 is open world, while Crysis has levels. But I played Crysis 1 directly after Far Cry 3 and the impression I was left with was that Crysis is a step above Far Cry 3 in technical refinement.

      • SuicideKing says:

        Well, i’m playing on a core 2 quad with a GTX 560, and while Crysis still plays like complete crap (sub 30 or around 30 fps with everything set to high, 2xAA) FC3 remains between 30 and 80 fps with high-very high settings and 4xAA.

        Far Cry 3 seems much easier on the CPU, at least.

  19. mr.ioes says:

    “It wasn’t even very well coded.” – Now why would you say that? I had zero FPS drops and never more than ~3 seconds of savegame load time. How was this not well coded? I always thought it was the most refined shooter ever.

    Remember Painkiller loading?

    I’d really like to read more on this thought.

  20. Feferuco says:

    Given the negativity the word “integrated” has gained over decades, won’t a cpu-gpu be a hard sell?

    • InternetBatman says:

      Like an integrated sound-card? Integrated machines have come a long way from the 90s, and most of the people who want an easier computing experience are probably already using one.

  21. Jeremy Laird says:

    Do we know for sure that half the cores are always invisible to the game engine on the PS4? I’ve not read anything official to that effect.

    • Jeremy Laird says:

      I would be amazed if they use the general purpose CPU cores to encode video. A tiny DSP circuit would cost almost nothing to put into the chip and do the job much more efficiently.

  22. ankh says:

    Stopped reading at Crysis = Shooting Gallery.