Week In Tech: Haswell Mobile, AMD Goes GHz Gaga

Some of you were a teensy bit miffed by my unceremonious defenestration of Intel’s new Haswell CPUs as desktop chips. In fairness, when you’ve only played with the desktop iterations, that’s going to influence your outlook. And Intel really was asking for it. Anyway, while I mentioned Haswell has some serious mobile chops, it’s worth having a closer look at what it all means for mobile gaming and what you should be looking out for when bagging a laptop. In other news, AMD has announced a 5GHz processor. Surely this can’t be the beginning of a new GHz war…?

Haswell as a mobile gaming chip, then. Frankly, I blame Intel for the avoidably negative coverage Haswell has generated. M’learned colleague on PC Format Magazine, Dave James, and I were discussing this only yesterday.

Haswell has plenty to offer. But Intel utterly failed to give us good reason to shout about that thanks to the products it supplied for review.

Imagine what the hack community would have said had Intel fired out reference tablets running its fancy new 40-unit graphics, Windows 8 and some serious software to play with. Instead, we got crap desktop CPUs that were no faster at stock clocks and didn’t overclock as well as the previous generation.

Incidentally, I fully do not buy the argument that Haswell as a desktop chip had to be a pitiful step forward because everything is now dominated by mobile. If AMD had more competitive chips to offer, I’m certain things would be very different. More on that in a moment.

Mobile or desktop, they’re all cut from the same wafers

As ever, the launch materials I got from Intel were incomplete. But I’ve been digging around and I’ve finally found what I’ve been looking for.

The thing about Haswell is that the vast, vast majority of its derivatives are flatly uninteresting. That includes most of the mobile efforts. We’re talking incremental upgrades, even to power consumption.

Put another way, Haswell isn’t exciting for proper, full-power gaming laptops. Most variants are only a bit more power efficient, it’s really no faster and the improvement in integrated graphics is irrelevant. The 3D performance remains ultimately pretty pants in a serious gaming context. You still need a discrete GPU.

Where things get a bit more interesting involves the bequeathing of a new class of devices with moderately useful gaming chops. I’m talking Ultrabooks, tablets and tablet convertibles.

This is the one you don’t want. It doesn’t have 40 graphics units…

In other words, we’re in ultra-low voltage territory. Where things get complicated is identifying Intel’s new 40-unit graphics core. It wouldn’t be an Intel launch without some simultaneously idiotic and cynical branding. And sure enough, Haswell’s graphics branding is a minor atrocity.

Intel has introduced the new Iris brand. You’d think that would apply to all Haswell chips, but no, it’s just the ones with the most powerful graphics.

Fair enough, apply Iris to all chips with the new 40-unit graphics. Ha! That would make too much sense. Instead, Iris only applies to certain 40-unit graphics cores. There are some that retain the old Intel HD Graphics branding. If you went out of your way to confuse customers, this is exactly how you’d do it.

Anyway, as far as I can work out, the 40-unit graphics that will give you half a chance of gaming joy are indicated by the numerical suffix. Anything starting with a ‘5’ has 40 units. So that’s Intel HD Graphics 5000, Iris 5100 and Iris Pro 5200. Geddit? Phew.
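If you want that decoder ring in a pocketable form, the rule of thumb above boils down to something like this. To be clear, this is a throwaway sketch of my shorthand, not anything official from Intel; the model names are just the ones mentioned in this story:

```python
# Rough decoder for Haswell graphics branding, per the rule of thumb above:
# a four-digit model number starting with '5' indicates the 40-unit core.
# Illustrative only -- not an official Intel naming scheme or API.
import re

def has_40_units(gpu_brand: str) -> bool:
    """True if the brand string looks like a 40-unit Haswell graphics part."""
    match = re.search(r"\b(\d{4})\b", gpu_brand)
    return bool(match) and match.group(1).startswith("5")

for name in ("Intel HD Graphics 5000", "Iris 5100",
             "Iris Pro 5200", "Intel HD Graphics 4600"):
    print(name, "->", "40 units" if has_40_units(name) else "fewer units")
```

In other words, HD Graphics 5000, Iris 5100 and Iris Pro 5200 all pass, while anything in the 4000s doesn’t, regardless of what brand name Intel slaps on it.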

Next question: are there any of these cores on offer in an ultra-low voltage chip? Not among the launch chips, according to the docs I have from Intel. But I saw reference to a few such chips on the web and some digging around on Intel’s site throws up this comprehensive list.

Hope for Haswell: Genuine PC gaming in something the size of an Android tablet

And joy, there are some ‘U’ ultra-low voltage CPU models with the 40-unit graphics. Here’s the bad news. They’re all very expensive: currently $342 apiece when you buy them in bulk (1,000 units).

Any machine with the 40-unit graphics is therefore going to be pricey. For the foreseeable, I doubt you’ll be able to get one for less than £1,000. That’s a shame because the idea of a tablet convertible with enough gumption for some games is super seductive.

To be clear, we’re not talking about graphics grunt to lay waste to AMD and Nvidia’s finest GPUs. But going on Anandtech’s test of Iris (bear in mind that was the most powerful Iris Pro iteration with 128MB of eDRAM), it should be just about gameable.

Anyway, if you’re in the market for that kind of thing, stick that Intel reference page in your favourites. It will come in very handy when you go shopping.

If I get a chance to take a ULV Haswell with 40-unit graphics for a spin, I’ll let you all know. The only device I’ve seen in the metal is the new MacBook Air. But that would mean Boot Camp and that’s a bit complicated.

AMD goes GHz gaga
Next up, AMD’s new 5GHz FX processor. What are we to make of that? Firstly, it’s symptomatic of Intel’s sandbagging. If Intel had made even the slightest effort to move the desktop game on, AMD wouldn’t have bothered with this GHz ruse. It would have been futile.

But I’m hoping that Intel’s complacency has finally caught up with it. If you look at benchmarks of overclocked FX chips running near 5GHz, you’ll see that AMD will have something highly competitive with Intel’s best LGA1150 processors.

In truth, I don’t think the new AMD FX models will actually be worth buying. They’ll be super expensive. They’re just speed binned and upclocked versions of the existing 32nm Vishera die, as far as I can tell.

Indeed, they’ll be expensive because I doubt AMD can knock out very many of them. May as well price them to discourage sales but maintain the PR victory. Anywho, for the record the new top chip is known as the FX-9590 and tops out at 5GHz. There’s a slightly lowlier FX-9370 that’s rated at 4.7GHz.

Note these are Turbo frequencies, not nominal clocks. Base clocks are 4.7GHz and 4.4GHz respectively. I doubt they’ll offer much overclocking headroom and you can get near those speeds by overclocking an existing FX. Price-wise, I believe AMD has whispered something along the lines of $800 for the top chip. Call that £600-700 in the UK and there’s no point in actually buying one.
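For reference, here are those two parts side by side. The numbers are the ones reported in this story (the whispered $800 included), so treat them as provisional until AMD publishes official specs:

```python
# The two new FX chips, with figures as reported in this article.
# The FX-9370's price wasn't mentioned, hence None.
fx_lineup = {
    "FX-9590": {"base_ghz": 4.7, "turbo_ghz": 5.0, "rumoured_price_usd": 800},
    "FX-9370": {"base_ghz": 4.4, "turbo_ghz": 4.7, "rumoured_price_usd": None},
}

for name, spec in fx_lineup.items():
    boost = spec["turbo_ghz"] - spec["base_ghz"]
    print(f"{name}: {spec['base_ghz']}GHz base, "
          f"{spec['turbo_ghz']}GHz turbo (+{boost:.1f}GHz)")
```

Both chips carry a 0.3GHz Turbo bump over base, which is why quoting the headline 5GHz figure without the base clock flatters the FX-9590 a touch.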

Symbolically, however, they’re significant. Like I said, they only exist because Intel has been scorning the desktop so comprehensively.


  1. povu says:

    From what I’ve heard the 5 GHz CPU runs really really hot, has extraordinarily high power consumption and won’t be available to the general public. Unless I’m confusing it with something else. But it’s a cool milestone.

    • Parge says:

      Yeah, 220W TDP – vs the 84W TDP of Haswell. Phew!

      • xavdeman says:

        Don’t confuse TDP (Thermal Design Power, the maximum heat generated that a cooling solution should be able to deal with) with average power consumption. The AMD chip may actually be more efficient in idle, yet provide extreme headroom in cases where all 8 cores are used at the same time with 100% activity (e.g. encoding or adding heavy effects to 4K video files).

        • SuicideKing says:

          Well, sites that overclocked their FX-8350s to 4.7 or 4.8 GHz saw an 80-100W increase in power draw, which would almost directly translate to increased heat over a 125W Vishera part. So 220W does mean power hungry and hot.

    • stampy says:

      i heard they just put two 2.5GHz processors in the same box and marked it 5GHz.

      also, i wouldn’t put doing this past either intel or amd.

    • Artist says:

      Yes, with 220W against the brick wall. I think this will be AMD’s final nail in the coffin. Unfortunately… I miss the days when the underdog hounded the behemoth Intel.

  2. Meat Circus says:

    Railing against the dying of the desktop light is pissing in the wind at this stage. I guess in desktop PC grief you’re at stage 2-3, somewhere between anger and bargaining?

    Haswell was designed for mobile. And at that, it exceeds expectations quite spectacularly. Getting all angry Internet man that it doesn’t offer any significant step forwards for desktop CPUs, when it was neither designed for that nor did Intel ever make any claims that it would, just seems dumb and a waste of energy.

    I guess what I, and everyone else is saying, is that Haswell is a microarchitecture designed for mobile. You should probably stop whining and deal with it.

    • Low Life says:

      Replace a few words and you have the perfect reply to anyone complaining about bad PC ports.

      • Meat Circus says:

        It’s important to remember that Intel are a vast multinational corporation, not a charity for overclocking enthusiasts.

        The mass market has made its feelings about desktop PCs quite clear, Intel are targeting their engineering expertise to where it’s able to achieve the most, which is where they can make most money, which is where the demand is, which is… mobile.

        Not wishing to put too fine a point on it, Intel doesn’t give a shit about the niche within a niche within a niche which is overclocking enthusiasts within desktop PC owners within gamers. There’s no gold in them thar hills.

        • Low Life says:

          Yeah, I wasn’t disagreeing with you. And my point still stands (with the addition of your reply to me).

      • VelvetFistIronGlove says:

        Not really. The percentage of people playing games on PC instead of consoles (counting only games released on both) is undoubtedly much greater than the percentage of PC users who care what CPU they have.

    • Jeremy Laird says:

      I disagree, Meaty old chap. Like I said in the story, the state of Haswell has as much to do with Intel’s near monopoly position as it does the inevitable dominance of mobile. Intel still sells a lot of desktop CPUs. They could very, very easily be doing much, much more as regards desktop performance. But there’s no competition forcing them to do it.

      Unless, of course, your basic position is to simply quietly accept anything and everything Intel does?

      I know this topic is a bit of a broken record. But that doesn’t stop it from being true. Put your hands together and pray for Steamroller.

  3. biggergun says:

    In my experience, FX is already worth buying. Synthetic tests look bad, but when you look at real-world applications (Far Cry 3 maxed out with video streaming, for instance) it is on par with an i7 while costing about half as much. I mean, yes, Intel chips are 5-10% faster. But does it matter when they are 50% more expensive?

    • surv1vor says:

      Do you have anything to support this? I was planning to go Intel when I upgrade my ageing Phenom II, but I don’t expect to have a ton of cash. After hearing initial reviews of the FX chips I kind of assumed them to be a write-off. Staying AMD would be a far better option for me as I could get an AM3+ mobo and keep my chip for the time being.

      • newprince says:

        I’d say that in either case, whether AMD or Intel, it’s better to sit tight for next year’s chips. Unless you have no system to speak of right now. I have a Sandy Bridge 2500K and have absolutely no plans to replace it until I see the 6 or 8 core Intels and AMD’s offerings next year. It will save me the trouble of having some marginal Haswell chip for a year, and then feeling stupid about it looking at (finally) some 6 or 8 core Intels. And hey, AMD might finally get it together.

        • surv1vor says:

          Fair enough, I just worry my CPU won’t be able to keep up, especially with the next consoles about. It’s currently clocked at 3.8GHz – anyone reckon they know if that’ll be enough?

          • Baboonanza says:

            The next gen consoles have pretty lousy CPUs so I wouldn’t worry about anything written for them pushing a recent, decent PC CPU. They will have more cores but I doubt they are going to be fully utilised at launch and given the poor per-core performance I doubt it will ever be an issue.

          • chris1479 says:

            Come on, let’s get serious here for a second: i5 2500ks and above are going absolutely nowhere any time soon. For better or worse consoles and their technical capabilities set the pace of hardware requirements and by the looks of the consoles us PC gamers, when it comes to CPUs especially, have no reason to upgrade for the foreseeable future.

            And as I say that I think it’s actually very disappointing because I’m getting seriously fucking bored of PCs being SO RIDICULOUSLY FAR AHEAD of the console competition that owning a PC sometimes feels like owning some kind of obscure supercomputer that 99.999% of software (except for unzipping RAR files and transcoding) will never ever make use of.

      • biggergun says:

        link to teksyndicate.com

        This review, mainly. Also, this.

        link to cpubenchmark.net

        FX chips are really not a write-off; I’d say they soundly beat Ivy and Haswell in price/performance, especially if you consider that newer games will have better multicore support. I would agree that waiting for the next generation is worthwhile though.

        • RvLeshrac says:

          This is what everyone says with each AMD release… then AMD completely fails to live up to every single expectation, soundly underperforms a comparably-priced Intel chip, and everyone says they’re going Intel next time.

          Until AMD makes another announcement, and claims the same improvement as before….

        • Solidstate89 says:

          Except TekSyndicate is literally the only site out there that claims any gaming wins in AMD’s favor. Games these days are so bound by single-threaded performance that, architecturally speaking, AMD just can’t compete against Intel. It doesn’t matter how much you love AMD or how unique their current architecture is – it simply isn’t built to compete against Intel in single-threaded applications. Which is basically every game ever.

          I suspect you’ll only start to see games take true advantage of multi-threading with the next-gen console ports. Why? Because with 1.6GHz Jaguar (low-power) cores to rely on, game developers have literally no choice but to create their games with multi-threading in mind. Only then will you begin to see AMD’s processors compare.

          • biggergun says:

            Well, it is the only site I’ve seen that actually tested the 8350 in such an environment (after Windows patches, graphics-heavy titles at 1080p and 1440p while streaming, a lot of games, not just a couple of the most popular titles).
            Actually, the whole CPU performance debate is rather pointless by now – I, for instance, have an i5-2400 and while I’d love to buy a fancy new chip, there is simply no reason to. Hell, even $50 ancient Phenoms still keep up pretty well.

  4. astronaute says:

    Too bad Intel goes mobile as well.

  5. dangermouse76 says:

    So am I to surmise from this that you would not recommend building a Haswell-based gaming rig then, as previous i5s (3470 etc.) and i7s will do the job better?

    Edit: And you see no benefits in the 1150 socket over 1155?
    Interested in your thoughts (or anyone’s)

    Second edit: I should also add I am an i5 760 owner. So 1150 vs 1155 is relevant to me in terms of where to go with a new rig.

    • Meat Circus says:

      Intel are no longer willing to expend their transistor budget on improving the performance or the number of their CPU cores. The things Intel expended their transistor budget on:

      1) Moving as much of the chipset on die as possible
      2) Increasing the performance of the on-die GPU
      3) Improving energy efficiency and performance of low-power states

      None of these things have any real importance to enthusiast-grade desktop rigs.

      • roryok says:

        None of these things have any real importance to enthusiast-grade desktop rigs.

        Maybe not to people who like PCs the size of a Fiat Uno, but I’d love to see a game capable desktop PC the size of a next-gen console. Fanless CPUs and integrated GPUs are the way forward for that.

        • dangermouse76 says:

          I would at some point like to have a home server build: a hidden-away box networked to the whole house for multi-screen. At the mo the PC sits under a desk in the living room and can be passed out to the Xbox, any of the laptops and our projector.

          But a hidden home entertainment – gaming capable – always-on solution is the holy grail for me eventually.

          • InternetBatman says:

            Absolutely. Quite honestly I think most of the tech is up to it right now. My three-to-five-year-old computer (a lightning strike wiped out half the parts) can play Tiny and Big and The Cave at the same time. Neither of those stress the system, but most games don’t, and the number that do stress a relatively current system will probably be rarer over time.

      • SuicideKing says:

        Not entirely accurate.

        Intel 2014 Haswell-E to pack 8 cores, DDR4, X99 PCH and more
        link to vr-zone.com

    • FrankGrimesy says:

      You should buy a Haswell-based processor. If only to get the measly 5-10% more performance over the equivalent Ivy Bridge CPU at no price increase (outside of sales).

      Note that Intel changed the socket because they changed the CPU die to include some of the voltage regulators. So all newer processors will be more likely to use the 1150 socket than the older 1155 socket.

      • dangermouse76 says:

        Agreed, if you’re back in the 760/750 era it is definitely worth it, I think. The 760 is clocked as high as I can get it with a £30 cooler (4GHz). Still doing well though after all this time.

      • SkittleDiddler says:

        “So all newer processor will be more likely to use a 1150 Socket then the older 1155 Socket.”

        “Likely” being the key word here. Remember we’re talking about Intel — they generally aren’t concerned with socket legacy. Their newer processors could use an entirely different socket for all we know.

    • Baboonanza says:

      You could grab a bargain and go for a second hand i5/i7. The performance margin is pretty small and I’m confident that a decent 3rd gen will last a good few years yet.

      • chris1479 says:

        “Baboonanza says:

        You could grab a bargain and go for a second hand i5/i7. The performance margin is pretty small and I’m confident that a decent 3rd gen will last a good few years yet.”

        Anything above an i5 2500k is just bragging rights and has been for a long while now. Until games start utilizing the enormous potential of these processors to really take graphical fidelity to the next level (and they’re nowhere near doing it, the PS4 trailers were nice but weren’t further ahead than any AAA PC title) people are just wasting their money buying this stuff. And I say that as a hardware enthusiast… but someone who’s pissed off with how, I dunno, lazy software developers have become.

        I mean, has anyone actually read some CPU reviews recently and looked at the performance benchmarks and the differences in games? The differences are getting to be so minor as to be almost meaningless since the games just aren’t using the CPUs’ full potential. I’m reading reviews with 10+ processors, and the performance differences are minuscule as well as incremental.

  6. Radiant says:

    Not a slight on Jeremy but I don’t give a shit about any of this. It makes no sense anymore.

  7. binkbenc says:

    I bought a Surface Pro recently (crazy discounts at an MS conference) and absolutely love it. It’s not a top-end gaming rig by any stretch of the imagination, but then I don’t play many top-end games. It’s flawless with every one of my GOG/Steam backlist games I’ve thrown at it so far. Anyway, the Haswell chip seems the absolute perfect partner for a Surface – a little more graphical grunt and better power consumption – yes, please! In fact, I get a sneaking suspicion we may see an announcement round about June 28th of a Surface 2 with one of these jammed into it (and maybe another USB port…other than that, I’m good). Anyone else tried out the Pro?

    • surv1vor says:

      Even if cost was no issue to me, I’m really put off by the lack of a removable battery, and teardowns have shown that it really is impossible to replace. Whilst I’m just about okay with a phone having the battery built in, it’d really pain me to have to shell out that much for a device that has its lifespan so significantly diminished by its battery.

      Other than that, I really like the look of it, but for me there’s no getting past the battery.

      • roryok says:

        Agreed, I hope they take that criticism on board (although I doubt it – it’s not hard to upgrade by accident).

    • roryok says:

      In fact, I get a sneaking suspicion we may see an announcement round about June 28th of a Surface 2 with one of these jammed into it

      That’s a very specific date to throw out there. Do you have inside info?

      • binkbenc says:

        No, no sly insider info, I’m afraid. It’s just the date of their next conference (BUILD). It’s a developer conference, so there’s no reason why they would necessarily announce it there, it just seems like as good a time as any.

        • roryok says:

          I wouldn’t be too sure. There was a lot of fanfare for the Surface roll out, seems like they’ll do that again for the Surface 2. That said, the dates do make sense as the original rolled out in June last year

  8. Innovacious says:

    I upgraded to an i5 Haswell the other day from an i5 750. If you already have something newer than that, then I can see Haswells not really being worth an “upgrade”. But from what I could find, some of the Haswells performed about the same as some older CPUs but were cheaper.

    • dangermouse76 says:

      That’s what I am seeing. Also my old Gigabyte board halved the PCIe lanes to x8 if you tried using USB 3 or SATA at the same time as a graphics card. Booooo!

  9. newprince says:

    This is all rather silly. Sure, reducing voltage and thus heat should always be one of the goals of a new generation of chips, but the sole focus? Even in the desktop varieties? Just plain silly.

    Holding back performance gains so you can shove these things into mobile, and spending all that R&D on the integrated graphics, only makes sense for tablets. And even then, why can’t we just admit that the best, most elegant, and probably least profitable solution for Intel is simply streaming our games to whatever device we want via our gaming/media PCs? No compromises need be made; we are simply dealing with what screen you want to play on and what control scheme you find useful. We know this can be done, and I’m looking forward to some open source solution or standard that does this so I can ignore efforts like Haswell, SoCs, or ARM chips altogether.

  10. Arkh says:

    Incidentally, I fully do not buy the argument that Haswell as a desktop chip had to be a pitiful step forward because everything is now dominated by mobile. If AMD had more competitive chips to offer, I’m certain things would be very different. More on that in a moment.

    Basically this. Anyway, those new 5GHz CPUs from AMD will be shit, they will melt. My 1090T already overheats at 3.4GHz.

    I was thinking of doing an upgrade but I will hold out a little.

  11. kwyjibo says:

    Everyone already knew that Haswell was a mobile chip.

    The unceremonious defenestration was a waste of time.

  12. RvLeshrac says:

    What the hell are “40 units”? 40 shaders? 40 cores?

  13. SuicideKing says:

    Continuing a long history of innovation, AMD announced today that they were the first to reach a critical CPU power envelope threshold. AMD CEO, Rory Read, was visibly pleased with his company’s progress as he was announcing this new development.

    “Through creative innovation and company-wide alignment, we have now achieved something our competitors can only dream of. We are proud to announce that our new, 8-core flagship FX-9590 has reached and surpassed the previously unattainable goal of 200W TDP. The demand for this new product is strong; the first large-scale server deployment contract in Northern Alaska will be announced early next week.”

    AMD is also planning to continue developing this new product line with annual refreshes. Said Read: “We have a clear roadmap for these products for the next three years. Our 2014 platform, codenamed Sauna and fabricated on a 32nm process, will reach 250W TDP. In 2015, we will release a 20nm refresh codenamed Nucular, that will increase TDP by up to 30% or more – this will be available for Back-To-School season. Finally, by 2016 Holiday season, we will release a product codenamed Solara – a new architecture that takes TDP to unprecedented levels. More details about Solara will be announced at Computex 2014 – stay tuned!”

    After trailing their main competitor, Intel, for years, it looks like AMD is back in the game. Intel has been slow at improving TDP lately; the new desktop Haswell CPUs were released with a disappointing 84W TDP. The recently announced Ivy Bridge E platform promises to improve TDP, but the new FX 9000-series products from AMD seem to have leapfrogged the competition. Intel representatives declined to comment on this story.

    As always, I would wait for independent benchmarking results, but judging from the heat and excitement in AMD’s demo room, this reporter has a feeling that AMD has finally nailed it – these new CPUs sure look HOT!

    -NeelyCam, TR Forums
    link to techreport.com

  14. onodera says:

    A 5GHz CPU? Finally, a processor to run Dwarf Fortress on!

    • TyrOvC says:

      That’s exactly what I was excited for! Until I read on and saw that the new processors aren’t actually worth getting excited for.

  15. mr.black says:

    Hello! I just wanted to chime in saying Week in tech has become (yet another) one of the staple features of RPS I read. The writing is informative and interesting and I appreciate the chance to learn some learned gamer’s perspective on new hardware.
    Keep up the great work!

  16. Uninteresting Curse File Implement says:

    I continue to be baffled as to this practice of cutting rectangular chips out of a circular wafer. Is the stuff on the edges just thrown away? This seems like such a waste!