Week In Tech: Intel’s Haswell-E Is Actually Interesting

Remember when Intel’s top platform was relevant? When proper CPUs didn’t come with nonsense like integrated graphics and the Core i7-920 D0 was the weapon of choice for gamers and PC enthusiasts in the know? Good times. More recently, the LGA2011 socket and its CPUs have been irrelevant unless you had money to hose about with nonchalant abandon. Yeah, yeah, they’ve been the fastest PC platforms you could buy. But at a premium that massively outweighed the real-world benefit. No longer. Those good times are back. With its new ‘Haswell-E’ Core i7s, the new X99 chipset and revised LGA2011-v3 socket, Intel has finally delivered the goods that I, at least, have been waiting for. Haswell-E is something you’ll actually want to buy. Ride your rodents to the other side to find out why.

Firstly, the new top chip is a real step forward and sports eight cores. But even more critically, the cheapest of the new Haswell-Es offers genuine value as a step up from the best of Intel’s mainstream LGA1150 offerings. Admittedly, even this overall package isn’t as good as Haswell-E could very easily be. And there are barriers to immediately dashing out and unloading on a Haswell-E rig, and reasons why Haswell-E isn’t an immediate slam-dunk for gaming. But there’s absolutely no doubt that, as a family and as a platform, Haswell-E is the most compelling new high-end proposition from Intel since 2008. Yay.

But let’s get some of the speeds and feeds out of the way. At launch, there are three new Haswell-E processors: the eight-core 3.5GHz (Turbo) Core i7-5960X and two six-core models, the 3.7GHz (Turbo) Core i7-5930K and the 3.6GHz (Turbo) Core i7-5820K.

The first thing you’ll notice is that every chip has an unlocked multiplier and is thus fully overclockable. Yay. All of them sport quad-channel memory controllers that require new-fangled DDR4 memory. More on that in a moment.

Next up, the two priciest models get 40 on-die PCI Express 3.0 lanes; the entry-level 5820K gets just 28. Actually, that latter number is just fine. PCI Express is the interface used most notably by graphics cards, and in an ideal world you’d have 16 lanes for each card. More lanes means more bandwidth and better performance.

Increasingly in future, PCI Express will be used for storage, too – your hard drive – so having more than 16 lanes on your CPU will come in handy. Anyway, that lower 28 number is still far more than the 16 you get on-die with the mainstream LGA1150 socket and it’s plenty for single-GPU graphics plus something fancy like the quad-lane M.2 solid-state drives that are beginning to appear. To be honest, it’s probably enough for dual-GPU and that M.2 drive, which is certainly more than you can say for LGA1150.
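
For a rough sense of the lane budgets involved, here’s a back-of-the-envelope Python sketch. The per-lane figure follows from PCIe 3.0’s 8 GT/s signalling rate and 128b/130b encoding; the device allocations are hypothetical examples, not a specific board layout:

```python
# Back-of-the-envelope PCIe 3.0 lane budget. Per-lane bandwidth follows from
# 8 GT/s with 128b/130b encoding: roughly 0.985 GB/s per lane per direction.
GB_PER_LANE = 8 * (128 / 130) / 8

def budget(cpu_lanes, devices):
    """Print per-device bandwidth and whether the allocation fits the on-die lanes."""
    used = sum(lanes for _, lanes in devices)
    for name, lanes in devices:
        print(f"  {name}: x{lanes} = {lanes * GB_PER_LANE:.1f} GB/s")
    print(f"  total {used}/{cpu_lanes} lanes:", "fits" if used <= cpu_lanes else "over budget")

# The 5820K's 28 lanes: dual GPUs at x16/x8 plus a quad-lane M.2 SSD fit exactly.
budget(28, [("GPU 1", 16), ("GPU 2", 8), ("M.2 SSD", 4)])
# Mainstream LGA1150's 16 on-die lanes can't manage the same trick.
budget(16, [("GPU", 16), ("M.2 SSD", 4)])
```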

Eight cores. Count ’em.

Speaking of pricing, we’re talking $999, $583 and $389 respectively, which compares to $339 for the top LGA1150 chip, the Core i7-4790K. In old money, the numbers look like roughly £760, £430 and £290. The 4790K is about £250 here in Blighty.

Yep, £140 less for the 5820K than the 5930K at the cost of 100MHz and those PCI Express lanes. Now do you begin to see what I’m talking about? Hold that thought.
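
If you want a blunt way of seeing it, divide those launch list prices by core count. It’s a crude metric that ignores clocks and cache, but it makes the point:

```python
# Launch list price per core - crude, but it shows where the value sits.
chips = {"i7-5960X": (999, 8), "i7-5930K": (583, 6),
         "i7-5820K": (389, 6), "i7-4790K": (339, 4)}
for name, (usd, cores) in chips.items():
    print(f"{name}: ${usd / cores:.0f} per core")
# The 5820K works out around $65 per core versus $85 for the 4790K.
```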

These Haswell-E chips are 22nm items and they all require the new X99 chipset and its LGA2011-v3 socket. It’s worth noting that this new socket completely breaks backward compatibility with previous LGA2011 iterations. If you want to jump on the Haswell-E bandwagon, you will need a new CPU and you will need a new motherboard.

You also need some pricey new DDR4 memory in quad-channel format, as that’s another new arrival for Haswell-E. Now, DDR4 has plenty to offer. Higher speeds (in the long run, anyway). Lower power consumption. Greater data density. But, in truth, the last thing the quad-channel LGA2011 platform for desktop PCs needed was more CPU memory bandwidth. Especially given that LGA2011 chips don’t have integrated graphics competing with the CPU cores for bandwidth.
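
To see just how much headroom we’re talking about, compare theoretical peak bandwidth. A quick sketch using Haswell-E’s stock DDR4-2133 spec against mainstream dual-channel DDR3-1600 (real-world figures will be lower, but the ratio holds):

```python
# Theoretical peak memory bandwidth: channels x transfer rate x 8 bytes
# (each DDR channel is 64 bits, i.e. 8 bytes, wide).
def peak_gbs(channels, mega_transfers_per_sec):
    return channels * mega_transfers_per_sec * 8 / 1000

print(f"Haswell-E, quad-channel DDR4-2133:  {peak_gbs(4, 2133):.1f} GB/s")
print(f"Mainstream, dual-channel DDR3-1600: {peak_gbs(2, 1600):.1f} GB/s")
```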

Instead, DDR4 is more of a benefit for multi-socket servers, which is really what LGA2011 platforms are about. That said, DDR4 will be a boon for mainstream CPUs with integrated graphics when it arrives with Skylake, probably at the end of next year.

In that context, the benefit for general PC users of DDR4 in Haswell-E is that it gets the DRAM fabrication plants up and running punching out DDR4 chips, giving capacity and yields time to build and thus letting pricing gradually tumble between now and Skylake’s arrival. If you go with Haswell-E any time soon, give yourself a pat on the back. You’re helping make DDR4 cheaper for everyone else.

Loads of PCI-E lanes on the X99 platform, pity there’s no native M.2 SSD support…

The other part of the puzzle is motherboards. A quick online scan puts the starting price for an X99 motherboard at $209 for the Asrock X99 Extreme3, which is nearly $100 more than the cheapest Z97 board for LGA1150 chips. In the UK, I can’t see the Asrock for sale and the fun starts with Gigabyte’s perennial UD3 model at around £160.

So there’s a premium to pay over a Z97-based rig, no doubt. Then there’s the DDR4 memory. Crucial will do you a 4x4GB kit for $208 or £170. This is not nearly as bad as I had feared even a few days ago.

Now, if you really want to split hairs, you could argue Intel could very easily clock these chips 500MHz higher across the board. It could offer even more cores. It has such chips in its Xeon family and these Haswell-E Core i7s are indeed really server chips.

Regarding the clock speeds, Intel will say it wants or needs to hit certain power targets. But for me, the more likely explanation is that it wants to be able to save as many CPU dies as possible. In a six- or eight-core chip, it’s all too easy to find five or seven cores, respectively, that will happily do 4GHz or even 4.5GHz.

But one rogue core often won’t clock up, rendering the whole thing unusable at those speeds. Instead, you keep the clocks modest and instantly improve yields and profits. Kerching.
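
A toy probability model shows why. If each core independently hits a given clock with probability p, a whole die only clocks up with probability p to the power of the core count. The p values below are invented for illustration:

```python
# Toy yield model: the chance that every core in a die hits the target clock.
# The per-core probabilities are made up; the shape of the result is the point.
for p in (0.95, 0.90):
    for cores in (4, 6, 8):
        print(f"p={p:.2f}, {cores} cores: {p ** cores:.0%} of dies fully clock up")
```

At a made-up 90% per core, around two thirds of quad-core dies pass but fewer than half of eight-core dies do. Hence the modest stock clocks.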

The final snag to all this is ye olde multi-threading problem in games. Yes, even now not that many games scale well across many CPU cores. There are a few examples that do, like Total War: Rome II, but for most games, four cores running really fast is arguably all you need. At stock clocks, a quad-core 4790K will be a bit faster than any of these Haswell-E chips in some game benchmarks. And for that reason, some of you will immediately dismiss these new chips as overpriced irrelevancies. And I do not entirely disagree.
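
To put rough numbers on that, here’s a crude Amdahl’s-law sketch. The clocks approximate a Turbo-ing 4790K versus a stock 5820K, and the parallel fractions are invented for illustration, not measured from any real game:

```python
# Crude Amdahl's-law sketch: performance scales with clock speed divided by
# (serial fraction + parallel fraction / cores). Numbers are illustrative.
def relative_perf(cores, ghz, parallel_fraction):
    return ghz / ((1 - parallel_fraction) + parallel_fraction / cores)

for p in (0.3, 0.9):  # fraction of each frame that spreads across cores
    quad = relative_perf(4, 4.4, p)  # 4790K-ish quad at Turbo
    hexa = relative_perf(6, 3.6, p)  # stock 5820K-ish six-core
    print(f"parallel={p:.0%}: quad is {quad / hexa:.2f}x the six-core")
```

In this toy model the quad wins comfortably when only 30% of the work threads out, and the six-core only pulls ahead once 90% of it does. Overclock the 5820K and the picture changes again.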

What have the Romans ever done for us? Thread-heavy games, that’s what

My counterpoint is that a Haswell-E running at 4GHz-plus will deliver all the per-core performance you need but also give you a healthy dollop of future proofing and should be killer for all-round system performance.

Anyway, I’m much encouraged by the way Intel has specced and priced the 5820K model. I really like the idea of running a six-core chip at 4GHz-plus. If you’re spending £250 on a quad-core, £290 for an unlocked six-core chip looks pretty compelling. I was actually expecting the DDR4 pricing thing to be a serious spoiler. But as it turns out, it’s not nearly as bad as I thought.

Overall, then, I’m distinctly upbeat about the overall LGA2011-v3 package. It’s easily as good as I was realistically hoping for, especially the 5820K. DRAM prices tend to fluctuate, so even that could improve further in fairly short order. I doubt anything terribly exciting is going to happen on the mainstream LGA1150 desktop platform in the meantime (including the arrival of 14nm Broadwell) and AMD certainly doesn’t have anything imminent that will have any impact.

So is it Core i7-920 D0 all over again? You know what, I really think it might be.

38 Comments

  1. laggerific says:

    I love the idea of all those PCIe lanes. The big question is, how does this all help me achieve my 4K gaming? I’m assuming the PCIe lanes will be helpful in this regard. But will this fall’s round of GPUs seriously be able to get us there?

    • DrManhatten says:

      I think there is no easy answer as it depends on your setup and the games you’re playing. In general, single-GPU 4K gaming on modern games with this fall’s round of GPUs is a definite NO! The days of large jumps in GPU performance are over, so do not expect more than maybe 20-30% more performance at best.

      • TacticalNuclearPenguin says:

        If they can’t improve that much just by dropping the nanometers, they’ll just make a bigger chip and board.

        That wouldn’t help the prices, though.

      • Sakkura says:

        The days of large jumps in GPU power are definitely NOT over.

        CPU power, maybe… because CPUs run software that can be hard to parallelize, so you can’t always just throw more cores at it, you need to make the cores faster individually (which gets really, really hard).

        GPUs are meant for embarrassingly parallel problems, so you CAN always just throw more “cores” (shaders etc) at it. Which is exactly what they’re doing. GTX 780 Ti with 2880 shaders, GTX 680 with 1536 shaders, GTX 580 with 512 shaders (functionally similar to 1024 shaders due to a “hot clock”), GTX 480 with 480 shaders (960), GTX 285 with 240 shaders… it’s roughly exponential, and it’s staying roughly exponential for the foreseeable future.
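
        A rough sanity check on “roughly exponential”, using the GTX 580 and 780 Ti figures above (the launch years are my addition, from memory):

        ```python
        import math

        # GTX 580 (2010): 1024 effective shaders; GTX 780 Ti (2013): 2880.
        growth = (2880 / 1024) ** (1 / (2013 - 2010))
        print(f"~{growth - 1:.0%}/year, doubling every "
              f"~{math.log(2) / math.log(growth):.1f} years")
        ```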

        • SuicideKing says:

          Exactly. We’d see a much bigger jump this fall had TSMC’s 20nm not been delayed, but the GTX 980 (aka 880) should be pretty sweet.

    • SuicideKing says:

      Assuming memory bandwidth doesn’t prove to be a handicap, yes, two GM204-based GeForces might allow decent 4K gaming. 4GB and 8GB VRAM configs seem to be the standard this time around, too.

  2. TacticalNuclearPenguin says:

    I might be tempted, especially if this socket will be compatible with either Broadwell-E or Skylake-E (whatever happens), as it was for SB-E to IB-E.

    The reason is that, since the middle-priced option doesn’t offer the same number of cores as the 1000-bucks one (which was the case with previous generations, only with less cache), a reasonably priced 8-core with possibly even higher OC potential might happen with the next generation, and at that point I could choose whether or not to upgrade to a cheaper and more clockable 8-core.

    It’s not like my 2600K @4.6 is breaking a sweat as it is now, it’s perfectly fine for a lot of extra years, but I like the idea of chipset-native support for USB3 (a first), quad channel (DDR4 is a plus) and the lack of a stupid integrated GPU, which honestly ruins the “poetry” for me. And yes, PCIe lanes are an actual concern for me, let alone the fact that I’m on PCI-E 2 now.

    All in all it’s a perfectly futureproofed chipset (and platform as a whole, of course) that has more or less everything needed and a lot of native support.

    • SamfisherAnD says:

      Intel has said that 2011v3 will support at least 1 more generation of CPUs. All Intel sockets have had at least 2 generations of CPU support.

  3. steves says:

    What DrManhatten said. You’ll be needing at least 2 GPUs (and not cheap ones either) for anything demanding at 4K.

    If I was that rich, I’d get a 4K monitor for super-crisp desktop/work stuff, and run games at half-resolution, but aiming for 120Hz, preferably over G-Sync. Is that even possible yet?

    Either way, the CPU has nothing to do with any of that. It’ll probably help with things like Dwarf Fortress though;)

  4. mtomto says:

    I’m stuck on a 2600k from 2011, and see no reason to upgrade. I would need a new motherboard and RAM anyway, so might as well wait and buy an entirely new PC. Maybe in 2 or 3 years… things really slowed down the last couple of years.

    • egg-zoo-bear-ant will e 91 says:

      But you keep the PSU probably, and maybe the case, and the mouse and keyboard, screen and speakers. Modularity is still a great thing. Don’t let the alienware laptop owners forget it! Though if augmented reality outside VR takes off it will be pretty hard to slip our towers into backpacks…

  5. The Dark One says:

    I don’t agree with the future-proofing idea. If you want a chip that can handle the ultrathreaded games of tomorrow, buy a new CPU then. The non-E chips offer higher single-threaded performance while that’s still important – right now – with the benefit of not requiring a new motherboard and RAM.

    link to techreport.com
    link to techreport.com

    • Damn Rookie says:

      I’m not saying I agree or disagree with your wider point, but those two charts you linked are focusing on the $1000 i7-5960X, not the $390 i7-5820K that Jeremy (and many other reviewers) are particularly enthused about.

      I don’t think you’re going to find many that would recommend the far more expensive (and lower clocked) 8 core chip if gaming is your goal, but the 6 cores, especially the 5820K? A case can certainly be made.

      • TacticalNuclearPenguin says:

        With the i7-5820K you’re getting a slightly more costly CPU than the latest mainstream i7, except that it actually has 2 more cores and is housed within a far superior and future-proofed platform that has everything you can ask for, alongside quad channel and more PCI-E lanes.

        Sure, you could argue that most of these things don’t really matter that much, but the price is incredibly interesting.

        Please also remember that this CPU can go above 4.5GHz; there’s really no reason to compare single-threaded performance at stock clocks, because this kind of stuff is really aimed at a different audience. You’re also getting more cache, by the way.

        EDIT: Clicked the wrong button, this response was actually aimed at The Dark One.

      • Geebs says:

        (Or £620 if you include the cost of a new mobo and RAM)

        There is no future-proofing in tech, only a bunch of stuff you throw out 5 years later with those neat forward-looking features unused.

        Given the stagnation of the current consoles, I’ll worry about upgrading when the next gen comes out and can finally do 1080p60, with a following wind.

  6. joa says:

    I think they should just calm down at this point with the CPUs and GPUs and so on. Graphics look good enough now. I think they looked good enough four or five years ago.

    • nrvsNRG says:

      riiiiiiiiight…technology should stop because you think the graphics in games look fine.

      • PopeRatzo says:

        The graphics are fine. The games on the other hand, look like crap.

    • Sakkura says:

      We need more power just to run current games at higher framerates, which is necessary for VR headsets like Oculus Rift.

      • CookPassBabtridge says:

        Graphical fidelity is also highly important. When you’re in the Rift and have the ability to walk up to and admire anything in the world from all angles and in full stereo 3D, it’s deeply disappointing when you realise you’re inspecting a polygon. Detail just suddenly stands out so much more, in an almost tactile way. 3D and head tracking are one part of the equation, but in the Rift, the better the graphics, the more overwhelming the sense of presence. Comparing UE4 demos to most of the Unity ones is a heck of a difference and the ‘shock’ of presence quite palpable. You still get presence in Unity, but UE is just breathtakingly realistic.

        Oculus is also apparently about to release an update to the SDK that will bring large performance gains, so hopefully it will be easier to get that all-important 75fps (future – 95) with more modest equipment.

      • joa says:

        Who has the money for a VR headset, for Pete’s sake.

        Half the games I play are old and they often surpass current games in some areas. So graphics aren’t everything.

        • CookPassBabtridge says:

          Have you looked at the cost of one? As hardware goes they are inexpensive and set to come down. You don’t have to buy one, and loads of games will run on your computer at a level you are happy with. Meanwhile, tech will advance and people who want that experience will buy it, which they will also be happy with. Freedom of choice eh?

          You have a slightly selfish point of view given that you stand to lose nothing. Tech goes out of date. I was annoyed when I had to chuck out my VHS cassettes too. I got over it.

          • pepperfez says:

            I can definitely understand grumpiness at the push for VR. It’s not a new way of storing media like VHS to DVD, it’s a new way of consuming media like arcade to console or PC/console to mobile. Both of those moves had very serious downsides for lovers of the previous experiences with, at least in the latter case, limited upside. No one has to buy a VR headset, sure, but we have to live in the world with them. Nothing wrong with not being excited about that.

          • CookPassBabtridge says:

            I can see that POV for sure, it’s going to have effects on the status quo, and nobody likes losing things they are used to. Though given that most games feature settings sliders, the likelihood of having your machine disappear off the bottom of the scale due to VR is tiny, unless you are running something ancient. Any game with VR will engage a VR mode, which will turn on the new shaders and stereoscopy. It won’t impact a game in non-VR mode, and that’s before any upcoming performance improvements to the SDK. However, to your point about it being just a new form of consuming media… have you tried VR?

            As someone who owns a DK2, and has spoken with a great many other owners, I can say that once you have tried it you’ll never think of it as being a complicated replacement for a PC & monitor at all. The experience inside the thing, when done well, goes way beyond gaming as we are used to it. Take Half-Life 2. I have played it more than 30 times from start to finish. I know it inside out, where all the triggers are, how to beat each section, and, as is to be expected – at least on a normal monitor – I am over it.

            But once the Rift is in place, it’s transformed. There’s an emotional weight to the characters and events that flat 2D cannot provide. I think the word I am groping for is “gravitas” – meeting Barney for the first time, you can feel his physical presence as a person. He’s not a cutout anymore. Simple Combine grunts are now genuinely intimidating, their dystopian actions far more disturbing. When you first round that corner before Barney gives you the crowbar, the sense of scale from Breen Towers is genuinely shocking. It gives you that “I am tiny, looking way way up” feeling that standing in front of some massive object gives you in reality. On a monitor, it’s a picture. And when the manhacks start flooding into the subways later on, on a monitor, it’s just “oh FFS, manhacks pew pew pew”. In the Rift, the sense of aloneness, fear and exposure of being trapped in a dark, dirty, cramped location, with spinning blades being launched at your head, suddenly becomes enormous. Even if you weren’t a fan of HL2, the effect it would have on your favourite games would still be something to behold.

            Developers are having to rethink the way they are creating games for it, because of the impact it can uniquely bring. Some devs DO treat it like just another monitor, and those experiences are flatter. But there are ones that are realising there’s gaming, and then there’s VR gaming – that it’s a different paradigm. If you get a chance, try it out, perhaps starting with something like the awe-inspiring Titans Of Space, or the similarly spacey “Spacewalk” (fancy an EVA round the International Space Station?). As noted by Alec on this site, VR is very much a ‘felt’ experience that defies description.

          • joa says:

            Well I haven’t used it, if you say it’s that good I’m interested to try it. But I also remember how people were going crazy about 3D only a few years ago – and all that will give you is a headache.

            But if an Oculus Rift headset is only $350, that’s pretty accessible. I thought those things would be closer to $3500.

        • Sakkura says:

          VR headsets are going to be affordable. The Oculus Rift Dev Kit 2 costs $350, which is certainly not outlandish compared to PC monitors. The Asus ROG Swift comes in at $800, and that’s not even 4K.

    • FriendlyFire says:

      “640k ought to be enough for anybody!”

      You sound like that. Also, if you think the CPU and the GPU are only used for graphics, you should read up on how games work and on the latest developments in computer architecture.

    • SuicideKing says:

      What about the frame rates? 60 fps @ 1080p under all circumstances still isn’t widely affordable.

    • Carra says:

      It can always look better. >1080p resolutions are nice.

      And they can use the extra horsepower for other things too. Better AI comes to mind. Physics maybe. If the CPU power is there, I’m sure smart people will find a use for it.

  7. cardboardartisan says:

    I’m probably not really in the target audience for this column, but I tend to like the work Intel’s put into integrated graphics – that might be a selling point for me on the next generation. I recently built a PC specifically for gaming, and I liked that I didn’t have to bother investing in a graphics card. In the couple years I’ve been following RPS, I don’t think I’ve seen a single game that was recommended by the writers here, looked interesting to me and wouldn’t run just fine on my Intel HD 4600. Granted, I’m not really into AAA titles, and I’m presently running at 1366 x 768 resolution, but I suspect most of the games covered here at RPS (interesting to me or not) would do just fine at higher resolutions on integrated graphics.

    Supposedly, the integrated graphics for Broadwell chips will be able to do 1080p on games with much nicer graphics than I tend to be interested in without much trouble – if that’s the case, sign me up and I’ll finally buy a nicer monitor too. That and the lower power consumption make the Broadwell chips pretty attractive for me.

    • steves says:

      “most of the games covered here at RPS (interesting to me or not) would do just fine at higher resolutions on integrated graphics”

      Whilst I play a lot of old games*, and plenty of undemanding indie titles, both Elite Dangerous and Witcher III are games covered here, and very interesting to me. If by “higher resolutions” you mean > 1366×768, then integrated graphics won’t come close.

      It would be nice if everything could be played at an acceptable level on just CPU+motherboard graphics though, if only not to exclude people from PC gaming, but I don’t think we’re far off that.

      Intel has basically zero competition in the consumer desktop space, so they are (sensibly) pushing low power & better onboard graphics for mobile stuff these days. Which means their agenda aligns with your gaming tastes, and the short-term future of technology, so be glad;)

      Me? I want zero-latency 4K VR ‘goggles’ the size & weight of sunglasses @120FPS. For each eye. That will maybe be ‘enough’. For a while…

      *I think Baldur’s Gate 2 just about edges out System Shock 2 for replays… let’s not mention Deus Ex

    • SuicideKing says:

      AMD’s IGP stuff is way better than Intel’s; however, their problems currently include weak and unexciting CPUs and exceedingly old chipsets.

  8. kael13 says:

    Been waiting for Haswell-E for a while, decided on a 5820k as I’ll only ever run two GPUs at most. Regarding the platform, it’s just DDR4 that’s a real sting in the wallet. Luckily, you can find GSkill stuff (who are a really good manufacturer) much cheaper than other brands. Weirdly enough, the thing I’m most excited about is the X99 motherboard. Have my Asus Rampage V Extreme out for delivery today, cannot wait!

  9. TacticalNuclearPenguin says:

    Well, the biggest sting in your shopping cart seems to be the mainboard!

  10. Shooop says:

    If you already have a quad-core CPU, this is a completely unnecessary upgrade unless you’re doing something actually CPU-intensive like video encoding or Photoshop. Haswell-E doesn’t work in previous LGA2011 sockets and requires the v3 motherboards, which in turn require DDR4 RAM, which makes an upgrade to it staggeringly expensive.

  11. Megakoresh says:

    Sounds good for when I need an upgrade. But to be honest I am still running on Sabertooth, i7-2600K, GTX770 and 8GB RAM @1600MHz and I am able to run most games at high to highest on my 2560×1080 monitor. So I dunno…

    As far as graphics go, I think we are slowing down and reaching the limit of what we can do. There needs to be something major that can open a new generation in graphics and physics, and I have a feeling software won’t be that. We’d need something new in hardware. Maybe optical computers or something.

    Remember the times when every year games looked 4x more advanced? Yeah, those times are long past. On one hand, we might not be getting new awesome graphical fidelity. On the other we don’t have to upgrade as often. I feel like my rig still has quite a bit of overhead before I have to upgrade (i.e. have to start switching graphics to medium).

  12. NeuroBug says:

    I’m still rocking one of those i7-920s… Though this may push me for an upgrade once the RAM prices level out and timings improve a bit. If only some new GPUs would warrant an upgrade as well.

  13. Apocalypse says:

    I think it is worth mentioning as well that PCIe 3.0 is twice as fast as PCIe 2.0, meaning 8 lanes of 3.0 translate to 16 lanes of PCIe 2.0 … and x16 2.0 is perfectly fine and mostly overkill for a GPU already. ;-)
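
    A quick check of the per-lane numbers behind that (PCIe 2.0 runs 5 GT/s with 8b/10b encoding, PCIe 3.0 runs 8 GT/s with 128b/130b):

    ```python
    gen2 = 5 * (8 / 10) / 8      # PCIe 2.0: 0.5 GB/s per lane
    gen3 = 8 * (128 / 130) / 8   # PCIe 3.0: ~0.985 GB/s per lane
    print(f"x8 PCIe 3.0 = {8 * gen3:.1f} GB/s vs x16 PCIe 2.0 = {16 * gen2:.1f} GB/s")
    ```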