Hard Choices: Intel’s ‘Orrible New Haswell Chips

By Jeremy Laird on June 3rd, 2013 at 9:00 pm.


Move along. There’s absolutely nothing to see.

Still here? Fine. Intel’s new Haswell CPUs are a non-event for the desktop PC. In fact, with Haswell Intel’s indifference to the desktop might just have been upgraded to spite. If you really must have an explanation, here it is.

The NDAs have lifted, the reviews are out and I’ve had my own hands-on time with Haswell. The familiar refrain when introducing a new Intel CPU architecture is to explain that the main priority is mobile but nevertheless the net result is still pretty peachy for the desktop PC. Not this time. This time the main priority is mobile. The end.

For starters, Intel’s latest PC processor family – and let’s be clear about this, it’s being pitched as a major architectural upgrade, no less than the 4th generation of Core processors – boils down to two key advances: one is almost entirely irrelevant to the desktop, and the other doesn’t even appear on the desktop – at least not in the first batch of socketed desktop chips.

Oh no, not laptops again
The first is mobility. Haswell might just be revolutionary in that regard. There are some as yet unreleased system-on-a-chip versions of Haswell which promise a 20x reduction in power consumption, in some scenarios at least.


A 22nm Haswell wafer, yesterday

That’s bonkers. Indeed, Intel reckons Haswell is the biggest step forward in processor efficiency in the history of its x86 CPUs. Yup, even bigger than the transition from Pentium 4 to Core.

Thus, if Haswell is anything like as good as Intel claims, it’s going to shake things up. Think tablets that combine iPad proportions with proper PC levels of processing prowess. Microsoft’s Surface Pro, as nice as it is in some ways, doesn’t quite deliver on that. Haswell, according to the hype, will. I’ve long wanted a tablet convertible that’s no bigger than an iPad but has the power of a proper PC. So I’m genuinely excited by the prospect. But it’s got naff-all to do with gaming PCs.

It’s got graphics, apparently
Then there’s graphics. To cut a long story short, Intel has more than doubled up on graphics power in Haswell. The maximum graphics execution unit count leaps from 16 to 40 units, and the clocks are up slightly, too. But at launch, that 40-unit graphics core – usually but not always sold under the new Iris brand name – isn’t available with socketed desktop chips. That includes the intriguing Iris Pro version with 128MB of eDRAM.

Just one desktop chip gets Iris and it’s a soldered-down BGA chip, not a socketed LGA model (more on sockets in a moment). Moreover, the execution units are pretty much the same as before in terms of architecture, so the 20-unit Intel HD Graphics 4600 effort in the desktop chips launched in the last few days is of zero interest.


4th generation, same old experience

I’ve also had trouble getting the new graphics core to run certain games, so the drivers are still half-baked. Anyway, mobility and graphics are essentially irrelevant to the first desktop Haswell chips.

The new Core, er, core
Which leaves the CPU cores and the platform. Both of which basically suck.

OK, that’s unfair. The CPU cores don’t suck. They’re just not clocked any higher. There are no more of them than before. And on a core-for-core and clock-for-clock basis, they’re only a tiny bit faster.


Spot the difference…

There are new models to mirror most of the previous Ivy Bridgers. So the new Intel Core i7-4770K replaces the Core i7-3770K, and the i5-4670K replaces our old favourite, the i5-3570K. Just as before, the K means unlocked and the i5 lacks Hyper-Threading; the clockspeeds don’t change. You don’t even get any extra cache.

I’ve done my benchmarking bit and the results are of virtually no interest. Haswell is a little bit faster. But it’s not even close to being enough that you can actually feel the difference.

Overclocking, sockets ‘n stuff
Then there’s overclocking. The chips I’ve tested so far (three of them) have been pretty much identical to their Ivy Bridge progenitors. But rumour has it that actual retail chips are much more variable than before. The engineering samples I’ve been playing with may not be entirely representative.

As for the new baseclock strap overclocking option, I could see how it would be worthwhile if it were widely available. But it’s reserved for K series chips, which already have unlocked multipliers. So I’m really struggling to care.
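
If you’re wondering what the strap actually does, the arithmetic is simple enough (a worked example with illustrative numbers; the Haswell straps are, as I understand it, 1.25x and 1.67x):

core clock = baseclock x strap x multiplier
e.g. 100MHz x 1.25 x 35 = 4,375MHz, or a shade under 4.4GHz

The appeal is that the strap raises the baseclock the CPU cores see without dragging the PCIe and DMI clocks out of spec along with it. That would matter on locked chips, where the multiplier is fixed. On a K chip, you’d just turn the multiplier up instead.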

Next up, sockets. Yup, there’s yet another new one, namely LGA1150. That breaks backwards compatibility. Again. In recent years, we’ve had LGA1156, LGA1155 and now LGA1150. Honestly, I haven’t asked Intel why we have to have a new socket. I can’t bear the answers they’ll roll out; I’ve heard it all before. I’m unapologetic in that regard. I know they’ve moved more of the video interface kit onto the CPU proper. But I simply don’t believe it has to be this way.


Haswell gets yet another new CPU socket. I thought you’d be pleased…

Put simply, why can’t Intel show just a little solidarity for its faithful desktop customers? More cores or clocks. Baseclock strap overclocking for all. Just something!

It’s just not interesting
Thus the only thing I can think of that might make ’orrible Hassie a little bit interesting is that a BGA quad-core model with Iris Pro exists, and in something like the teensy Intel NUC it might make for an interesting Steam Box sort of system. But that’s assuming games will actually fire up correctly on the thing. Which is to assume a lot. And the chip alone will cost a fortune. Forget I even mentioned it.

And that’s all I’ve really got to say. Haswell isn’t interesting on the desktop because Intel isn’t interested in desktops and it’s willing to exploit its current advantage over AMD at the cost of its customers. You and me, in other words. Saying that kind of thing won’t endear me to Intel. And these new chips remain the best gaming CPUs by far, I certainly can’t deny that. The Core i5-4670K enters the world as the new RPS gaming CPU of choice.

But the fact is, Intel could very, very easily be doing so much more on the desktop. There’s no point in pretending that isn’t true.

The only hope is that AMD’s upcoming Steamroller CPUs (due early next year, allegedly) are good enough to wake Intel from its slumber.

__________________


171 Comments »

  1. Paul says:

    Disappointing. I would really like to have an 8-core Intel finally. With upcoming 8-core consoles (albeit with much slower cores) it might be useful.

    • povu says:

      Judging from all the extra stuff the consoles will be doing (livestreaming, social stuff) it wouldn’t surprise me if some of the cores are dedicated specifically to that and games will continue to be optimized for 4 cores.

      • Strabo says:

        Also, one i7 or even i5 core has more power than the 8 Jaguar cores combined. There is not really much need for 8 cores if playing games – even next gen – usually means that 2-3 of your cores are idling around, because game developers have to program for consoles and 2 cores have enough power to run 3 of those games at the same time. Still, I too love power for power’s sake, and an 8-core i7 would have a nice ring to it.

        • demagogue says:

          Patently untrue. If you look up any performance benchmark, while an AMD Bulldozer/Jaguar core is slower clock for clock than an Intel processor, the cores are by no means *slow*. Additionally, claiming a SINGLE Intel core can beat 8 AMD cores is simply false in every way.

      • 00000 says:

        I think most of that (except maybe streaming) can be done by the secondary ARM processor.

        As for optimizing for 4 cores. Consoles are still the primary source of revenue for most game developers. So they will probably opt to parallelize for 8 cores in such a way that code won’t hang in a quad-core environment – but only for the sake of making PC ports redundant.
        If my suspicions are right, next gen consoles will be the beginning of the end for the dual-core gaming PC. The time to replace our E8400s is near.

        • honky mcgee says:

          I’m still using an e8400 (4.0GHz) + Radeon 4890 combo. That little 45nm chip has served me so well these past 5+ years I may just frame it and put it on my wall when I finally decide to retire it. Either that or find a nice farm in Kentucky with lots of lady e8400s and let it live out its days as a stud.

      • Oktober Storm says:

        Three cores are reserved for the system and seamless transitions between games and texting your opponent about his mother. The other five are for gaming, and actually run on a different kernel.

    • Artist says:

      Yes, 8 cores totally make sense so you can play your games that maybe support 2-4 cores max. But hey, good to have, right? Right? *shakes head*

      • Nogo says:

        Computers do a bit more than play games, buddy

        • Benny says:

          Indeed they do, and with the new generation of consoles bringing in 4-5 cores dedicated to the games, having 6-8 cores is going to be very useful, not to mention the plethora of other things your PC can do (as someone who does 3D animation, trust me, more cores are awesome)

          • PopeRatzo says:

            If you’re doing 3D animation, you want a workstation class machine, not a gaming PC. And you can get an 8-core workstation PC right now.

            Intel is dicking around because they can get away with it. It happened before and AMD became very popular with gamers. If the same happens, you’ll see Intel step up.

            There are enough PC gamers around to make it a worthwhile market for someone. Their job is to give us what we want. Our job is to make our purchasing decisions strategically.

          • PoulWrist says:

            PopeRatzo – or you might not have the buying power to purchase high-grade Intel workstations… or you might not be willing to fork out for the tiny extra performance it’ll give you compared to a consumer-level device. If you’re going to go the full dual-CPU route etc., it starts to become prohibitively expensive for small companies or one-man freelance operations. Just an 8-core CPU from ol’ Intel costs the same as a full system with a decent Quadro or FirePro card in it…

      • 00000 says:

        But think of the possibilities. With an octo-core Haswell you could play 2-4 of those games at the same time!!!

    • Kinth says:

      I wouldn’t get too hung up on core count.

      AMD’s octo-cores are often outperformed by their own quad cores from 2 years ago.

      The new consoles are very GPU-reliant, hence the low-power octo-core. Most games have become increasingly GPU-bound anyway. The truth is that currently few games are programmed with even a quad core in mind, and even those will struggle to max out a quad-core CPU from a few years ago. They aren’t giving us an octo-core because there is simply no need for one in the consumer PC space.

      Just because the consoles are using Octo cores doesn’t mean we will see a ton of PC games that need an octo core to run well or can even use an octo core. In fact I doubt we will see any this generation. When games aren’t even using all of 4 cores (on an average PC quad core from 2 years ago) then it’s kind of pointless adding more.

      • MasterDex says:

        It seems a tad foolish to assume that the status quo will remain with the new console generation, especially since the new console generation represents a pretty big leap in relative terms.

        With every new console generation, we’ve seen the industry and our games change, if even just a little. With the current generation, focusing on parallelism and taking advantage of multiple cores wasn’t very feasible, at least from an economic standpoint. Those few that did it to any degree were firmly in the PC market. With the next generation however, the floor has been raised so that designing games with multi-core in mind, not to mention making use of the increase in video memory, makes more sense.

        I could be wrong but the way I see it, multi-core gaming is going to be the standard within the next 2 years or so once developers find their feet with the new consoles.

        • Wedge says:

          Unfortunately this doesn’t do any of that either. Not for desktops anyways.

        • Strabo says:

          There is a big jump in computing power compared to the old consoles, but not compared to PCs. Even an i3 (a dual-core chip) outperforms 8 Jaguar cores easily, and we know that games won’t even get all 8. The graphics part is an HD7850 (or an HD7750 in the case of the XBox One), something now mid-range for PCs, even more so in six months’ time. Even allowing for the poor quality of some ports (which might be less of an issue given the similar architectures) and the missing optimization for PC games, any run-of-the-mill gaming PC should have no trouble coping with what the new consoles bring to the table.

        • robotslave says:

          People who say things like this have no curiosity about (let alone experience in) developing (and testing and debugging and refactoring) concurrent software.

          If only this were just a matter of waiting for all those wet-behind-the-ears programmers to “find their feet!”

          Some very nice concurrency frameworks have been around for decades now. Hell, there are already languages that have very nice concurrency features baked in. But the fact of the matter is that creating software that does more than one thing at a time is fundamentally more complex, and expending the resources to do it well (enough, i.e. without obvious showstopper bugs) multiplies your development time/staff/budget.
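
          To make “fundamentally more complex” concrete, here’s the oldest trap in the book, in a minimal sketch of my own (not anyone’s shipping code): two threads bumping one counter. Delete the lock and it still compiles, still runs, and quietly loses increments:

          #include <iostream>
          #include <mutex>
          #include <thread>

          int counter = 0;
          std::mutex m;

          void bump() {
              for (int i = 0; i < 100000; ++i) {
                  std::lock_guard<std::mutex> lock(m); // remove this line: data race
                  ++counter;
              }
          }

          int main() {
              std::thread t1(bump), t2(bump);
              t1.join();
              t2.join();
              std::cout << counter << "\n"; // 200000, but only because of the lock
          }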

          • MasterDex says:

            @Strabo:
            There is a big jump in computing power compared to old consoles, but not compared to PCs.

            Which is why I said it was a relatively big jump. I don’t think any next gen console game is going to push a run of the mill PC any time soon. However, I do believe that some of the more ambitious developers will release some games that do take advantage of the next gen architecture and that may present a problem for the run of the mill PC. Then again, I could be completely off base but all any of us can do at this point is speculate.

            @robotslave:

            People who say things like this have no curiosity about (let alone experience in) developing (and testing and debugging and refactoring) concurrent software.

            Experience? No. Curiosity? Yes. I’m studying software engineering in college so I’m not sure how that line holds up. From what I’ve read from other developers (I’m assuming you are one given the horse you ride in on), you’re on just one side of the argument whereas I’m on the other.

            If only this were just a matter of waiting for all those wet-behind-the-ears programmers to “find their feet!”

            You misunderstand me. I’m not talking about wet-behind-the-ears programmers finding their feet. I’m talking about developers getting used to the hardware they’ll be working on, that takes time – you know, like every preceding generation where the quality/complexity of games, generally-speaking, rises as time passes. The developers start playing with the hardware, then they get used to it, then they’re experienced at it and by the end of a generation know more or less all the tricks to get what they want out of the system.

            …But the fact of the matter is that creating software that does more than one thing at a time is fundamentally more complex, and expending the resources to do it well (enough, i.e. without obvious showstopper bugs) multiplies your development time/staff/budget.

            So your basis for saying that multi-core gaming won’t be a thing is that costs and complexity will rise? That seems like a rather weak reason, especially considering that such has been a given with every new generation.

            I find it hard to believe that this generation is going to be any different.

    • fish99 says:

      Each core of those console 8-cores is likely to be around 4x slower than a core in a 3570 though, so overall you still have double the horsepower (four desktop cores at 4x the speed makes 16 console-core equivalents, against the consoles’ 8).

      You also have to consider how bad the whole computing industry is at taking advantage of multiple cores.

  2. Continuity says:

    And thus begins the end of the desktop

    • Ross Angus says:

      Take that back. Take that back!

      • Grey Poupon says:

        As long as there are people who’d rather get extra performance instead of extra mobility, desktops aren’t going anywhere. The form factor might change to an HTPC-like case and then they won’t be called “desktops” anymore, but still.

        Does anyone actually still have their PC on their desk? Desktop ’puters died off years ago already. All that’s left is the name.

        • Harlander says:

          Where I have a big-boxed computer, it’s invariably under my desk. “Deskunder PC” sounds like something from Germany to me though

        • robotslave says:

          I for one still use the “desktop” form factor for my desktop computer. It stacks so much better with all the other Boxes Of Thing I’ve got, you know? Damn hard to shop for these days, though, that’s for sure.

        • mickygor says:

          Yeah. My desk is big, and it’s easier to access on top than it is underneath. Better airflow, too

        • roryok says:

          wait. If I use my laptop on a table does it make it a tabletop computer?

        • Lord Custard Smingleigh says:

          I suppose my computer could best be described as an “undercat” computer.

    • JFS says:

      Will it take PC gaming with it? Wait… PC gaming died yeeaarrs ago!

    • Bedeage says:

      You are so very stupid.

      • frightlever says:

        The desktop may not die but the mainstream, configurable desktop PC we’ve become familiar with is going to become a rarity. Those Intel NUCs or similar are going to replace them on most office desks.

        You’ll still be able to buy specialist motherboards and socketable CPUs, discrete GPUs and the rest but it’s going to become even more niche than it already is because everything is inexorably moving towards closed box solutions.

        Five years from now the TV you buy will probably have Xbox One technology baked in, taking up about the same space in the base as a pack of cigarettes.

        To be honest, I think there are exciting times ahead.

        And I’m not sure why anyone interpreted the desktop dying as PC gaming dying. PC gaming is only going to get bigger and bigger, while the number of people doing it on a traditional desktop PC will be getting smaller and smaller. I’m sure plenty of people are already gaming full time on their laptops.

        • robotslave says:

          You, sir, clearly have an enviable optimism in the ability of The Market to create jobs, in this day and age.

    • DiTH says:

      According to the whole internet, at least the parts mobile/console users can reach, we have been dying every day for the last 10 years. And here we are again with a 2-year-old system that can harness the same power as the next gen consoles. :/
      Let’s drink to one more honourable death.

      • BTAxis says:

        You forget the extra processing power we need to get shoddy console ports to run at an acceptable frame rate.

    • soul4sale says:

      I’ve been waiting to read this review for some perspective on buying a new gaming laptop. While humorously elitist, this piece was very disappointing. I’m not hauling a tower box to a TF2 LAN party, but I’d like something that can run Borderlands 2 on current console-level settings. I was hoping RPS would at least test these chips on the form factor for which they are designed.

      • CommanderJ says:

        These parts are NOT designed for laptops. What has been released so far is desktop-only parts. The mobile parts will be announced at a later date. Hence he couldn’t possibly give praise to the mobile/lappy benefits of these parts, because they’re not -for- lappys.

      • Eonfge says:

        When shopping for a laptop, make sure you have the following:
        * Dedicated Graphics card
        * High speed HDD or SSD
        * Good cooling

        With such things there is no reason to consider a laptop worse than a dedicated tower. Of course, after that there is a lot of luxury to buy: nice monitor, faster processor, solid manufacturing and such. As far as hardware goes, laptops are some of the most complex things to buy, as you don’t just buy the hardware you like, you also buy the assembly, the components around it (shitty Broadcom wifi for example) and stuff like the battery.

        Back OT, I got myself an Ivy Bridge half a year ago because the rumours gave no reason to wait for Haswell. No intention to upgrade in the next two years, so I’ll just skip this generation.

        • Dataflashsabot says:

          In a vacuum, no. But the inability to upgrade it would be the dealbreaker for me.

        • sabasNL says:

          I myself think the Ultrabooks (by Intel, the workhorses among laptops, promoted by Gabe Newell among others) or a gaming laptop (generally overpriced, but it does the trick) are what you’re looking for. The normal kind of laptops just aren’t enough for the typical gamer.

          • Lycandar says:

            I’m currently looking for a gaming laptop, any ideas on what I should be looking for or what’s good to get?

            I’m currently looking for the best I can get for around £800 ($1200)

          • Elementlmage says:

            I recently bought a Dell Inspiron 7720 for about $750 American and it works great for me. Getting good solid frame rates on Tomb Raider and Arma 3; about 30-50 fps.

            It’s got an i7-3630QM with a base clock of 2.4 but will run up to 3.2, an nVidia 650 GT w/ 2GB GDDR5 (runs way better than a mid-range mobile chip has any right to), a 17″ screen, 8 gigs of DDR3, and a 1TB HDD. It’s only 5400rpm, but with such high density it runs about the same as a 750GB @ 7200.

            Battery life sucks, 2hrs on very conservative settings, surfing only. But let’s be honest, it’s a DTR.

      • Jeremy Laird says:

        Haven’t had my hands on the new mobile chips yet. On paper, the gaming-relevant mobile models are a reasonable, though not exactly dramatic, step forward. The CPU cores haven’t changed much and you still need a discrete GPU.

      • MasterDex says:

        Hey look! That guy’s lamenting the lack of focus on the desktop and general lack of technological advancement with intel’s next generation chips, he must be an elitist!

        -_- The word elitist should be removed from human vernacular. It’s rarely used correctly these days.

        • Arkh says:

          Yes, MasterDex. Also, the words “awesome”, “epic” and “entitled”.

          • Lord Custard Smingleigh says:

            Can I add “turgid” to the list? I dislike that word immensely.

          • Strabo says:

            Also “vapid” as apparently nobody is able to use it correctly anyway.

          • MasterDex says:

            @Arkh: ‘Awesome’ and ‘Epic’? But then it’ll take ages to explain why Book X is better than Book Y (hint: the answer is Book X is an awesome epic fantasy while Book Y is a mediocre paranormal romance.) :P

            @Lord Custard Smingleigh:

            Your dislike of that word seems somewhat overblown.

            @Strabo:

            Just for you: I find the last generation of consoles became vapid shells halfway through the generation, churning out the same old things we’d been playing since the start of the generation.

      • Wedge says:

        All things aside unless you want something in like, a Wii form factor, it’s not at all difficult to make a cute little boxy PC with a dedicated midrange GPU that isn’t hard to lug around. And if you wanted non-dedicated GPU, AMD’s current on-chip GPU solution still blows this away.

    • PopeRatzo says:

      And thus begins the end of the desktop

      Again?

      • frightlever says:

        What again? Look at the numbers.

        http://www.independent.co.uk/news/business/news/pc-sales-hit-by-biggest-decline-in-20-years-as-tablet-takes-over-8569069.html

          This is an ACTUAL thing that’s happening. Currently home PC sales are split about 50/50 between laptops and desktops but sales of both are turgid, really, really turgid. Tablet sales are going through the roof. This isn’t doom-mongering, it is measurably happening.

        Windows 8 took a lot of the blame but the truth is most people don’t want a giant box PC in their house and a tablet is perfectly adequate for browsing and Facebook. It’ll hit consoles as well, but that doesn’t mean that video-gaming of all stripes is going away, just that the market is expanding and splintering into different areas.

        • Strabo says:

            You don’t need a new desktop if your three-year-old one still does everything perfectly fine – especially true for companies, the biggest buyers of desktops. It’s not the number of desktops that’s decreasing, just the number of new ones sold.

          • Dinger says:

            It’s the ugly truth. The industry has been built on a replacement cycle that’s simply no longer viable. I’m something of a junk collector, and in my house there are 2 desktops, 2 laptops, 2 tablets (well, one is a Nokia N800) and a smartphone. I find myself using the desktop I built in 2006 (Intel Core 2 Duo E6600, Windows XP, HDD and video card replacements) more than the monster from last year (with a 240GB SSD + 3TB HDD, Win 7, and all that crap).
            Why? My stuff’s still on the old desktop. Sure, if I need to do some serious gaming, watch TV or image editing, I’m on the new thing, but the old one’s just as effective (and the furniture is better suited to it). In 2006, I had a laptop from 2004, and it was struggling.

            This is one reason why Microsoft has been having trouble: their money makers are Office and Windows, and both relied on improvements to the technology to keep people reinvesting. So they switch to a subscription model for office and try forcing Windows onto places they’ve never succeeded in getting Windows before.

            Now the chip manufacturers have decided PCs are dead, and are going after the higher-turnover areas. Many users switched to laptops a long time ago; from a manufacturer’s perspective, laptops have huge advantages over desktops: they break and they get stolen.

            Gaming laptop? The excess heat cooks the battery and the thing dies in a couple years.

        • MasterDex says:

          Keep in mind that those figures are concerned with prebuilt systems, which for sure have been on the decline. However, if you nose around, you’ll also find that PC software and PC game sales are up, and that individual component sales, such as motherboards and GPUs, are up too.

          This would point to a paradigm shift rather than the end of an era. The general user has no use for a bulky desktop so they buy tablets or laptops or hybrids because they do everything the general user could want. The PC gamer has no use for a bulky desktop that’s overpriced (simply for being a prebuilt “gaming” PC from a branded seller) so they build their own or get someone to built one for them. The business world just doesn’t really care. They’ve still got their workstations and tablets are fine enough for any computing away from the office.

    • Screamer says:

      Get out STALKER!

    • fish99 says:

      Hardly. No one wants to play real games on a touchscreen, and laptops still have the same issues they’ve always had for gaming. So nothing has changed, and the PC market is thriving.

  3. povu says:

    I’m still on a 5-year-old dual-core CPU (E8500). The difference between Ivy and Haswell may be small, but Haswell will be huge for me.

    I had a look at some benchmarks that compare the E8600 with the new stuff; given there’s no GPU bottleneck I would be looking at my FPS more than doubling in Hitman Absolution, from 28 to 64 or something like that.

    • nakke says:

      That’s only given no GPU bottleneck. Do you have SLI Titans or something?

      • povu says:

        No, but Hitman Absolution should run a whole lot better than 28 FPS on an HD 7870 2GB. :P

        I think the benchmark was done in 1080p medium settings, to avoid GPU bottlenecking getting in the way.

    • Moraven says:

      The wife got a new Ivy machine while the E8600 got turned into a Steam Box with a 1-year-old AMD 68xx card in it. I won’t have it all on Ultra but games like Space Marine and Batman seem to run fine with settings mostly on high.

      My bottleneck for when I used my E8600 was my inability to really multitask: run more than one game, livestream on the same PC, browsers open, music playing etc. And more CPU-intensive games get a nice boost also. Enjoy it.

      • roryok says:

        My bottleneck for when I used my E8600 was my inability to really multitask, run more than one game

        Run more than one game? Why would you need to do that? You can only play one at a time!

    • 2PartReturn says:

      Aye, I’m still on an E6750, so things like on-chip AES, USB 3, SATA 3 are all quite large improvements. Thing is, it’s stably overclocked to 3.5GHz and despite being 6 years old and mid-range, it still handles everything I need to throw at it, making the cost of all that new hardware hard to justify. I used to need to upgrade regularly, but there really hasn’t been *that* much in the way of worthwhile desktop advancements in the past few years.

      • riverman says:

        GPU upgrade history:
        [whatever was in my 386]
        1MB TRIDENT
        4MB ATI ____?
        32MB ATI RAGE
        32MB RIVA TNT2
        32MB nVidia MX400
        64MB nVidia GT4200
        128MB ATI RADEON 9700 PRO
        512MB nVidia 9600GT
        1GB nVidia GTX260
        2GB ATI 7850

        that out of the way, here is my list of upgrades that made me WOAAAAH:

        32mb rage> that thing rocked my world; DITHERED TEXTURES UP IN HERE!?
        radeon 9700pro > that thing played DOOM3 with a playable framerate, unlike the 4200 that I bought it for. HL2? suck it, consoles!
        GTX260 > damn, this looks just like the consoles but crispier!

        Anyhow, I think the point I’m trying to make is that the upgrade to the 7850 was a gigantic “meh”, and I’m sure I would feel the same way if I had a Titan in my system. A GTX260 would still run games beyond my satisfaction level. I guess I just haven’t seen a GPU make me go “wow” in a few years :/

    • HisDivineOrder says:

      And yet the difference between what you have and Sandy Bridge or Ivy Bridge would also be huge. This article is basically saying to save your money, buy someone’s used system that’s going to Haswell, and enjoy 90% of the performance for far, far less money. All while enjoying the superior …everything of Intel vs AMD.

    • abremms says:

      Wait until Haswell comes out and pick up an Ivy Bridge on the cheap. S’what I did with my current Sandy Bridge i5. When Ivy came out they were practically giving away the previous gen, even though the actual performance difference wasn’t that great. And since Haswell still doesn’t look to be a revolutionary step forward, I’m sitting pretty, looking at a 2nd year of excellent performance that cost me less than $100.

  4. kalirion says:

    Hmm, if the new desktop chips are not worth it as an upgrade from the current ones, who cares if they require a new motherboard?

    • Moraven says:

      Some people may be on Sandy Bridge (like myself) or i3s or i5s and are looking to upgrade. But with the new socket also requiring a new motherboard, Intel has basically shut down that market.

      • darkChozo says:

        I’m in that spot; I’ve got a 2500K that I’m vaguely thinking of turning into something a bit beefier, both generationally and in price tier. But a 4770K seems like a rather hard sell over a 3770K when you throw a new motherboard into the mix.

        • Fred S. says:

          I’m still running an LGA775 system, so any upgrade will be a new mobo. LGA1155 would have been fine, might still be fine, but the LGA1150 mobos will have better I/O in terms of more SATA 3 drive slots and more USB 3.0 ports, so that’s probably where I’d go if I were in the mood to upgrade.

        • sharks.don't.sleep says:

          Whaaa..?
          I got a 2500K too and can’t see me upgrading it for the next 2 years.
          My GTX 460 will probably get replaced next year with something stronger, though.

          What makes you think of upgrading?

          • darkChozo says:

            Mainly that the games that I really struggle with are largely CPU-limited (mostly ARMA 3 and PS2), plus a better CPU would be nice for general computing/running lots of stuff in the background. The going up a price tier bit is really what would make it worth it; I don’t see much point in going from a 2500K to a 3570K or a 4670K (side note, Intel naming schemes are dumb), but going to a 3770K or 4770K means some real improvement.

            They’re only vague thoughts though, driven mostly by having more money than when I originally built my box and the fact that my 2500K/560 Ti has gone from “can run everything” to “can run everything but usually not at highest settings at 60 FPS”. I’m not totally sold on any big upgrades yet.

    • John Connor says:

      Maybe Sandy Bridge CPUs will be extra cheap now?

  5. HadToLogin says:

    Looks like Intel knows we’ll need to at least rebuild our PCs in a year or two (as experience suggests, while at first it won’t matter, after a year or two of a new console generation you need a PC at least one generation better than the consoles to be able to play games), so they make new sockets so they can get money not only from CPUs, but from motherboards too.

  6. bad guy says:

    Well, do we currently really need more power than a 3570K for our games?

    What we need is software that uses our massive amounts of processing power to do something nice.

  7. baziz says:

    The truth is that this is what we should expect from the desktop market now. Due to heat and energy constraints, you just plain cannot make these things faster. This has been the quiet truth of the computer architecture community for the last few years. The only way things will get faster is through parallelism (see the worked figures below), but heat and energy are still factors and parallelism isn’t everywhere. We can get better power usage because of improved management but we really truly can’t push single-threaded performance much higher than we’ve gotten it. The last remaining performance boost is probably integrated GPU-CPU memory, so hold out to see if AMD releases the chips in the PS4/XBone for the desktop market.

    Source: I am a graduate student in computer architecture.
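
    To put rough numbers on that ceiling (a standard textbook figure, worked here as illustration): Amdahl’s law says that if a fraction p of a program parallelises, n cores give a speedup of

    speedup(n) = 1 / ((1 - p) + p / n)

    With p = 0.8, eight cores manage 1 / (0.2 + 0.1) = 3.3x, and even infinitely many cores cap out at 1 / 0.2 = 5x. The serial fraction, not the core count, sets the limit.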

    • aeolist says:

      http://www.xbitlabs.com/news/cpu/display/20130305060258_AMD_s_Fusion_Kaveri_APU_Supports_GDDR5_Memory_Report.html

      AMD’s next high-end APU Kaveri will have GDDR5 support, which basically makes it a PS4 APU with better processor cores (Steamroller instead of Jaguar, also assuming they have at least as much GPU compute on die).

    • jalf says:

      Due to heat and energy constraints, you just plain cannot make these things faster. This has been the quiet truth of the computer architecture community for the last few years

      No.

      That has been the extremely loud and exaggerated truth for the last 8 years or so.

      The *actual*, and much quieter, truth is that you can. You just can’t *easily* make them faster. It’s more costly and much less dramatic than it was in, oh, say, the Athlon64 days, but speed improvements still occur even without increasing the number of cores.

      But a modern CPU is still 2-3 times as fast (at the same number of cores) as, say, my aging Core 2 Quad, which is from just around the time when people started stating that “CPUs won’t get any faster, all we have left is parallelism”.

      As always, the truth isn’t that simple. Cores have gotten faster, and they will continue to get faster. They just won’t double in performance every 1-2 years any more.

      • baziz says:

        It’s true that it’s still possible, and I’m not saying no improvements have been made in the last few years or will be made in the future. But, as you said, we’re in the realm of it being quite difficult to get those gains now, and definitely getting close to diminishing returns, so our expectations about new processors should be adjusted. Do you know of any articles that dismiss the doomsayers? :)

        • jalf says:

          Technically speaking, you *did* say that no improvements will be made in the future.

          You said that “you just plain cannot make these things faster.”

          That’s all I took issue with. We *can* and *will* make these things faster. The speed gains are just going to be much more modest than they were 5-8 years ago.

          I think we agree though, I just wanted to moderate your wording a bit. :)

    • Jeremy Laird says:

      I have an eight-core, 16-thread CPU in my desktop that would beg to differ.

      Intel could very easily be selling six or eight cores into the mainstream. But there’s no competition pushing it in that direction. For now, at least. Hopefully Steamroller and Excavator can change that.

  8. aeolist says:

    Actually, since they integrated voltage regulation onto the processor die, they really did have to move to a different board design.

    • Sakkura says:

      But moving part of the VRMs onto the CPU die doesn’t really benefit the desktop. The main benefit is, in Intel’s words, increased battery life.

      • Lord Custard Smingleigh says:

        A Haswell chip would allow me to play my games while pedaling the dynamo significantly slower.

      • phoebusQ says:

        Moving VRMs onto the CPU itself absolutely necessitates a new socket, and beyond the implications for mobile it could also take a significant chunk of change off the price of many motherboards, or allow manufacturers to include more features at the same price point.

        • Sakkura says:

          The LGA 1150 motherboards still have VRMs, they just convert to a different voltage than on older platforms. So there’s no real difference in motherboard price either (assuming similar featuresets).

  9. nimzy says:

    Saving on power sounds great and all but CPUs haven’t been a significant desktop power draw since the Centrino/Turion days. Now we’re all about monstrous GPUs, like the superchargers sticking out of the hood on muscle cars.

    So less power usage should mean less heat. Did you see any of that in evidence?

    • aeolist says:

      Load power draw is actually higher on Haswell than Ivy Bridge, thanks to a bigger die size on the same 22nm process node. Idle power savings were Intel’s focus this time but the desktop SKUs don’t get the S0ix power modes that they use on the mobile side to get their gains in battery life.

      So actually you’ll see more heat coming off the new chips. Don’t expect lower power until Skylake in 2015. Broadwell next year (barring delays) will move down to 14nm and lower power, but the only desktop SKUs Intel will offer will be BGA chips soldered to the board.

      • jrodman says:

        What is meant by idle power? Does a CPU reach this lower idle state meaningfully when it does it 50 times a second? If so, the chip may actually run cooler in typical use.

  10. jrodman says:

    I’m perfectly satisfied with this. I don’t need more performance.

    Making the computers cheaper, more efficient, cooler, and quieter are quite good benefits.

  11. foobar88 says:

    I’ll gladly take this as good news for PC gaming. More efficient, lower power processor = lower heat = less intense cooling needs = more quiet machines = lower power bills = more civilized gaming experience.

    • Sakkura says:

      Core i5-3570K TDP: 77W, Core i5-4670K TDP: 84W

      • stephenallred says:

        TDP refers to the maximum heat dissipation the cooling system needs to provide. As such it isn’t enough information to draw conclusions with regard to power requirements.

        • CommanderJ says:

          Actually, according to Anandtech, in a specific test the Haswell did 13% better than Ivy by using 11% more power. Essentially that’s 1:1 performance per watt compared to Ivy. No great strides have been made in this department, either.
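
          (Quick arithmetic on that, for the record: relative performance per watt is 1.13 / 1.11 ≈ 1.02, so roughly a 2% gain over Ivy at load. Within benchmarking noise, in other words.)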

        • Sakkura says:

          The heat dissipation is identical to the power draw, because all the heat dissipated comes from the electrical power. The power draw can theoretically fluctuate faster than thermal equilibrium could set in, but in practice the TDP is proportional to and very close to the actual peak power draw.

  12. stephenallred says:

    FMA3 might end up being useful in games (fused multiply-adds are the backbone of matrix multiplication and all that), but games will have to be optimised/recompiled for it.
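
    To show what that buys you, here’s a minimal sketch of my own (only the intrinsic names come from Intel’s docs; the helper function is hypothetical): Haswell’s FMA3 fuses a multiply and an add over eight floats into a single instruction, which is exactly the shape of the inner loop of matrix and vector maths. Compile with -mfma:

        #include <immintrin.h>

        // Multiply-accumulate two float arrays: acc[i] += a[i] * b[i].
        // Assumes n is a multiple of 8 for brevity; a real version would
        // mop up the remainder with scalar code.
        void madd(const float* a, const float* b, float* acc, int n) {
            for (int i = 0; i < n; i += 8) {
                __m256 va = _mm256_loadu_ps(a + i);
                __m256 vb = _mm256_loadu_ps(b + i);
                __m256 vc = _mm256_loadu_ps(acc + i);
                // One fused instruction, one rounding step: vc = va * vb + vc.
                _mm256_storeu_ps(acc + i, _mm256_fmadd_ps(va, vb, vc));
            }
        }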

  13. jellydonut says:

    So I don’t have to spend money on a computer, for the third generation now?

    Good. I need a place to live instead, and probably a new-old car soon.

  14. CommanderJ says:

    There’s also the fact that VT-d etc. has been disabled on the K chips so they won’t be nicer than the server parts Intel sells, overpriced, to business users. The cheaper, non-overclockable non-K parts? Yup, they have VT-d and TSX. Artificially segmenting the market to milk more money.

    Also, according to Anandtech, the Haswell managed a 13% increase in a test by using 11% more power than Ivy. Yes, that’s almost exactly 1:1 performance per watt compared to Ivy. ‘They’ve focused on idle power this time, not load power’, you might say. Yes, indeed. But the desktop parts get virtually none of those idle power benefits. Combine all of this with the fact that you’re paying for 33-50% of the die area being dedicated to a GPU you will never use… yes, it is absolute piss. Qualcomm, ARM and Samsung really must have Intel running scared, for them to take such a giant shit on desktop users just to be competitive in mobile.

    • Wedge says:

      If you’ve looked at the market figures for mobile vs desktop/laptop, you’re damn right they’re scared.

  15. Moraven says:

    Glad we did not wait in February to build a new machine. So many people were waiting on Haswell thinking it was going to be this huge boost.

    I just hope AMD gets back into the game sometime soon. They are focusing on mobile too. With tablets and phones set to surpass desktop and laptop sales, I don’t blame them. But both Intel and AMD seem late to the game to get much mobile market share.

  16. Jambe says:

    I’m not an Intel apologist, but nor am I a reactionary who reads anti-PC sentiment into corporate responses to new markets. If Intel doesn’t double-down on its entry into mobile it will die because standalone hulking tower computers are dying. Get over it, friend; the increased miniaturization and consolidation of computing devices is inexorable (and has been ever since electronics were invented). We’ll still have games and we’ll still have general purpose computing. In fact, one could see Intel’s push of x86 into the mobile space as a boon to us… but no, no, they’re just a Blue Devil.

    Quick, let’s group-hug and have a cry for our dear friend Pee See Gaming, who unbeknownst to me was just shivved in the spleen by shadowy Intel assassins. I hear the whup-whup of the black choppers which were summoned to dispose of the body. Oh, Pee, we hardly knew ye.

    Anyway, in terms of gaming performance, you could’ve stuck with any mainstream Intel or even AMD chip from three generations ago and you’d still be fine; most PC games are not CPU-bound, and when they are (strategies, sims, etc) old CPUs are generally more than enough to handle them.

    So the question, really, is what could’ve possibly impressed you? Sixteen buh-tillion times more efficiency which you as a PC Gaming Aryan couldn’t possibly utilize? 2 GHz of overclocking headroom which, again, you’d have no reason to use other than to consume more electricity than you need to?

    All I see here is FUD. This was a fine tock from Intel, and a great one if you look at the mobile side of things (I guess laptops don’t count as PC Gaming devices). Since Haswell didn’t lube up our gaming-genitals for coitus with the innumerable compute-starved games on the horizon (???) it’s a total flop.

    Ehhhh. Weak. I, too, hope AMD’s next CPU boots many a behind – Intel needs to be kept on its toes, after all. But Haswell was nothing like a poor showing.

    • Snids says:

      Heh. Good stuff.
      Progress. Let them have their “Desktop Computing Societies” in the year 2050.

    • zeekthegeek says:

      You are adorable. “I am not an Intel apologist”, and then a small essay apologizing for Intel. Cute kid.

      • honuk says:

        As much as your life is probably spent willfully denying it, the truth is actually agnostic. One is not rendered an Internet Warrior of Side A or Side B simply for stating things as they are. You don’t need anything more powerful than these chips for PC gaming. You don’t even need these chips for PC gaming. And you won’t any time soon. So Intel is not going to make them when they could make things that people might conceivably need instead. That one not only can but apparently should look at making a chip that maintains desktop power while offering laptop and mobile boosts as an affront to your lifestyle whoops I mean PC gaming is actually indicative of mental instability. It’s the dumbest reaction possible. Take a breath, your precious video games aren’t going anywhere. RPS righteousness at its finest.

        • MasterDex says:

          Call me crazy but I find your post a bit more reactionary than the one you replied to. With no basis, you just railed on the guy. The comment he replied to starts with “I’m not an Intel apologist” then reads like something an intel apologist would say, venom and all.

          Haswell is a weak release from Intel, simple as that. That’s how things are. If you read the reviews from other sites (those that have no beef in the PC gaming… uhhh… game), you’ll find that sentiment is shared by many. But whatever, it’s a tock release so who cares, right?

          In all honesty, I think this is precisely the article that RPS should have posted – for the simple reason that a large number of their readership are interested not in mobile performance but in desktop performance and this lackluster tock just doesn’t do it.

          • Jambe says:

            I now play PC games on my laptop more than my desktop and I’ve been reading and recommending RPS since they started up. I’m a fucking heretic, apparently. My ideas and preferences don’t count. If I don’t play my games with a big tower and multiple monitors, I’m not a True PC Gamer. Yeah, fuck me.

            I suppose Anand Shimpi, Scott Wasson, Kyle Bennett, and other long-standing names in the review business are in Intel’s pocket, yeh? What actually happened is they all said Haswell is clearly geared at power reduction to be competitive in the mobile space (it’s almost as if Intel’s been saying this for years already) and that these desktop chips represent something like 2-15% performance gains over Ivy Bridge depending on scenario, and that if you’re a gamer your money would probably be better spent on a GPU if your PC is Sandy Bridge era or later.

            So clearly Intel is intent on castrating PC gaming, yes? They want to follow the mobile market therefore we’re all just second-class trash to them, right? What utter horseshit. If you’re building a new PC for gaming and general use, Haswell or Piledriver would float your boat for a good long while.

            And, ffs, AMD is pursuing the same strategy as Intel – look at their XBone and PS4 wins! Those came about because 1) AMD will take lower margins to stay in business and 2) Jaguar is more integrated and power-conscious than any other AMD architecture to date. So AMD’s also out to undermine our glorious pastime! Woe are we! We’ve no champions! If Steamroller can’t execute a trillion more instructions per cycle than Piledriver, we’re all finished!

            C’mon, friend. There’s more to computing than PC gaming, and we enthusiasts needn’t be stodgy and reactionary because that stuff sometimes takes precedence over our beefomatic boxen.

            <3<3<3<3

          • MasterDex says:

            I feel I should type a large response just to see how angry you’ll get and how many leaps and bounds and fallacies in logic you’ll make, but it’s late and I need a dump, so yeah, enjoy that rage, man.

            PS. You appear far more reactionary than the supposed PC elitists. Breathe.

          • Jambe says:

            lol, I may well be.

            You see, I just sat down at my desktop PC. I now feel the inexplicable urge to fling flaccid “mehs” at Intel because I can’t use Haswell to turn my ‘puter into the Second Coming of Christ.

            Ra ra ra, get ‘em, AMD!

    • AlienMind says:

      You won’t believe it, but the packets your fancy round mobile devices generate go to big fat backend servers.

      • Jambe says:

        No, you’re quite right, I’m an ignorant and temperamental chawbacon, and that’s why I come to a site called Rock, Paper, Shotgun looking for hardware reviews for systems integrators.

        Your comment made me realize the error of my repeated attempts to cram non-ECC-capable setups with 120mm tower coolers into 1U enclosures. I simply didn’t know any better, but I’m closer to figuring it out thanks to you. We yokels may yet get fiber access!

        Actually, you may eventually feel shame for enhancing our connectivity.

        • AlienMind says:

          Yeah, like Xeons or something are FROM SPACE. They of course use a lot of the technology of desktop processors.

      • roryok says:

        Actually the big servers are also moving towards laptop-like low-powered platforms so they can cram more into the same space.

        • AlienMind says:

          No. Things like blades did not take off. The reason: room is cheaper than the cost of engineering to cram everything tightly together.

    • MasterDex says:

      Uh-huh. Not an intel apologist, you say? Interesting.

      “Desktop is dying! PC gamers are elitist fundamentalists! – but hang on while I invoke godwin’s law to show how moderate I am.”

      Come back to me in 5 years time and tell me again that desktops are dead. Something tells me you’ll be eating those words.

      • Jambe says:

        Yeah, let’s obsess over form factors, because that is so very consequential (hell, I build PCs for extra income but I don’t bemoan further shrinking and integration). I care more about what computers can do than what shape they are…

        In seriousness: what could Intel or AMD do to improve PC gaming if most of today’s PC games don’t even reliably thrash Wolfdale, let alone Yorkfield? It strikes me that many game-focused hardware enthusiasts are a touch misguided as to the physics of transistor design, the realities of international computing markets, the nature of software’s utilization lag, etc.

        Not even AMD (which I admire and which has been snatching up some of the best and brightest from the industry in recent months) revolves around PC Gaming. To these giants of semiconducting, bleeding-edge hardware-fetishizing gamers are like a few raisins scattered throughout the bread of the industry.

        *shrug* But yes! Clearly I’m a cointel operative and frothing fanboy. My HTPC with AMD-branded CPU, GPU and RAM is just there to mask my true identity, you see. It’s bright red paint smeared across my true-blue visage… but you, you managed to see through it all! Clever girl.

        My handlers have recalled me for “education”.

        … damn

        • MasterDex says:

          I’m not sure how to respond to this. You’re taking the sentiment of many people out of context, assuming it all revolves around PC gaming. It doesn’t. Some of us use our PCs for more than that, and some of the programs we use are quite CPU-intensive, even with a modern chip. Hence the disappointment.

          You’re then magnifying that sentiment to hyperbolic proportions; putting forth the picture that those disappointed with this showing from Intel are rage-fuelled “elitists” just looking for something to complain about, instead of the indifferent/mildly disappointed/saw-it-coming level-headed people that we are.

          But whatever. You type angry at the imagined rage and I’ll read your comment and have a giggle over it.

          • honuk says:

            this is an article on a PC gaming site, specifically about how this chip relates to PC gaming. Jambe’s initial post was a direct response to the article. so no, I suppose he was not interested in non gaming applications of top end CPUs. why would he be?

          • MasterDex says:

            “This is an article on a PC gaming site, specifically about how this chip relates to PC gaming.”

            You’ll have to point out all those times in the article where Mr Laird specifically talks about how this chip relates to PC gaming. Maybe I’ve just contracted dyslexia or something because I can only see the one reference to PC gaming. The rest of the article seems to just explain why anyone looking for a new desktop chip, whatever their reason, should hold off or just pick up an older chip.

            “Jambe’s initial post was a direct response to the article. so no, I suppose he was not interested in non gaming applications of top end CPUs. why would he be?”

            Yeah. Initial Post. Implying more than one. Funnily enough, the comment from myself that you replied to wasn’t a reply to that initial post. It’s almost as if I was replying to something else Jambe said.

          • Jambe says:

            Why do you think I’m angry or imagining rage? I’m hyperbolic, but I’m super-chill atm. Tommy Emmanuel’s strumming away, I’m relaxing and having a poke at the keyboard.

            I think this is all overblown fluff. Intel has clearly prioritized mobile with Haswell, therefore they’re ignoring PC gaming? I don’t believe PC gaming is inherently, absolutely non-mobile. I’m even heretical enough to want to take my PC gaming around with me as I travel!

            “Haswell isn’t worth upgrading to if you’re lightly using a machine 1-2 generations older, just as was the case with virtually all new architectures.” It’s a mountaintop revelation, that.

    • Arkh says:

      Innumerable compute-starved games on the horizon (???)

      Dwarf Fortress says hi!

      But it’s just inefficient, that’s why it consumes more power!

      And more power would fix this too!

    • mickygor says:

      For the record, this is a tick not a tock. At least, it’s supposed to be.

    • Widthwood says:

      Was Haswell, or gamers, ever Intel’s priority?

  17. McTerry says:

    I don’t know why, but that heat signature picture reminds me of Spy vs. Spy.

  18. Vigidis says:

    I’m still running one of the first Core i7s… the 920, on the old 1366 socket. And in all this time I have not once seen it at 100% CPU usage… NOT ONCE! (not even with an SSD). I even monitor the logs in case it does that while I’m not looking.

    So I ask you…why should desktops give a shit?

    • Benny says:

      You should try some 3D rendering. Planetside 2 and Natural Selection 2 give it a good workout as well, if only on 1-2 threads.

      And it definitely helps to overclock it at least to 3GHz, or 4 if you have a nice cooler and it’s stable. Even if it’s only being worked at 20%, that extra speed is huge (especially considering the very low stock clock speed)

      • Mario Figueiredo says:

        Or video editing, or programming, or database management, or … any cpu intensive task.

        That said, it is true that for the majority of users, CPUs have long hit the sweet spot in which more speed simply becomes unnoticeable because their quotidian tasks can’t stress even an older model.

        I think gaming has seen diminishing returns in terms of processor speed too, as the tendency keeps being to offload as many tasks to the GPU as possible. There may come a time when the CPU ceases to have any real impact on gaming performance.

        The emergence of multithreaded games has frankly been slow and of little interest. The truth of the matter is that current concurrent programming models are extremely complex, error-prone and hard to debug. Programmers tend to shy away from C++ threads if they can, for instance, or greatly reduce their presence by limiting the game to two or three threads at most in order to minimize maintenance and bug reports. The much easier actor model still doesn’t have a proper implementation in C++ (and probably never will) and remains the domain of dedicated programming languages like Erlang. The threading model is a veritable Pandora’s box and needs to die if we ever want to mass-produce games that can seriously take advantage of processor speed.
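
        (A minimal sketch of the actor idea in plain C++11, for the curious. The Actor class and every name in it are illustrative, not any particular library: each actor owns its state and is driven only through a message inbox, so there is no shared mutable state to race on.)

            // Toy actor: one thread, one inbox, no shared state.
            #include <condition_variable>
            #include <functional>
            #include <iostream>
            #include <mutex>
            #include <queue>
            #include <thread>

            class Actor {
            public:
                Actor() : done_(false), worker_(&Actor::run, this) {}
                ~Actor() {                                // drain the inbox, then stop
                    { std::lock_guard<std::mutex> lk(m_); done_ = true; }
                    cv_.notify_one();
                    worker_.join();
                }
                void send(std::function<void()> msg) {    // the only way in
                    { std::lock_guard<std::mutex> lk(m_); inbox_.push(std::move(msg)); }
                    cv_.notify_one();
                }
            private:
                void run() {
                    for (;;) {
                        std::function<void()> msg;
                        {
                            std::unique_lock<std::mutex> lk(m_);
                            cv_.wait(lk, [this] { return done_ || !inbox_.empty(); });
                            if (done_ && inbox_.empty()) return;
                            msg = std::move(inbox_.front());
                            inbox_.pop();
                        }
                        msg();  // state is only ever touched on this one thread
                    }
                }
                std::mutex m_;
                std::condition_variable cv_;
                std::queue<std::function<void()>> inbox_;
                bool done_;
                std::thread worker_;  // declared last so the inbox exists before the thread starts
            };

            int main() {
                Actor physics;
                physics.send([] { std::cout << "step 1\n"; });
                physics.send([] { std::cout << "step 2\n"; });
            }   // compile with: g++ -std=c++11 -pthread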

  19. kwyjibo says:

    Were people really interested in Haswell desktops? They already announced a while back that the only real difference was in the graphics, and that would only be available on laptops.

    What I’m still waiting for are laptop benchmarks, and Iris pricing.

    • Bremze says:

      The tone of the article is the way it is because the product doesn’t appeal to the writer or to the audience he’s representing. This isn’t rocket science. We want more power, and we don’t care how it’s going to impact the amount of money Intel crams down its investors’ throats.

      In a nutshell:
      *Intel’s 22nm process is less suited to leaky high-performance transistors, so clock speed increases go right out of the window
      *ILP has been Intel’s bread and butter, but it has seen hugely diminishing returns over the past couple of processor generations
      *Hexacore processors would make for a pretty huge die once you add a GPU and the additional L2 and L3 needed
      *Instruction set extensions are a fine idea in theory, but Intel has done everything it can to sabotage them (cockblocking AMD’s FMA4, removing AVX from the Pentium and Celeron lines, and TSX from the K-series on Haswell)

      Pretty much every way forward to give gamers what they want would mean sucking up a loss in profit margins. Pick your poison, or don’t; we’re just going to sulk a bit and not give Intel our money.

  20. jasonsewall says:

    I am a big PC gamer (and I haven’t owned a console since the NES my parents bought me when I was 6 or 7) and I work for Intel in research. I am a big fan of RPS, and I’d like to weigh in. I work far enough away from product groups that I feel I am relatively unbiased, but I like to think I have enough expertise on architecture/software and gaming to say something useful here.

    Probably the most important thing to consider here is what PC gamers need/want from a CPU. This segment is a strange one because the gamer has control over what they buy but is downstream of both the hardware manufacturer and the game/software developer in terms of what is added to the hardware and ultimately used in the software.

    If the hardware vendor puts feature X into a processor, will/can the developer use it? Is the feature really killer, such that they design a game around it, which risks alienating all the gamers who didn’t upgrade? Is the feature something they can make optional? Does that mean that it is really killer?

    It’s a classical dilemma in the industry, and one that consoles (sort of) solve by providing a level hardware playing field such that the question needn’t be asked — the hardware is homogeneous.

    I am a little disappointed by the sullen, sulky tone of the article. Instead of complaining about a lack of attention, why not be specific about what is missing/bad and explain the position? I also somewhat resent the implication that PC gamer = desktop user; while I do usually use my desktop for games, the laptop is a popular and viable option, and one that is probably made more viable by the release of Haswell.

    But back to the complaints: what is missing? Do you want more performance?

    * We can increase clock speeds, but the cost is nonlinear; things will get much hotter and more power hungry the higher the clock goes, and that doesn’t address the issue of memory bandwidth, which is a serious limiting factor.
    * We can increase core counts, but developers don’t seem to do much with them. I have a 6-core system at home that I game with and hardly any games utilize more than 2 of them (and even then, they are not fully occupied)
    * We can add micro-architectural features (and we did: AVX2 and FMA, along with TSX and more execution ports), but games don’t usually use such things, for the reasons explained above (a rough sketch of what FMA buys you follows this list).
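
    (To make that last bullet concrete: a minimal sketch of an FMA-accelerated SAXPY step. The _mm256_* intrinsics are Intel’s real AVX/FMA interface; the function and the scaffolding around them are mine, and it assumes an FMA-capable chip and something like g++ -mavx2 -mfma.)

        // Illustrative: y[i] += a * x[i], eight floats per iteration.
        // _mm256_fmadd_ps computes a*b+c in one instruction, one rounding.
        #include <immintrin.h>
        #include <cstdio>

        void saxpy_fma(float a, const float* x, float* y, int n) {
            __m256 va = _mm256_set1_ps(a);          // broadcast a to 8 lanes
            int i = 0;
            for (; i + 8 <= n; i += 8) {
                __m256 vx = _mm256_loadu_ps(x + i);
                __m256 vy = _mm256_loadu_ps(y + i);
                vy = _mm256_fmadd_ps(va, vx, vy);   // vy = va*vx + vy, fused
                _mm256_storeu_ps(y + i, vy);
            }
            for (; i < n; ++i) y[i] += a * x[i];    // scalar tail
        }

        int main() {
            float x[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}, y[10] = {0};
            saxpy_fma(2.0f, x, y, 10);
            printf("%.1f %.1f\n", y[0], y[9]);      // 2.0 20.0
        }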

    The real issue here, as touched on by other commenters, is that PC games aren’t really compute-bound. Even the graphics side of things has slowed down considerably since the early 2000s; you can run basically any game from 2013 on graphics hardware from 4 years ago at respectable quality.

    So, if you didn’t need more performance, how about some power savings? I think the reading that Haswell is a move in the mobile space more than others is completely correct, but that doesn’t mean it isn’t useful for others.

    As an aside, people shouldn’t gloss over the eDRAM some SKUs of Haswell have; that is a really interesting feature that can go a long way towards killing off some memory bandwidth/latency bottlenecks. I am not certain that games need it at the moment, but perhaps it will become one of the killer features that open up new doors to gaming.

    The only thing I can imagine adding to games at this point that really demands more compute is more detailed, interactive physics. Something like gameplay-affecting fluid simulation would require lots of horsepower and could certainly take advantage of the very large L4 cache.
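
    (Purely as an illustration of the kind of workload I mean, and very much a toy rather than anything a shipping engine does: a single diffusion step of the sort a grid-based fluid solver runs many times per frame. Every cell reads four neighbours, so two full grids stream through memory each step; exactly the access pattern a large L4 could absorb.)

        // One Jacobi-style diffusion step over a w-by-h grid.
        #include <vector>

        void diffuse_step(const std::vector<float>& src, std::vector<float>& dst,
                          int w, int h, float k) {
            for (int y = 1; y < h - 1; ++y) {
                for (int x = 1; x < w - 1; ++x) {
                    int i = y * w + x;
                    float nb = src[i - 1] + src[i + 1] + src[i - w] + src[i + w];
                    dst[i] = src[i] + k * (nb - 4.0f * src[i]);  // relax toward neighbours
                }
            }
        }
        // A 1024x1024 float grid is 4MB per buffer, so src and dst together
        // already spill most L3 caches; a 128MB eDRAM L4 holds dozens of them.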

    Perhaps this article is simply consciously taking the line that controversy gets readers, but I don’t think that’s the editorial vision RPS espouses. At any rate, as an exercise for himself (and, I think, good food for thought for the rest of us), I recommend the author think carefully about what he expected and wants from his hardware and write about that next time.

    Cheers,
    Jason

    Edit: spelling!

    • Snakejuice says:

      Thank you for taking the time to write all that, it was a good read.

      “Do you want more performance?”
      YES!

      “The only things I can imagine adding to games at this point that really demand more compute is more detailed, interactive physics.”
      YES, PLEASE!

      Yes, my 3570K is enough for most tasks, but I’m a 60-120 fps guy and it really bums me out that I only get around 30 in busy areas of Planetside 2 (or busy areas of ANY MMO-esque game). So please, hurry up and double your clocks, your work done per clock, or some combination of the two!

  21. Mario Figueiredo says:

    Actually I would love a longer upgrade cycle from Intel. If only every other architecture were developed specifically for desktop computers, we’d get more meaningful and cost-effective upgrades. Intel’s very aggressive upgrade cycle has always felt to me like creating a need where before there was none, instead of offering a solution to an existing problem.

    Certainly no one is forced to upgrade, and we can all, in one way or another, deflect marketing strategies and dubious advertising. Still, Intel’s fast upgrade cycle has had a negative effect on PC prices, since it tends to quickly dispose of still perfectly valid and cheaper “old” architectures while, artificially in my opinion, constantly injecting newer, more expensive ones.

  22. baltasaronmeth says:

    How could a dramatic reduction in energy input not be relevant for the desktop? I have been delaying the purchase of new hardware because I hoped for something like this. I currently run a lot of calculations (~70% of the time) on three i5 750s and I will probably continue doing so for at least two more years. If Haswell could save me 30€ per year per CPU, then I’d gladly forgo more speed in exchange.
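
    (Quick sanity check on that number, with assumed figures rather than anything measured: a chip drawing 20W less on average, at a 70% duty cycle, saves about 20 W × 0.7 × 8760 h ≈ 123 kWh a year, or roughly 31€ at 0.25€/kWh. So 30€ per year per CPU corresponds to around a 20W average reduction.)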

    But I agree, if this is supposed to be the new generation, then I would have expected a little more.

    • Sakkura says:

      Haswell does not dramatically reduce the energy consumption. It actually increases it slightly, compared to Ivy Bridge.

  23. merrins says:

    Whiny, thoughtless garbage. Try to tone down the persecution complex.

  24. Branthog says:

    Yep, most gamers don’t care about the on-die GPUs as they’ll go for one or more discrete cards. Haswell wasn’t a meaningful performance improvement over Ivy Bridge, but Ivy and Sandy weren’t massive improvements, either. We are no longer in a time when you can expect to build a new rig every two or three years and expect to get double or triple the CPU performance out of your new chip as your last chip. Instead, we can expect to get maybe 20% to 50% every two or three years.

    Everyone throws out the explanation that “this isn’t for gamers or hardcore desktop users; this is for mobile and for people who are more concerned with low power consumption”. Well, fine. I don’t really care either way. Whatever the intention of the chip, it’s of no importance to me because it’s a trivial change over my 3770K. As soon as there’s an impressive leap, you’ll get more money from me.

    • iniudan says:

      Actually the integrated Iris Pro 5200 GPU could be nice for the desktop, as it comes with 128MB of eDRAM, which can also be used as an L4 cache. But sadly it’s the model they don’t make available for the desktop socket.

  25. Daniel Klein says:

    Will this have an impact on prices for the 35* line of i5s?

    • MacTheGeek says:

      Did Ivy Bridge cause significantly lower prices on Sandy Bridge chips? No, not really.

      The only thing that’s going to drive prices down is a viable competitive product from AMD.

  26. trajan says:

    I read in PC Magazine that the new production methods for Haswell are bringing cost per CPU down considerably. Hopefully this is one thing desktop computing gets out of this generation.

    • Sakkura says:

      Haswell is on the same manufacturing process as Ivy Bridge.

  27. Arkh says:

    Thanks for the awesome article, Jeremy Laird!

    I may upgrade to the old generation, since this one just isn’t that much better and my AMD 1090T is overheating and generally not able to handle things all that well.

    Also, FUCK YOU INTEL. WHY THE FUCK DO YOU NEED TO CHANGE SOCKETS EVERY GODDAMN TIME?

    I wish AMD was on par with Intel.

    • iniudan says:

      Since they are not pushing much on performance, this might actually give AMD a chance to catch up, since for power saving AMD’s APU architecture actually gives them an edge, due to it being a more integrated system-on-a-chip than what Intel’s CPUs offer. (It’s one of the reasons why AMD has tended to have such a high TDP compared to Intel since the Core architecture: there’s just more stuff crammed into an AMD chip.)

      Intel has mostly only kept up on power by using a smaller-scale process than what was available to AMD through contract manufacturers.

      Also, with Kaveri, AMD is starting on a heterogeneous system architecture, which basically means certain of their APUs will be x86 processors with an ARM co-processor (planned to be generalized to all APUs by 2014). The first-generation use will be hardware security, as hardware security is not part of the x86 specification, so it’s a quick shortcut for now. But this could give them an edge on both Intel and the ARM ODMs (Samsung, Apple, Qualcomm, Nvidia), as they could have the only CPU that works on both sides of the fence, which could lead to some quite ingenious and unique hybrid systems.

      I admit I’m getting a bit ahead of myself with that last paragraph and being too optimistic, but hey, just being hopeful, as this is possibly the most interesting thing to happen to processor architecture since AMD64.

    • Snakejuice says:

      You will be very happy with that. I threw out my shitty 1090T rig and built an i5-3570K rig about a year ago. Even though I kept the GPU from the old rig (GTX 570), my fps just about DOUBLED in many games, and it’s what pushed my Source-based games into stable v-synced 120 fps. So very, very happy indeed!

  28. Clavus says:

    Oh well, look on the bright side. Gives AMD some time to catch up. I’m still on a first generation Core i5 750. The per-thread performance of Haswell is about 1.5x higher so it might still be an interesting upgrade for me.

  29. Grape Flavor says:

    I think it’s cute how people still seem to think and act like the desktop PC is the only thing that matters these days, or even that it’s still the most important thing.

    Look at the sales numbers. Not just for this year, but for the last decade. Moving more and more towards mobile, first laptops, then tablets. The desktop PC is a niche market at this point, only holding strong among workstations and PC gamers. Of course Intel (and Microsoft for that matter) are going to focus their efforts on competing in mobile, that’s their weak spot, and that’s where the money and the market share are.

    The sooner people come to grips with this new reality instead of taking it out on Intel and MS, the easier it will be for everyone. You don’t like this shift? Fine. But there’s nothing you, or I, or Intel or Microsoft for that matter, can do about it. They’re moving their businesses with the market as a matter of survival. They’re not going to throw away the future of their companies just to shore up the bruised egos of desktop PC enthusiasts, by continuing to treat them as the highest priority.

    The desktop isn’t dead, or dying. It still has a future. But that future is as more of a niche now, not as the dominant computing paradigm. It’s not the end of our form factor, only the end of our form factor ruling the world, and there’s nothing that can be done about that.

    Right now, desktop enthusiasts are still in the anger phase, raging at Intel and Microsoft for Haswell and Windows 8. I hope they snap out of it soon, before they sink the two companies that have the best shot at making this new future easier for us, with Wintel-powered hybrids that offer the power, functionality, and compatibility of a traditional PC with the convenient form factor of a mobile device.

    Because what many enthusiasts are asking for now – that everyone go back to treating the desktop as preeminent, when it’s not – is just going to result in a weakened and marginalized Intel and MS struggling to make a business out of our niche sector, with mainstream computing dominated by companies like Apple and Google, who couldn’t give a rat’s ass about old-fashioned PCs or making their products work well with them. That’s not a better future. We should adapt now, rather than regret our actions later.

    • jrodman says:

      To be fair, Windows 8 deserves the rage for sucking, not for some kind of traitor complex.

  30. roryok says:

    I think the sub-heading “Oh no, not laptops again” pretty much sums up the problem with this article. RPS is a PC gaming site, not a desktop PC gaming site. Laptops are PCs, and they are the most popular type of PC by a long, long way. This, and the constant Windows 8 bashing that goes on, makes me feel like elements within RPS care more about people adhering to a very strict, narrow definition of what a PC is than they do about PC gaming.

    • Grape Flavor says:

      Well said. Windows 8 tablets are PCs as well. There are a lot of different kinds of PCs. I’d even argue that Macs and Linux systems are PCs, too.

      Yeah, look, I love my desktop rig as much as anyone, but this blinkered view that desktops are all that matters and everything else is just a sideshow is very narrow-minded, and way out of touch with current market realities.

      • iniudan says:

        I will even argue that ARM mobile devices can be PCs, for I think we are long past the notion that a PC means an IBM compatible; the term PC should simply be used for any multi-purpose personal computing device.

        As for having to argue that a GNU/Linux-based system is a PC (a term I’m using because we’re speaking of ARM mobile devices, which involve other types of Linux-based systems), sorry, I have to laugh at that one, for I think it is preposterous to even have to argue it. For OS X-based systems I could understand it a bit more, due to Apple computers never having been x86-based until 2006, but since then I think it would be ridiculous to have to argue against it.

    • iniudan says:

      Here I am just bashing Windows 8 because it is a terrible GUI on devices with non-touch-focused usage, and I don’t see the point of touch-enabled devices beyond 10-11 inches, except for embedded systems, as otherwise you find yourself with a device that drains more power, is more expensive, and is just too cumbersome to hold with one hand so the other can use the interface.

      But I do see the desktop disappearing, and I welcome it, for I see the enthusiast market simply switching to a home-server-based model in the future, as it then becomes much easier to take care of a multi-device home, or to empower mobile devices with capabilities beyond their hardware while still keeping the benefits of the mobile form factor. If the desktop still exists in such an environment, it will simply be a thin client, or the user using the server as a desktop.

      • roryok says:

        “…you find yourself with a device that drains more power, is more expensive, and is just too cumbersome to hold with one hand so the other can use the interface.”

        You’re talking about tablets there. I think the hybrid tablet/laptop form factor is the one that’ll win out. It’ll still be mobile, but people need keyboards.

        “I see the enthusiast market simply switching to a home-server-based model in the future”

        Now that’s a really interesting idea… it does make sense, having a mini-server tucked away in a cupboard somewhere and thin clients spread around the house. I can really see that. Cloud gaming might not be taking off, but a local cloud is essentially what Steam’s “Big Picture” mode is trying to achieve. It would be interesting to see that spread.

        • iniudan says:

          Not really; the one-hand part was to illustrate why touch-based hardware is a bad decision on laptops and desktops. As for power drain and cost, those still matter for laptops and desktops, even if power drain on the latter is not really something enthusiasts look at outside of Asia, where underclocking is actually an important part of enthusiast culture due to the price of electricity; there, having the display drain less power would, I think, be important.

    • Jeremy Laird says:

      This is tricky. Haswell is definitely more interesting for mobile gaming than for desktop gaming. But if you look at, for instance, the latest Steam survey, mobile GPUs are borderline non-existent beyond Intel integrated. And even within Intel integrated, mobile is a small minority. Under 20 per cent, by the looks of it.

      On the one hand, the fact that some people are stuck with Intel integrated means faster integrated graphics are genuinely interesting. On the other, the SKUs with significantly faster graphics are so expensive that you’d be better off spending the extra cash on a discrete GPU.

      That’s why overall, I don’t actually think Haswell is terribly interesting even for mobile gaming. If you can afford to buy something with the Iris graphics, you’re better off with a cheaper CPU and a discrete GPU.

      • roryok says:

        You need to look at it more abstractly. Sure, maybe Haswell doesn’t directly provide any improvements to GPU or framerate for gaming, but it does provide a fairly dramatic extension of battery life (hopefully, if it lives up to the claims). It means many people, gamers included, are going to ditch ageing desktops for brand new laptops now that they have battery life to match ARM tablets. Gamers that were maybe doing a lot of casual gaming on iPads and Android tablets will be back PC gaming on trains and on couches.

        To be clear, I’m not saying that mobile PC gaming wasn’t already possible with current hardware, I’m saying the new battery improvements (and resolution jumps) are going to make laptops a lot more attractive to gamers.

        Maybe more mobile GPUs will mean more titles like Hotline Miami / FTL / SuperMeatBoy rather than Call of Duty or Crysis. That can only be a good thing.

  31. Joc says:

    Coming from an AMD Phenom II X3, a little bit of extra outlay (there’s like 30-40 quid difference between the 3770K and 4770K?) isn’t all that bad for a 5-10% difference in performance, with the option to upgrade should the next round of Intel processors prove more earth-shattering. Seeing as the next time I can afford to upgrade will likely be in another 3-4 years, moving to Haswell should help me keep up a bit… fingers crossed.

  32. Solidstate89 says:

    The only really interesting thing they added in Haswell is transactional memory (TSX). Something absolutely no one here will or could ever take advantage of, sadly.

    So yeah, as someone who currently uses an i7-950 in his desktop, bring on what AMD has to offer for their next-gen chip, because if I had to upgrade right now, I’d probably go with Ivy Bridge over Haswell.

    • SuicideKing says:

      You know, I’m hearing this for the third time now, but why won’t it be of any use? No tech site’s covered it (during this launch, I mean).

      • jrodman says:

        It may be of some use for making concurrent code run somewhat faster and/or more reliably. The jury is still a bit out as to whether it’s going to become The Way Things Are Done or a sidenote experiment. But we definitely will need to develop better tools for both correct and performant parallel code in order to deliver the promise of a rising number of cores.
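
        (For the curious, a sketch of what using it looks like. _xbegin/_xend/_xabort and _XBEGIN_STARTED are Intel’s real RTM intrinsics; the counter and the fallback flag are toy scaffolding of mine. Transactions can always abort, so a non-transactional fallback path is mandatory. Needs a TSX-enabled chip and something like g++ -mrtm.)

            #include <immintrin.h>
            #include <atomic>

            std::atomic<bool> fallback_locked{false};   // crude spinlock flag
            long counter = 0;

            void increment() {
                unsigned status = _xbegin();
                if (status == _XBEGIN_STARTED) {
                    // Read the lock flag inside the transaction: if another
                    // thread takes the fallback path, this transaction aborts
                    // instead of racing it. The standard RTM idiom.
                    if (fallback_locked.load(std::memory_order_relaxed))
                        _xabort(0xff);
                    ++counter;          // runs transactionally; conflicts roll back
                    _xend();
                } else {
                    // Aborted (or TSX unavailable): take the plain lock.
                    while (fallback_locked.exchange(true, std::memory_order_acquire)) {}
                    ++counter;
                    fallback_locked.store(false, std::memory_order_release);
                }
            }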

  33. SuicideKing says:

    Well, Jeremy, you’re mostly right.

    Mostly, because:

    1. Not a jump from Ivy/Sandy Bridge, but it is for everyone on older stuff, especially those like me who’re running Core 2-era kit.

    2. The motherboards are damn nice this time.

    3. They had to change the socket only because of the integrated VRM.

    4. Haswell runs hotter because: http://forums.anandtech.com/showpost.php?p=34053183&postcount=570

    5. They’ve actually improved a lot of stuff in the microarchitecture, but it’s reached the point where changes at that level aren’t making much of a difference. Need MOAR COARS, but where’s the software that uses them? Might make sense when Skylake hits.

    6. The 20x lower power is at idle, and only for mobile.

  34. Nesetalis says:

    Gotta wonder if they’re throwing a bone to AMD. Essentially: “We know we need competition; if we kill you now, we become a monopoly with all that entails. So here, have the desktop market this time around. Keep yourself afloat and keep us doing what we do best.”

  35. Ultraman1966 says:

    Really need AMD to bring their A game; this one-horse race means Intel can bring a tiny incremental increase in performance and charge the same price. Moore’s Law is being defiled! Well, I know it’s not really, but you get my gist.

  36. wyatt8740 says:

    Until they make their OLDER CPUs cheaper, I’m sticking with my piece of crap with a 3GHz Pentium 4 HT, circa 2005. Think an i7 at half the current price if you buy a generation older.