Week in Tech: Intel Updates ‘Orrible Haswell, Faster SSDs

By Jeremy Laird on March 6th, 2014 at 9:00 pm.

Intel's new CPUs Hz so good

An extra 100MHz. This is progress, Intel style. I speak of the expected refresh of Intel’s Haswell-vintage CPUs, due in a month or so. It’s a PR upgrade to what was already an underwhelming family of desktop processors and yet another example of some pretty spectacular foot-dragging from Intel in recent years. Will Intel’s next properly new family of chips, known as Broadwell, be any better? If not, we should at least be able to look forward to a big step up in SSD performance fairly soon, in part enabled by Intel’s upcoming 9 Series chipsets. Well, it’s something to look forward to…

Officially, this falls under the rumour heading. But it all looks very plausible to me. Intel is prepping an interim refresh for its current Haswell Core iSomething-4xxx chips for the LGA1150 socket. And it looks like it boils down to a 100MHz speed bump.

The bump applies across a range of Core i3, i5 and i7 chips. For instance, the new Core i7-4790 is expected to clock in at 3.6GHz nominal, 4GHz Turbo, while the Core i5-4690 is a 3.5GHz/3.9GHz chip. Again, that’s a spectacular 100MHz increase over the existing 4771 and 4670 chips.

To be fair, marginal speed bumps are a long established tradition for interim refreshes intended to give a CPU family a temporary sheen of novelty before something genuinely new appears. Routine, fairly inoffensive capitalism at work.

The problem is that Intel’s mainstream processors have been stagnating for so long now that every time the firm wheels out a new processor that underwhelms with incrementalism, the minor blow lands, like the drip-drip of water torture, with a painful thump. It’s becoming tiresome.

Intel’s cheapo Haswell packaging is likely to remain

You may also note that I haven’t mentioned any new unlocked K-Series Haswell chips. None are expected. But that’s arguably a moot point, since these new models may well not even be new steppings, and no improvements to Intel’s increasingly shonky chip-packaging policy (Intel has been saving a few pennies by no longer soldering the heat spreader to the CPU die, electing instead to use cheap thermal paste inside the processor packaging) appear to be on the cards.

So, there’s no reason for any new K-Series chip to overclock any better than existing models. As for pricing, it looks like the new models will slip in at around existing levels, so you may get that exciting 100MHz boost for free. Cue much rejoicing.

Of course, I’ve been arguing for a while that the notion of good-enough CPU performance has some validity. And yet I still yearn for something more from Intel. When you look at the wealth of its x86 server chips, which are now available with up to 15 CPU cores, the desktop offering looks feeble going on cynical. It’s a familiar refrain, but this is what a lack of serious competition from AMD leads to, unfortunately.

If this Haswell refresh looks deathly dull, what hope for some excitement from Broadwell, the next major waypoint on Intel’s CPU roadmap? First up, Broadwell and specifically its use of 14nm process tech seems to be suffering a painful birth.

Last October Intel conceded that yields from a test run of 14nm chips were poor enough to force a rethink on the roll-out of Broadwell. Further delays have been rumoured lately and it’s possible Broadwell won’t be widely available until early next year, which would constitute a bit of a blow to Intel’s increasingly precarious tick-tock strategy of launching new architectures and new production processes in alternating years.

Anyway, from what I can tell, nothing much is expected from the CPU side of Broadwell. The cores will be pretty similar and once again we’ll be stuck with four of them for the mainstream desktop socket. Power consumption will probably plummet and the integrated graphics performance will take another leap towards (but no doubt not actually achieve) true game-ability.

But I doubt there will be much to get excited about for us, the bedraggled gaming and desktop enthusiast community.

Die die, die. It’s German for, ‘The die, the’

That said, one Intel tech that does look promising for the desktop is the 9 Series chipset and its improved SSD support. OK, it will slightly piss me off that Broadwell reportedly won’t be backwards compatible with existing 8 Series motherboards.

But the good news is that the spec list for the new Z97 chipset looks likely to include an M.2 SSD connector hooked up to a pair of PCI Express 2.0 lanes. Explicit chipset support isn’t strictly necessary for M.2, of course. You can already get the odd Z87 board with M.2 and several laptops based on the 8 Series platform sport M.2.

But the idea is that M.2 will become a standard feature for Z97 mobos. And that means a boost in peak storage performance from today’s 550MB/s courtesy of SATA 6Gbps to circa 1GB/s. A healthy speed bump by any metric, even if it’s arguably IOPS and 4K random access SSD performance that really needs a bump, not peak sequential throughput. But I’m trying to find the positives. And the sequentials is all I got.
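
A quick back-of-envelope check of those headline figures in a few lines of Python, for the curious (theoretical interface ceilings only; real drives land a little lower, hence the familiar ~550MB/s SATA number):

    # Rough interface-bandwidth arithmetic for the SATA vs M.2 (PCIe 2.0 x2) numbers above.
    # Both links use 8b/10b encoding, so only 8 of every 10 bits on the wire carry data.
    sata_line_rate = 6e9                                 # SATA 6Gbps line rate, in bits per second
    sata_ceiling_mb = sata_line_rate * 8 / 10 / 8 / 1e6  # ~600MB/s theoretical ceiling
    pcie2_lane_rate = 5e9                                # PCIe 2.0 runs at 5GT/s per lane
    pcie2_lane_mb = pcie2_lane_rate * 8 / 10 / 8 / 1e6   # ~500MB/s per lane
    m2_x2_mb = 2 * pcie2_lane_mb                         # two lanes, hence circa 1GB/s
    print(f"SATA 6Gbps ceiling: {sata_ceiling_mb:.0f}MB/s")
    print(f"M.2 over PCIe 2.0 x2 ceiling: {m2_x2_mb:.0f}MB/s")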


78 Comments

  1. Tei says:

    I want to post this fast

  2. joa says:

    You can’t really blame Intel for not putting any money into improving desktop processors. Do the majority of people need an upgrade — I mean for browsing the internet and watching videos? For that matter, do gamers even need particularly faster CPUs? It seems more important for graphics to be faster.

    In my understanding, processing speed has been limited by RAM speed for quite a number of years now. RAM is 20 times slower than the processor unit. So if the processor is capable of executing 5 million instructions per second, say, then it has to limit itself to 250,000 instructions per second so RAM can keep up (I mean assuming every instruction needs some memory to work). You could essentially keep current processors as they are, and just wait for RAM to get faster. And the processors would get faster along with it.

    • Sakkura says:

      That’s not exactly true. If you put in faster memory, you don’t really get any benefit. This goes for either bandwidth or latency.

      • SuicideKing says:

        Yup, for most applications it levels off after DDR3-2133.

    • FriendlyFire says:

      Not really. You’re forgetting this little thing we call “cache” which is where the vast majority of the computation time is usually spent. Cache speeds are orders of magnitude higher than memory speeds, and they’ve grown bigger and bigger with each new iteration.

    • Unclepauly says:

      I don’t even know where to begin dissecting this post. I’ll just say you have a fundamental misunderstanding of CPU architecture. Also, there are SO many games that are still CPU bound.

    • -funkstar- says:

      While there is truth to this (yes, memory access is slower than register access) it ignores pretty much all the advances made in memory architecture. Very briefly, memory access is no longer always done at the cost of querying RAM, but is layered into a hierarchy of smaller and smaller – and faster and faster – memory caches, which reduces memory access costs significantly when the memory you need is in cache (cheaper the closer to the CPU registers you get). And there are instructions to prefetch blocks of RAM into cache before they are needed, which can be used by compilers to reduce stalling even further.
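
      A rough, purely illustrative Python sketch of that hierarchy effect (it assumes NumPy is installed; absolute timings are machine-dependent and the interpreter adds overhead, but the gap between a cache-friendly streaming pass and a cache-hostile random one should still show up):

          import time
          import numpy as np

          N = 20_000_000                    # ~160MB of float64, far larger than any CPU cache
          data = np.ones(N)
          order = np.random.permutation(N)  # shuffled indices force essentially random reads

          t0 = time.perf_counter()
          seq_total = data.sum()            # sequential pass: streams through RAM, prefetch-friendly
          t1 = time.perf_counter()
          rnd_total = data[order].sum()     # gathers in random order: mostly cache misses
          t2 = time.perf_counter()

          print(f"sequential: {t1 - t0:.3f}s   random-order: {t2 - t1:.3f}s")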

    • Malibu Stacey says:

      In my understanding, processing speed has been limited by RAM speed for quite a number of years now.

      Your understanding should go read up on what Level 1, Level 2 & Level 3 CPU caches are for then.

  3. Captain Joyless says:

    Why spend money on R&D for faster chips when you have a near monopoly? AMD is so far behind it’s not even funny.

    • SkittleDiddler says:

      Pretty much. I’m all for bashing Intel when appropriate, but in this case it should be pretty obvious to anyone why Intel are dragging their heels in the desktop CPU realm.

    • Carra says:

      How come AMD fails to catch up? All Intel is doing is giving us a ~10% increase each year.

      • waltC says:

        AMD desktop cpus are nowhere near as “slow” as the people who don’t own them will tell you they are…;) I won’t buy anything else at the moment besides AMD–haven’t bought/built an Intel rig since ’99 because Intel hasn’t given me a reason to, and there isn’t any software I have a problem with. I also like saving money on my hardware which I plow back into more software. It’s very nice, actually. Saving money is always fun.

        People will pretend that there’s a “big difference” between 150 fps and 120 fps in a game (when the delta is that much), but to the person playing the game virtually no difference at all is perceivable (in fact, unless you’re running a benchmark program you wouldn’t see any difference at all while playing and you’d *have to* run a benchmark to show the difference in frame rates.)

        Additionally, and I think more importantly, 95% of so-called “performance” games these days are run at decently high resolutions, say 1920×1200 & higher (which is where I am) all the way up to multi-gpus and 4k resolutions, and the games are virtually *all* GPU bound at those resolutions–the GPU then becomes far more critical to the customer than the cpu at that point. Intel is so far behind AMD on the GPU front that it is doubtful Intel will ever catch up (and Intel is almost as far behind nVidia, too.) These days in 3d gaming, the GPU is king–unless you want to drop back to 640×480 or 1024×768 to play your games–and that would be about as ugly as sin and pretty much spoil the whole thing…;) It’s simply absolutely true that the GPU today is far more important to playing a game at decent resolutions than any cpu either Intel or AMD currently manufactures. With AMD, too, as I said, you get the added benefit of saving what is sometimes significant amounts of money–for cpus. For GPUs, Intel offers *nothing* competitive to either AMD or nVidia.
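
        For what it’s worth, the 150fps-versus-120fps point above comes down to raw frame-time arithmetic (purely illustrative):

            # Frame time at a given frame rate: 1000ms divided by frames per second.
            for fps in (120, 150):
                print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
            # 120fps is ~8.33ms per frame, 150fps is ~6.67ms: a gap of under 2ms per frame.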

        • Geebs says:

          I guess you also like paying for all that extra electricity too? Brand loyalty can be a weird thing.

          • SkittleDiddler says:

            Believe it or not, some people aren’t concerned with the minimal difference in power consumption between Intel and AMD offerings.

            Kind of a petty thing to bring up, although others have felt the need to point out Intel’s mind-blowing fantastical revolutionary power savings in the comments here, so I can’t really blame you for trying.

          • TacticalNuclearPenguin says:

            Still, the fact that the Bulldozer architecture has weaker IPC than even the Phenom that came before it says something, since the latter wasn’t a technological marvel to begin with. AMD likes to tell you that they have more cores, while in fact it’s just an architectural stretch that has a lot of drawbacks. Not “real” cores anyway.

            Games DO enjoy IPC, a lot. Games are not like encoding with WinRAR or raytracing: more cores are great, but games don’t scale as well in that department, as there’s always going to be a main thread that has to be synchronized. At the end of the day, some extra per-core agility is hugely preferable.

            The latest, biggest 8 “cores” from AMD have some decent scaling in the newer games, but we’re still not there if you simply want to be prepared for any situation, especially if you like to play the usual culprits like MMOs, and it’s useless to argue that there’s no technological gap when AMD released a 5GHz CPU with a 220W TDP.

            220W TDP. If Intel did the same thing you’d have a monster; my “ancient” Sandy Bridge clocks at 4.6GHz and HWMonitor puts it at around 90W when stress tested, and it’s a CPU that’s still stronger overall than anything AMD has put out, now or in the near future.

          • SkittleDiddler says:

            @TNP: Don’t get me wrong, I’m not arguing against discussing thermal issues under the proper context, but it’s far too often brought up in these kinds of debates by Intel fanboys who are only looking to one-up someone who dares to mention they use and appreciate AMD hardware.

            The average user (hell, the average gamer) doesn’t give a shit about TDP, thermal output, or wattage, and the ones using AMD procs are certainly not going to be too concerned over the fact that they’re paying $20 more per year on their electrical bill than the Intel user they live next door to.

            Also, your comment on MMOs was a little strange. Multicore AMD procs run MMOs just fine, outside of unoptimized titles like Planetside 2. Is there some widespread problem I’m not aware of?

          • TacticalNuclearPenguin says:

            The widespread problem is that most do indeed need a strong core. You mentioned PS2, but there’s WoW, Tera and so on; it’s just that MMOs in general are a tougher beast, and sometimes it’s not even just about removing a CPU bottleneck from a strong GPU. I’m talking about maximizing performance here, not a personal view of what “fine” is.

            Then again, I’m not really discussing what’s “necessary” for most, nor the meaning of such a word in the context of a hobby approached with wildly different budgets. I’m just plain trying to say that Intel, at various price ranges, is a steady and solid offering that has no Achilles’ heel, just that.

            And it’s not about fanboying either; it’s simply that AMD is focusing on other stuff, and when it comes to integrated GPUs it’s actually better than Intel. The thermal comparison was simply a means of explaining how far behind AMD is when it comes to “straight” CPUs, as a measure of how hard they have to try and just that, and I don’t think this point in particular is really much debatable.

            To end with some joy, have this anecdote: there’s always a use for CPUs. In PCSX2 (god bless custom resolutions and anisotropic filtering) I’m emulating FF12 (everyone should give it a go, it’s a masterpiece), and without the frame limiter I’m going 4x speed for levelling. It’s genius stuff, I tell you!

          • Geebs says:

            Don’t be daft. The poster I replied to was promoting AMD on the basis of 1) personal preference, 2) a “good enough for me” functionality argument and 3) being a bit cheaper at point of sale. Personal preference to the point of liking the feeling of having GPU and CPU from one manufacturer is mere superstition, just being functional is not a reason to recommend a product, and the price of an electrical component without considering running costs is a simpleminded approach given the current ridiculous rate at which energy costs are rising. Clearly this makes me a fanboy?

          • SkittleDiddler says:

            @TNP: Thanks for clarifying that. I’d still argue that AMD CPUs are just as capable of maximizing performance with the right combination of hardware as Intel setups are, but I get what you’re saying.

            @Geebs: waltC laid out his reasons for preferring AMD hardware pretty clearly, and your response amounted to “but..but..extra electricity…brand loyalty…”. Nowhere in his post did waltC even hint at “it’s just good enough for me” — that was your personal interpretation of what he wrote.

            Yeah, you came off as kinda fanboyish.

      • SuicideKing says:

        AMD’s done 10% in 6 years, that’s why.

      • Low Life says:

        AMD aren’t even trying. Like literally, they don’t care, they’re focusing on the APU business.

        • TechnicalBen says:

          This. Anyone getting an internet browsing/office PC gets recommended an Intel for on-board graphics.
          Anyone wanting something similar that can also play games gets recommended an AMD APU, as it is compatible (Intel graphics has far too many bugs).
          That’s if they don’t want a dedicated GPU. Intel just cannot touch it, and never has. Even on motherboards I’d have recommended AMD in the past (it’s now all on the CPU AFAIK, no motherboard-integrated GPUs around, or very few).

  4. aratuk says:

    Haswell is about energy efficiency… exciting for the less deskbound. Or those who mind their utility bills.

    The difference in battery life between Haswell laptops and the models that immediately preceded them is fairly incredible. It’s odd to hear it described as “underwhelming” and “foot-dragging”.

    I mean, HURRRRR GRAPHIXXX GAMEZ!!

    • soldant says:

      If you’re only interested in desktop processors then yeah, there’s nothing to be excited about. But like you, I don’t see how it’s ‘foot-dragging’ if they’re releasing processors with (somewhat) better performance and lower power consumption. Even on my desktop I’m pleased to have something that uses less power or puts out less heat. Really, the article reads like a lamentation on the loss of the 6 month upgrade cycle.

      Who the hell misses those dark days?

      • aratuk says:

        Yeah, the criticism here doesn’t seem to be that processors are missing some capability that it would be good to have. Just that (HURRR!) they aren’t getting faster fast enough.

        People here are complaining about “only” four cores, but for most purposes — especially games — it’s better to have fewer, faster cores, because the processes you want to run aren’t multithreaded enough to take advantage of eight or twelve cores (with necessarily lower clock speeds).

        And yes, it’s very nice to be able to hang on to a high-end machine for longer without it becoming noticeably less useful, while the silicon industry focuses on energy efficiency and mobile SoCs. People will always find something to complain about.

      • LionsPhil says:

        Really, the article reads like a lamentation on the loss of the 6 month upgrade cycle.

        Who the hell misses those dark days?

        Hear hear!

        • TechnicalBen says:

          To lose the 6 month upgrade cycle and get nothing in return though… “Boo!”
          If they lost the 6 month upgrade cycle but passed on the production savings, possibly. Though I guess lower demand would eat into any possible production savings, so the supply/demand scales knock prices back up and they stagnate along with performance/progress. :(

    • sophof says:

      Unless you’re running 24/7 (and even then) the savings are minimal. The difference for a desktop is a few kWh per month at most. Intel may be pretty awesome in this regard (an i3 is a serious option if you want to build a NAS, for instance), but this is also the only place where it is really going to make a difference. And even then you are talking on the order of maybe 40 euros of savings a year or so (I did the math for that when building a NAS, that’s why :P).

      I’m all for being more energy efficient, and small increments add up over the years, but it doesn’t work as a sales argument for the desktop at all. It’s more an ‘all other things being equal’ kinda thing.
      Intel right now gives you pretty great performance with low idle power, making it king for home servers and stuff. It also has the high-end covered (which frankly almost no gamer needs atm), but the middle ground is actually fairly competitive. And the middle ground is where most gamers sit I would assume.

      The only problem AMD has right now is that they don’t give you a compelling reason to go for their CPUs, not that they are way behind. They have been focussing on APUs for a while now and those are actually miles ahead of what Intel can offer. I doubt that’s ever going to be a factor for gaming though, separate GPUs won’t be replaced for quite a while.

  5. Monkey says:

    What about this Haswell-E thing? 6 cores and new DDR4 shiz niz? I’m still sitting on my i7 920

    • Sakkura says:

      Wait until Skylake. Or Skylake-E, it looks like that will get PCIe 4.0 support unlike regular Skylake.

    • SquidgyB says:

      i7 920 high five!

      Been running mine at 4GHz for years now… touch wood.

    • kevmscotland says:

      Running an i7 960 here.
      I keep toying with the idea of an upgrade.
      Currently looking at a 4770K with a new LGA1150 motherboard.

      I’d go socket 2011 but they really are rather expensive for a half decent mobo.

      Still dithering though. I don’t particularly struggle to run anything at the mo (i7 960, 12GB DDR3 at 1600MHz, Radeon 7970) but I feel like the next wave of ‘next gen’ games might push my poor old processor to breaking point.

    • SuicideKing says:

      Correction: 8 cores and DDR4.

  6. jasonsewall says:

    Disappointing analysis — there is a legitimate complaint to be had that desktop processors are not as sexy as they once were for gamers, and this article offers almost no insight. Why not comment on what low-power parts mean for ‘desktop’ PCs? What’s wrong with FMAs and AVX2? Do you think games will be able to use TSX? How about the MCDRAM packaged with certain SKUs of this microarchitecture?

    Have you tried running any recent games on one of the EX server parts? Do you think that many of them will benefit from 10+ cores? I would guess that no matter how scalable you make your games, you still want to support something modest (like a laptop), which means you can’t run with the FLOPs.

    I don’t even see how this fits into RPS editorially. What games did the RPS staff gush about most this year? Teleglitch? Papers, Please? These are not very demanding games, from a hardware point of view. So what? Isn’t it nice from a gamer POV to see at least some developers splitting from the GFX technology pack?

    This reads like an 800-word trollfest, not an incisive, thoughtful article. What would someone working on desktop parts at Intel take away from your post — would they have a sense of what you would like them to do?

    • Sheng-ji says:

      Slightly lower electricity bills nothing no less important than you think no certainly not and bugs would become a nightmare Day Z, Metro Last Light and Titanfall so far so good yes it is yes they would.

      (In my honest opinion, naturally)

      Done my best to answer all your question!

      • Universal Quitter says:

        You might want someone with a better command of the English language to retype that for you.

        • bhauck says:

          They answered each question in the order it was asked in idiomatic English. You might want to ask someone with a better command of the English language (or at least ability to parse poor formatting) to read that for you.

      • MichaelGC says:

        Anyone else getting Ulysses flashbacks?

    • DiTH says:

      Although I agree about the no-insight whiny article, we really need faster CPUs, GPUs, RAM or whatever comes our way.
      The VR age is right around the corner and we are still fighting to get 120fps at 1080p out of medium-to-high-spec towers…
      If a successful VR set releases in the next few years we will hit a wall with the current tech our boxes have.

      • Jeremy Laird says:

        Read in isolation, the whiny-no-insight critique would carry some weight, I grant.

        But I’ve posted plenty of content covering low-power hardware and the benefits thereof. NUCs, AMD Jaguar, Small form factor PCs, yada yada. This post is specifically in reaction to a line of refreshed mainstream desktop chips. That’s the context for this post.

  7. Baines says:

    Speaking of Haswell, was the Windows 8.1 memory leak ever fixed? The one present in Windows 8.1 from the start, but which was dismissed and mis-attributed for months? (I know it applied to Haswell owners. I don’t know if it applied to other integrated graphics solutions. Basically, if you had an on-chip graphics solution (that was not completely disabled) and ran a full screen program at a resolution setting different from your desktop resolution setting, then you suffered a memory leak that would eventually consume all your RAM and swap space, which was freed when you exited full screen. Users, without acknowledgement of Intel or Microsoft, finally pinned down the cause around December of last year, but I don’t recall ever hearing if it was eventually fixed.)

  8. Jazzyboy says:

    What? This article has some ridiculous accusations.

    Intel’s mainstream processors have in no way been stagnating. There may not have been any noticeable performance improvements, but they have seen some major power usage improvements.

    While that’s not important for PC users, Intel has in no way been lazing around for the past decade. Mobile devices are a major part of the future, and they’ve been focusing on that. To say their processors have been stagnating just because their focus hasn’t been on high-end PC is ridiculous.

    I do hope they bring some advancements to their high-end processors in the decade to come, but personally I’m not counting on it. Focusing on power usage rather than performance makes a lot of sense, and that’s why not only Intel, but most other manufacturers as well, are doing so. Soon, we’ll have flexible tablets. And maybe we could even get some flexible or projected keyboards to go with them. That means you can basically carry your work with you wherever you go, in your pocket, on a reasonably large screen (when it’s unrolled).
    Of course, mobile devices need more performance too. But they’re already quite a ways away from desktop performance, and to change that, chip makers need to lower the power to performance ratio.

    Once they finally meet the perfect point, at which they’ve got the most performance possible out of 6 hours (which I consider the minimum) on battery, then we can expect them to start taking it even further.

    Yeah, sure, more PC performance is great, but we’ve already reached the point where good laptops are more than adequate for anything but high-end gaming, so the only reason to get more performance from desktops is to play fancier games. But frankly, gaming’s just a hobby(even if it is a really fun hobby), and it’s a dwarf of a market compared to more business/work-focused tasks, such as writing long dull reports in Office.

    • Carra says:

      As a desktop user it’s frustrating to see how mobile phones have gotten 40x faster in 6 years time while desktop CPUs hardly seem to have moved at all (how long have we been stuck at 4 cores now?). Sure, I enjoy a fast phone, but please give me a reason to upgrade my desktop CPU.

      • soldant says:

        Why? If you’re upgrading for the hell of it without anything to really utilise it, you’re wasting money. I honestly don’t know why people want to go back onto that terrible upgrade treadmill that we were all on during the 90s and early-to-mid 2000s.

        Current reasons to upgrade mostly come down to lower power usage. If that isn’t interesting to you, don’t upgrade. When the new console generation is in full swing, expect the need to upgrade to rise. Otherwise sit tight. I really don’t understand why you need a good reason to upgrade from Intel – surely the need to upgrade is dictated by what your software is doing, not because something new just came out?

        • ukpanik says:

          “that terrible upgrade treadmill that we were all on during the 90s and early-to-mid 2000s”

          Ask yourself why you needed to upgrade then…and you will find your answer to “why”.

          • soldant says:

            The “why” back then isn’t applicable to now – particularly when it comes to gaming. The pace has slowed significantly compared to back then, each new upgrade in gaming is largely incremental and not revolutionary.

          • ukpanik says:

            There is a lot more to PCs than gaming. Video editing, for example, has become quite common now and pushes the CPU. A new CPU that would shorten a 2hr HD encode would be very welcome indeed.

        • P.Funk says:

          Considering how long it usually takes software to catch up to tech capabilities on the consumer end, the sooner the tech leaps forward the sooner we can see software start to take advantage of it.

      • SuicideKing says:

        Mobile has done in the last 6 years what the PC did between 2000 and 2008, but they are at a fundamentally different position in the technology timeline. They’re also starting to hit an efficiency wall in mobile.

        Anyway, I think blaming Intel for not increasing desktop performance more than 10% each year is silly. Why? Because:
        1) It’s very difficult to achieve that 10%.
        2) 10% year over year is a pretty big amount (rough compounding sums below). Compare an i5-4670K to an extreme edition Core 2 Quad and you’ll realise what I’m talking about. Currently a Core i3 will outpace my Q8400 in games.
        3) IGP performance has gone up many times.
        4) Memory bandwidth has more than doubled over the Core 2 generation.
        5) Power consumption is *much* lower, especially at idle.
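
        A purely illustrative compounding check on point 2, assuming a flat 10% gain per year (a simplification, but it shows why modest annual gains still add up):

            # Compound a hypothetical 10%-per-year gain over the rough Core 2 (2007-ish)
            # to Haswell (2013) span.
            gain_per_year = 1.10
            years = 6
            print(f"{gain_per_year ** years:.2f}x")  # roughly 1.8x from the 10% a year alone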

    • AngelTear says:

      (Also @Jasonsewall)

      In the last dozen articles, he has always specified that the stagnation concerns desktop computers and high-end gaming, and that long battery life is all well and good but doesn’t help gaming much – and, I’d like to remind you that this is a PC gaming website; what should the “Week in Tech” column focus on, if not on high-end improvements for PC gaming?

      He didn’t go into all the nuances of his argument and thinking just this once, in order not to repeat himself yet again, and suddenly Jeremy is an incompetent who deserves all the hate the internet can muster. Funny, huh?

    • Jeremy Laird says:

      As above, I’ve written numerous posts discussing the benefits of low power CPUs – everything from NUCs to tablets to AMD Jaguar and including the benefits even to desktop gaming PCs, not just for mobile devices.

      But this post was about a line of new mainstream, unambiguously desktop CPUs, and the notion of a flexible tablet is completely irrelevant to the appeal and utility of the chips being discussed.

      Suggest you go back and read through previous RPS hardware posts and thus be fully aware of the broader context of any individual post. Suspect it will help with your stress levels!

  9. Carra says:

    The whole ‘fast enough’ argument only goes so far. Sure, current CPUs are fast enough to visit Facebook, browse the web, use Office and play some flash games. They have been for fifteen years. But, seeing how the project that I’m working on took 3 minutes to compile and is now down to 1 minute after upgrading my 3 year old PC (mostly down to the SSD), I’m more than happy to see CPUs getting faster and faster.

    Still, it’s a fact that most desktop CPUs hardly get above 10% usage most of the time. It’s time someone created a useful program that everyone wants and that actually uses your PC’s capacity.

    • SuicideKing says:

      Details? As in, did you go from a Core i3 to a Core i7? Or a Core i5 to an i5? And think about it, you cut your project compilation time to a third. Had your project been taking 30 mins to compile, it’d be taking 10 mins now.

      Were you using a multi-threaded compile process? etc.

  10. danijami23 says:

    I guess Intel really has a chip on their shoulder towards desktops these days.

  11. Al82 says:

    If nothing else, this article made me sign up to the site to leave a comment. Sure, power efficiency isn’t a very sexy thing to most PC enthusiasts, but you can hardly accuse Intel of stagnating. The fact that the 4770K Haswell has improved clock-for-clock performance over the 3770K Ivy Bridge AND shaved 10 watts off power consumption is quite a feat of engineering.

    Intel has been investing its money in making x86 CPUs power efficient to a point where they can compete with ARM in the mobile market; that’s where the big growth is and who can criticise Intel for that? Do we actually need more cores for consumer desktop parts at the moment anyway? I’m pretty sure that Intel will have their finger on the pulse as to what software developers are looking for from their future hardware parts; when they want more MHz and more cores, they’ll appear.

    • Jeremy Laird says:

      Like others, you’re taking the idea that Intel’s mainstream desktop processors have stagnated in CPU performance terms to mean the entire company has been sitting on its thumbs. That’s very obviously not the case and other areas of Intel’s activities have been covered extensively on RPS.

      But this post isn’t about Intel’s all-round product development. It’s about a number of refreshed mainstream desktop CPUs. The CPU performance and experience you get from such chips has undeniably stagnated in recent years. In fact, if anything, they’ve gone a bit backwards in decosting the packaging. And this 100MHz bump does nothing to change any of that.

      There are real benefits to power efficiency and we’ve covered that topic pretty extensively on RPS. But that is not the context for this post.

    • Widthwood says:

      Clock-for-clock performance doesn’t matter much when an Ivy K typically overclocks much better than a Haswell K.

      And on any non-K processor you are GUARANTEED to have better performance with Ivy because of the silly artificial overclocking limitations in Haswell.

      Meaning by the time Broadwell arrives, Intel’s best CPUs will be 2 years old. I don’t remember such stagnation EVER happening.

  12. Keyrock says:

    Ah, the famous RPS brand of ‘journalism’ that this site has sadly become known for: telling some of the facts while neglecting others. Mention the meager gains in processing power while neglecting to mention the massive gains in power efficiency, or the fact that any gains in CPU processing power right now are irrelevant when it comes to gaming (and this is a gaming site), seeing as even a Titan Black Edition won’t get bottlenecked by an i5 or i7 Ivy Bridge, much less a Haswell CPU.

    • Bent Wooden Spoon says:

      Ah, the famous internet commenter, sadly notorious for talking bollocks & building strawmen for the sake of a few lines of pointlessly directed, badly written nerdrage.

      Your “point” had already been addressed, multiple times, a couple of hours before you decided to grace us all with your “thoughts”.

    • Widthwood says:

      Ivy was already very power efficient, certainly efficient enough that it didn’t cause any real problems with heat dissipation.

      And the absolute power efficiency gains on high-performance desktop CPUs are so microscopic they are completely irrelevant to end customers. If a 10-15 watt saving while gaming matters to someone – they will be much better served by ARM PCs anyway.

      10-15 watts is roughly what routers/wifi APs/DSL modems need – how many people do you know who turn off their routers when not using their computer?

      • evilsooty999 says:

        I do! But I’m the type of person with ADD about leaving things on standby, who religiously goes round turning things off at the wall when I’m not using them.

        • Widthwood says:

          Ha :) Is that an electricity price thing? Rough calculations show that a 10 watt modem working 24/7 for a whole month would cost me the equivalent of 62 US cents – hardly anything to be obsessed about…
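
          That 62-cent figure checks out if local electricity runs somewhere around 8-9 US cents per kWh; the rate assumed in this quick sketch is purely illustrative:

              # Back-of-envelope check of the 62-cent figure above.
              watts = 10
              hours = 24 * 30              # one month, running 24/7
              kwh = watts * hours / 1000   # 7.2 kWh per month
              usd_per_kwh = 0.086          # assumed local rate, for illustration only
              print(f"{kwh:.1f} kWh -> ${kwh * usd_per_kwh:.2f} per month")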

  13. uh20 says:

    not to mention you buy the box with the processor on top covered by a thin piece of plastic, wonderful breakiful display case. on the bright side it might be easy to steal >_>_>_>_>_>

    intel is progressing slowly all around, but then again it looks like we’re slowly walking away from isolated processors, because the real speed improvements are to be found in larger cache sizes, efficient core setups (like 1 fast core and 3 “ok” ones, instead of just 4 identical cores) and more integration with graphics, as desktops specifically could benefit from much more advanced and bulky combined/integrated graphics on the processor and a motherboard interface to support such a thing.

    at least on the laptop side, intel is making some nice power reductions, although to be fair that improvement should have been made a while ago

  14. SuicideKing says:

    Okay, some things that are worth mentioning:

    1) Intel’s had Broadwell working since last year’s CES. They apparently got delayed from the first half because of yield issues initially, but now it’s believed that the “delay” to Q4 2014 (December, really) is to allow Haswell inventory to clear.

    2) Broadwell, for the longest time, wasn’t supposed to be for desktops anyway. This is why we are getting the Haswell refresh. Then there emerged some rumours about a desktop part, but looking at the shift in schedule, I find it unlikely.

    3) Broadwell’s “desktop version” is likely to be Skylake, with DDR4 support. In quotes because Skylake should be a new microarchitecture on 14nm, so it’s not really Broadwell.

    4) Haswell-E will apparently have between 6 and 8 cores, and support DDR4.

    In other news, Microsoft’s launching DX12 with Windows 9 (most likely): http://techreport.com/news/26123/it-official-directx-12-to-be-unveiled-at-gdc

    • Fennster says:

      1/2) I guarantee you that Broadwell’s delays are from yield issues. Die shrinks are not easy; they require a lot of iteration. The company would much rather put out Broadwell to sway OEMs to start putting it into tablets and such than let ARM continue to dominate that market.

      Also there’s been a lot of internal debate at Intel about where to go with new chips. Anyone who knows the company can tell you that they’re pushing all of their resources into becoming the de facto mobile/tablet chip. They don’t have the people to pour into one high-end desktop team for enthusiasts and one super power-efficient mobile team, hence the scaling of chips you’re seeing for Broadwell and beyond. The sad fact for you and me is that we just don’t generate that much money for Intel compared to mainstream/entry level markets in emerging economies such as China.

      3) Broadwell has always been the die-shrink of Haswell. Skylake has always been the next architectural jump.

      I’m not an Intel apologist, I don’t condone their anti-competitive practices, but at the same time they aren’t some evil corporation trying to rip you off. They’re struggling to find where they fit in a post-desktop world where margins are getting smaller and smaller which they traditionally are not comfortable with. Even their former CEO has pointed out that Apple originally came to them pitching the iPhone, but they were unwilling to drop their prices for Apple so they went ARM, and he admits his mistake.

      • SuicideKing says:

        You’ve made a lot of solid points, though I’m not completely on board with the die shrink theory. The initial push from Q2 to Q3 was most likely because of the die shrink, but from whatever I’ve read the decision to move Broadwell to December 2014 had more to do with inventory issues.

        Intel does have two separate teams (I think Haifa works on efficiency, while the other works/worked on performance in the tick-tock cycle) to tackle both ends; it’s just that throughout the company, the focus is now on efficiency and low power. They have their server chips, their mainstream line (still more important for India and China, at least in India tablets aren’t that big yet), low power Core and Atom.

        From what I’ve been hearing, Intel’s getting less interested in smartphone class chips and is more interested in tablets. In November/December, Airmont should launch before or alongside Broadwell. Till then they have to get Merrifield, Moorefield and Bay Trail everywhere they can.

        The scaling being slow is simply because:
        1) You can only take so much IPC out of a mostly similar architecture. There’s a reason why AMD’s given up on the CPU front. Nvidia could extract performance from Maxwell because they’re dealing with parallel workloads.

        2) Since Sandy Bridge, their focus has been low power, integration and efficiency and the IGP. So while they have more room for resources, they’re dedicating it to integration and IGP.

        3) No competition from AMD. Nvidia is kept on its toes because AMD can still provide similar performance/$.

        4) From (2), they need to add more cores at a lower point in their product stack, and without the IGP, they have plenty of space for that. We’ll likely see that happen with Skylake.

        5) For people like you and me, the number of applications that use multiple cores is still low. Games are starting out now, with the old consoles being phased out and PC becoming important again. Intel probably doesn’t see much point in spending money where returns won’t be much. As you rightly point out, you and me aren’t primary customers any more, because Intel knows that whatever they provide will get the job done for us, better than the competition.

        I’m aware that Skylake was always supposed to succeed Haswell/Broadwell, I just meant that the “desktop Broadwell” that people talk about is likely to be Skylake now, with Broadwell being mobile only.

        We’ll likely see Intel focus on us after Skylake hits the scene, as DDR4 and all should be out by then. All the Pentium 4s still in circulation are slowly becoming inadequate, and once XP support is dropped, the drive to upgrade may finally trigger more sales. Overclocking may fall out of favour with the company, though.

        • Widthwood says:

          Your number 5 is quite debatable. Almost all processes that *take time* on a typical computer – converting, archiving, rendering, etc – use multiple cores nowadays. Anything else is usually instant, at least with an i7 on an SSD.

          • SuicideKing says:

            Yeah, I know. You and me are the few who do stuff like that. For most other people, it’s word/ppt/web/email…for gamers, it’s games too…

            Basically what you were saying about us not being the target market anymore. Intel expects us to shell out big bucks for LGA2011. It’s unfair, but what can you do? Their priorities are somewhere else now…it’s sad, not denying it. I actually wish they’d go all out and give us a pure CPU in the K series.

            Economics won’t make that feasible, they’d probably have to make a new mask for it…

  15. darkhog says:

    Loading watch_dogs trailer… Available on 27/05/2014

  16. Syphus says:

    Well, I read this whole thing, I’m pretty sure the only thing I truly understood was the Simpsons reference.

    • Jeremy Laird says:

      Yay!

      Pop culture references in captions are obviously the most important part of any post, so pleased one person got it. Don’t worry about the rest, it was all about the Sideshow Bob reference!

      • Syphus says:

        Whoo! As long as I got the most important part then I am satisfied.

  17. Tekrunner says:

    Intel’s strategy makes sense if you realize that its main threat is ARM. Not only does ARM utterly dominate the mobile space, it is starting to go after the server market. With the increased popularity of large-scale distributed processing (Hadoop), it is a very credible move. And from there, why not move to desktops, which are rarely CPU-bound anyway? Intel is aware that it needs to make more power-efficient and cheaper processors. Otherwise it may end up being a niche market player in the long term.

    • Widthwood says:

      There are plenty of desktop ARM computers out there – efficient, tiny and fast enough for anything typical users do. But desktops are VERY CPU bound. Without Windows and Office and all the other programs, like Adobe’s entire catalogue, there is hardly any point in having a desktop like that at all. Not that most desktop users NEED faster CPUs, but forgetting its main market hardly seems wise. Even though it is slowly shrinking, PCs will in any case remain the main source of Intel’s income for years and years to come.

      Now notebooks – that’s another story. Chromebooks seem to be really taking off..

      • Tekrunner says:

        By “moving to desktop” I meant producing processors that can do most of what an Intel processor can. But that part of my comment was highly speculative anyway. What matters right now is the server market. Intel had managed to basically eliminate competition there (who ever buys SPARC servers anymore?), but now ARM is coming after them.

  18. Ladygrace says:

    I am running the Intel® Core™ i7-4820K processor (10M cache, up to 3.90GHz). It is OK, but if you look at the improvements to CPUs, they are not really doing anything with them apart from just overclocking them and then claiming it to be a new CPU. I have a fair idea that they are doing this to create more income.