A laying-on of hands with Intel’s new 10-core monster


What’s that you say? A $1,000 10-core CPU has naff all to do with real-world gaming? To which I might riposte, who cares? Get a load of all 10 cores. Behold 20 threads humming away in Task Manager. Or I might not. But I have had a go with the new Intel Core i9-7900X. Here’s wot I think.

Wot I think is that the 7900X doesn’t deserve to be taken seriously. Or to be reviewed properly. So, I’m not going to.

That’s not because it isn’t fast. The reason I don’t think it deserves to be taken seriously is that the 7900X is symptomatic of what amounts to a rotten corporate culture at Intel. Strong words, I grant, and not everyone will agree.

Either way, I’m not going to take it seriously. Instead, I’ll tell you that it sits nicely in the hand and feels weighty and expensive, albeit only in that somatic way that makes something feel expensive when you already know it’s expensive.

And that’s it.

I won’t tell you how it performs because, firstly, none of us are going to buy it. So it doesn’t matter. And secondly, the 7900X, the manner of its launch and the non-launch of its new many-core CPU brethren make it all too obvious how out of touch Intel has been of late.

The way I see it, for far too long Intel has set itself the wrong task. It should have aspired to make great CPUs. The end. Instead, it aimed to produce the worst CPUs it could get away with. And then dressed that up as progress.

For a while, that worked. There was no competition. Then AMD’s Ryzen popped up and, though it’s arguably flawed as a gaming CPU, Intel panicked. The result is the shitstorm of CPU models I reported on recently, and none of it makes any sense.

My impression is that the 7900X was originally intended to be the top chip. As a consequence it’s clocked very high, high enough in all-core Turbo mode that outperforming it with CPUs configured with more cores will, I suspect, be pretty tricky. Which may be part of the reason why all those CPUs have yet to be released.

It wouldn’t actually surprise me if Intel adjusts its plans for the line-up of CPUs it intends to stick in the new LGA2066 socket, into which the 7900X slots. It’s a mess. Bizarre four-core models with bits turned off that are inexplicably more expensive than the mainstream socket LGA1151 chips they’re based on, and which hobble the motherboards into which they are inserted. Numerous models that have been partially announced, with nobody knowing when they will arrive, and which seemingly only exist to act as a spoiler to dampen sales of upcoming many-core AMD Ryzen chips. It all has a general air of chaotic improvisation.

The whole sorry story smacks of an entrenched operator that has grown defensive to the point of self-destruction. Everything about the way Intel plans its products now seems overwhelmingly negative to the point that I honestly think there’s a certain belligerence to the way it views its customers. Yes, it exists to make a profit. I have no issue with that. But there are many ways to skin that particular cat. And if you don’t respect your customers, you can’t really expect your customers – or the media – to respect you.

I’m also reminded of a somewhat analogous situation in my other occupational muse, the car market. For some time now, Porsche has been rolling out big-ticket limited edition car models that are sold out before they are even officially announced as products, which is a nice trick when you think about it.

Anyway, the somewhat preposterous net result is that Porsche keeps on launching new cars that you can’t buy unless you’re part of a tiny club of favoured clients, the membership criteria for which are opaque to say the least. And yet, Porsche goes through the usual press launch routine. The hacks turn up. The column inches and YouTube video content are cranked out. The circus goes on.

In a similar fashion, I’ve been as guilty as the next hack of toeing the line, of reviewing the latest infinitesimally incremental new mainstream Intel CPU release. When really I should have said nothing, because the new chips changed nothing and didn’t merit serious coverage.

Of course, you could accuse me of having my cake and eating it, of not reviewing the 7900X but still giving it airtime. I grant there’s also a certain hypocrisy tied up in all this as it pertains to gaming specifically and the limited role of the PC processor as a bottleneck to gaming performance.

But that’s a complex chicken-and-egg argument. Moreover, saying nothing at all isn’t really viable. On a one-off basis, it has zero impact.

So this is my review of the Intel Core i9-7900X. And it’s no review at all.

80 Comments

  1. Drib says:

    So basically, wait a while for releases to calm down and companies to get their shit together again?

  2. Zigsaz says:

    Is Intel specifically marketing these as gaming CPUs? Maybe I’m missing something, but wouldn’t these still be a hot sell for other high-end uses? (Simulation, development, machine learning, etc.)

    • Ghostwise says:

      Even if it was marketed as a combination waffle iron/soap bubble maker, the RPS angle would presumably be its PC gaming applications.

    • Blad the impaler says:

      Yes, that’s a valid point. The problem is Intel has put the cart before the horse here. As the article kinda says, the new chips don’t make full use of the hardware they plug into. They’ll be better at what you’ve mentioned – but they could be so much better at everything, including what you’ve mentioned. Infinitesimally incremental iterations from Intel – until the competition delivers a perceptible threat. That’s basically what this crap is.

    • Czrly says:

      Except that we have GPGPU (Cuda, etc) for that!

      Honestly, I do not understand the point of these high core-count processors. If you’re really running a high-performance workflow (machine learning, simulation, physics, numerical methods, wotnot) you’re probably going to end up doing it on the GPU in the near future if you aren’t already. If you’re running a server, you should be using some light-weight containerisation layer — a cluster of cheap metal hosts is way more effective than a single node with any number of cores!

      • Archonsod says:

        I suspect it’s aimed more towards the virtualization market than high performance applications.

  3. keefybabe says:

    The situation you describe is why I’ve not seen any point in upgrading my 2012 CPU. In 5 years there’s been fuck all progress.

    • Archonsod says:

      The traditional hardware market has been stagnant for quite some time, and the desktop segment is in pretty much terminal decline. Intel is treading water largely because there’s very little economic sense in doing anything else.

      • Asurmen says:

        Except GPUs, and since when is it in terminal decline?

        • MajorLag says:

          While GPUs have made significant gains in performance lately, that performance has largely been useless outside of e-peen waggling. My aging GTX 560 Ti has yet to be bested by any modern game I’ve thrown at it, for instance.

          • Asurmen says:

            As usual with these things, you need to state resolution, settings, the types of games we’re talking about and frames per second.

            My 290X for example isn’t running as fast as I’d like at 1440p on something like Mankind Divided on high/ultra settings (about 40-50fps), and I can’t imagine 4K running too well on it.

            So by e-peen wagging, what you actually mean is you’re playing older games at lower res with lower settings and you’re happy with that. Some of us aren’t.

          • Sic says:

            Then, dare I say, you aren’t throwing very relevant games at it, nor do you have very high standards.

            For running most modern games at 1920*1080@60Hz, we’re talking at least a GTX1060, these days.

          • SenorRoboto says:

            Just because you’re playing less demanding games doesn’t make that true. My experience with a faster 560Ti 448 Edition at 1680×1050: BF4 came out ages ago and was only smooth on near-minimum settings. Far Cry 3 was choppy no matter what. Dirt: Rally (choppy, made it easy to crash), Abzu (choppy on low), Assassin’s Creed IV: Black Flag, Mad Max, Sleeping Dogs, all either needed to be cranked way down or were still choppy on low.

        • Archonsod says:

          It’s been in decline for around the past decade. Mobile is where it’s at these days.

    • RichUncleSkeleton says:

      Beats the hell out of having to upgrade every 2 years.

      • Unclepauly says:

        Not in my opinion. Technological progress shouldn’t have to halt because a section of the users don’t like it.

        • RichUncleSkeleton says:

          If technological progress meant more than just prettier graphics maybe I’d agree.

    • phuzz says:

      I only upgraded before the five-year mark because I needed/wanted a new motherboard.

  4. Baines says:

    If people unimpressed with Intel’s tactics, and the incremental increases, refuse to review new chips, then the public will only be left with PR and positive reviews.

    Mind, the new Intel chip plans have had enough of a negative public reception that even some more favorably biased reviewers have taken issue.

    • waltC says:

      There is reason enough not to take this CPU seriously, and that is that neither it nor the motherboards required to run it are available for purchase. But this is certainly nothing new on the part of Intel; the company has *always* been this way. There are so *many* things I could recite from memory, but I’ll just stick with my favorite.

      It was the “You don’t need 64-bits on the Desktop” advertising campaign that Intel ran back when AMD had just released the first 64-bit desktop x86 CPU, the A64. Intel’s idea was that it could move everyone to Itanium, which not only required new motherboards, chipsets, and so on, but all-new software; Intel wanted you to run an emulator for your older software, sort of like IBM’s OS/2 ideas about Windows/DOS-compatible software…;) And of course, the Itanium bus was not up for licensing to third parties, oh no. Intel was bound and determined that *this time* it would bloody well keep its monopoly intact. With Itanium. One of the biggest dud CPUs ever made, supported for tiny groups of eccentric people.

      As I say, there are many other examples, like Intel investing $1B in Rambus, which ended in total defeat: just before Intel pulled out and abandoned the RDRAM standard, it couldn’t even manage to *give RDRAM away for free in CPU/RDRAM bundles*! The market chose the backwards-compatible DDR SDRAM standard that AMD backed instead. The list of Intel’s attempts to build monopolies and run competitors out of business is a long one. The company succeeded in running every CPU company that emerged out of business, except for AMD, the only CPU company to have beaten Intel technologically.

      Another site that got one of these 10-core engineering samples to review mentioned that the system they received crashed as often as it was up. So, yes, this is very much a knee-jerk reaction on the part of Intel to AMD’s Ryzen. Like Apple, Intel is not accustomed to competition and becomes a rather bad actor in its presence.

      BTW, Ryzen’s “problems” with gaming will be short-lived, for those anal enough to notice them at present…;) The BIOS updates AMD has given motherboard vendors are just half the story; the other half is game devs properly optimizing their engines for Ryzen. I well remember when the Athlon shipped, also a brand-new architecture from AMD that changed the CPU industry (Intel adopted AMD’s x86-64 through a cross-licensing deal and Core 2 was born). At first it beat the original Pentium (not the Pentiums sold today by Intel) at some things and lost to it at others, but after a year had elapsed there wasn’t a gaming site on the Internet using Intel CPUs anymore. It stayed that way for years, until AMD tried to emulate Intel and dropped the ball… I don’t know if the Ryzen case will be just as dramatic in a year, but I do know that under its present management (MIT-educated Lisa Su) it won’t be dropping the ball again in the foreseeable future.

  5. Osito says:

    Please stick to talking about that which you understand (computers) not that which you obviously do not understand (cars). While I understand the allure of the Intel/Porsche analogy, it is deeply flawed, and was done gratuitously. I hope you will take this advice on board for the future. Apart from that, good article!

    • Sarfrin says:

      Jeremy “former editor of iCar magazine and car guru for T3” Laird, doesn’t understand cars?

      • grundus says:

        I’m not saying he doesn’t understand cars, but having a certain job doesn’t automatically mean you understand what the job is about, you could just be a good bullshitter.

    • ColonelFlanders says:

      To use a car term, this comment is really going to backfire on you. Of anyone at RPS, Jeremy is probably best qualified to talk about cars.

    • Rikard Peterson says:

      How about you describe what was wrong with it instead of just throwing around vague insults?

    • SenorRoboto says:

      I found the Porsche guy!

    • Jeremy Laird says:

      I understand cars substantially better than I understand CPUs. Which is not to say anything about how well I understand either in absolute terms.

      The analogy was not one that was meant to map well, hence the qualification ‘somewhat’ analogous. It was chiefly about nonsense marketing and nonsense products in broad terms.

  6. Kingseeker Camargo says:

    And here I am, all kinds of happy with my new G4560 CPU.

    • distantlurker says:

      Pretty much sames. My i5 2700k is, what? 5 years old.

      Still doesn’t throttle 95% of the games I play.

  7. GenialityOfEvil says:

    Honestly, this seems very naive. It’s well documented that silicon chipsets are coming to their end, and Intel is currently investing most of its R&D into figuring out what comes next, which takes money away from developing new products. All that would happen if they pursued the fastest chips with each iteration is we would hit the cliff long before a replacement was ready. Let’s not forget that AMD hasn’t exactly clamoured forth to take the reins.

    Yes, ideally we want them to pursue the best products possible, and we shouldn’t excuse corporate mediocrity, forsaking progress for profit. But the CPU market isn’t exactly normal right now. We wouldn’t excuse a car manufacturer deliberately withholding fuel efficiency or horsepower from its new models, but the combustion engine isn’t exactly running out of capacity; CPUs are.

    • Rikard Peterson says:

      The combustion engine is rapidly about to become a thing of the past. Countries have outlawed them (in the future), better alternatives exist, and car manufacturers are phasing the combustion engines out.

      • GenialityOfEvil says:

        Yes, but it’s becoming a thing of the past because alternatives are surpassing it, not because it’s reaching any kind of limit. CPUs are built with transistors that can only shrink so much before you start getting electrical interference between them. We’re approaching that limit now. By 2022-24 CPU dies will need to shrink to their physical limit to keep up with Moore’s law, and then that’s it: silicon will be unable to provide any further advance in power.

  8. foszae says:

    From the title i was hoping for a much more satirical article about some weird faith-healing malarkey.

  9. BarneyL says:

    So 7/10 overall?

  10. syllopsium says:

    Sorry, but this article shouldn’t be taken seriously, either.

    AMD hasn’t done much better. Despite their difficulties with clock speeds and IPC until Ryzen, they have done virtually nothing novel with instruction set enhancements when they could have done (there’s finally the memory encryption in Ryzen, but frankly they could do more). The most interesting things AMD creates other than Ryzen (which is, yes, surprisingly good, albeit not as good as Intel’s best, for which you pay much more) are their lower end APU chipsets and embedded options.

    For years Intel’s Xeon systems have fit more cores in the same thermal envelope, but barely increased single core speed unless it’s by instruction enhancements (FMA, etc). The desktop chips aren’t that different, probably because Intel is finding it difficult to boost single core speed beyond a certain level.

    Porsche are a poor comparison; they are now a very successful investment firm that happens to also manufacture a small number of cars.

    I’d also question the ‘none of us are going to buy it’ line – it all depends on price. Xeon processors have more cores and lower clock speeds; desktop processors are the other way around. Those of us who want to do a load of virtualisation (cores) and maybe a bit of gaming (clock speed) could be interested in the new desktop options.

    As it is, personally I’ve gone for second hand older Xeon for virtualisation, and if I ever get round to VR or high end gaming it’ll be with a fast four core system fairly overclocked, because who cares if you lose a couple of bits occasionally when playing games?

  11. atarikafa says:

    I have both a 7700K and a Ryzen 1700X PC.

    I will just say this:
    Fuck Intel i7. Ryzen is the best CPU for this year. Maybe next year too…

    The i7 is not better other than getting 10-20 more fps in games. That’s it, just 20 fps in games.
    Stop buying Intel i7

    • vahnn says:

      Um… Why, exactly? If you’re building a machine for gaming, all that matters is framerate. If another CPU for the same price gets you 10-20 fps more, why would you take anything else?

      • ravenshrike says:

        In actuality there are plenty of things that matter besides average framerate: minimum frames, frametimes, and the delta between frames. That last one is very important to a smooth gaming experience. In comparisons between Ryzen 5 and 7 and the 7700K, many people have pointed out that their experience with Ryzen was butter smooth, whereas in many games that can’t be run on a potato you get intermittent hitches on the 7700K and 7600K, due to massive frame deltas that are noticeable and break immersion for a moment. Worse on the i5, of course. So unless you are playing twitch games competitively, there is little advantage to buying a 7700K or its X299 counterpart, let alone the i5 chips.
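
        To make those metrics concrete, here is a minimal sketch in Python (the frame_pacing_stats helper and the frame times are invented purely for illustration, not taken from any benchmark) of how average fps, a “1% low” and the worst frame-to-frame delta can be derived from raw per-frame render times:

            import statistics

            def frame_pacing_stats(frame_times_ms):
                """Summarise smoothness from a list of per-frame render times in milliseconds."""
                fps = [1000.0 / t for t in frame_times_ms]   # instantaneous fps of each frame
                avg_fps = statistics.mean(fps)
                # "1% low": average of the slowest 1% of frames, a common stand-in for minimum fps
                slowest = sorted(fps)[:max(1, len(fps) // 100)]
                one_percent_low = statistics.mean(slowest)
                # biggest jump between consecutive frame times -- the hitch you actually feel
                max_delta_ms = max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
                return avg_fps, one_percent_low, max_delta_ms

            # Mostly 10 ms frames with one 40 ms spike: the average fps still looks healthy,
            # but the 30 ms frame-to-frame delta is the stutter a player notices.
            print(frame_pacing_stats([10, 10, 11, 10, 40, 10, 10, 9, 10, 10]))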

    • phuzz says:

      10-20fps is the difference between a game playing like a dog, and being just fine.
      On the other hand, when the price difference for a 10fps gain is over nine hundred quid (i7 6950X vs Ryzen 7 1800X), I guess I’ll settle for just turning the graphics down a bit.

      • ravenshrike says:

        Not at framerates above 60fps, which is where both the Ryzen and Intel chips are if you’re not severely GPU bound. If you are GPU bound, you’re not gonna see that large a framerate difference.

  12. Be_reasonable says:

    3 things about AMD that worry me:

    1) Will the AMD chip have a compatibility problem with my favorite game? How about one of my old favorites? I am near certain that there is no game that only runs on AMD.
    2) Same for video cards. I have always regretted my AMD video card purchases.
    3) AMD benchmarks compared with Intel.

    Yes, I pay a certain premium for Intel processors. Yes, the games Intel plays with which bits are turned on or off, and the hundreds of dollars attached to those, irritate me. I know that I am being financially taken advantage of. But if there’s one thing I know, I won’t be sitting in front of a weird game error, a crash to desktop, or getting lag at the worst time and wondering, should I have bought Intel instead of AMD?

    I bet a lot of people are just like me in that regard. If AMD wants my attention, they have to promise the support, the backwards compatibility, and blazing fast benchmarks that are winners (not just participation ribbons).

    • mavrik says:

      But if there’s one thing I know, I won’t be sitting in front of a weird game error, a crash to desktop, or getting lag at the worst time and wondering, should I have bought Intel instead of AMD?

      Nothing like that will happen because of a CPU. Nothing.

      But this brand-brainwashed line of thinking is why you’re paying stupidass huge prices for processors that have practically stagnated for the last 5 years. Intel has no reason to actually improve your experience because you’ll just give them money no matter what kind of turd they deliver.

      • Dareg says:

        Nothing like that will happen because of a CPU. Nothing.

        It’s always possible for a piece of software to crash because of the processor, see: link to arstechnica.com

      • Be_reasonable says:

        From AMD’s very own website:
        Troubleshooting Crashes, Hangs, and Performance Issues in Games
        link to support.amd.com

        And about Ryzen:
        AMD FOUND THE ROOT PROBLEM CAUSING ITS NEW RYZEN PROCESSORS TO FREEZE DESKTOPS
        link to digitaltrends.com

        This is the kind of stuff I’m talking about and why it’s worth the extra money for me.

    • Chromatose says:

      Hello, one happy Ryzen 1600x user here. Switched over from an aging Intel chip a few months ago, and the transition has been painless. My rig is stable as can be, and has so far run everything I’ve thrown at it. So yeah, if it’s compatibility you’re worried about, don’t be. Ryzen runs the full x86/64 instruction set just fine.

  13. GurtTractor says:

    I’m slightly disappointed that you didn’t mention one of the most egregious things about these new CPUs, and all of Intel’s recent consumer chips from the past few years: the use of thermal paste between the silicon die and heat spreader, instead of the two being soldered together as in the case of AMD’s CPUs.

    It’s utterly ridiculous that you can get an improvement of 20°C or more by delidding such an expensive chip. It’s a problem that should not need solving with a vice (or an expensive custom-made tool), when they could just have manufactured the bloody thing properly in the factory in the first place!

    The venerable and still well-regarded Sandy Bridge chips used solder, and could overclock very well. Then Intel decided to abandon solder for cheap TIM in their subsequent CPUs for some reason. They even ended up releasing the Devil’s Canyon line as a refresh of the Haswell architecture, but with better thermal paste to combat crazy high temperatures. Their high-end desktop processors were using solder up until Broadwell-E; now with Skylake-X an expensive high-end CPU cooler is pretty much essential, and even with something like a big water cooler, getting the chip to its max overclock is pretty unlikely unless you delid it, which you should never have to do.

    I would just like this ‘feature’ of Intel chips to be decried far and wide (as it already has been to some extent across many of the online publications and video reviewers), so that they are forced to stop taking the piss like this, and match AMD’s manufacturing (and hopefully price/performance).

    For now anyway this is enough to keep me away from Intel chips entirely, and thankfully we have some incredible chips on offer in the form of Ryzen 5 and 7.

    • InternetBatman says:

      I was going to ask the reviewer about this. The marginal cost savings vs. the severe loss to system stability seems particularly egregious.

    • Czrly says:

      I was just scrolling through these comments to see if SOMEONE bothered to mention that. Thanks.

      Yes, folks, if you don’t think that opening your web browser warrants an instant jump from 28 to 100 degrees (at which point your CPU *is* throttled to prevent it melting itself) then you’re better off NOT buying an Intel CPU.

      No, cooling doesn’t help. That fancy water block you bought is just decorative, because there’s no point running cold water through cold copper: heat has to first get to the water block. Same goes for fan-cooled blocks.

  14. Jarl says:

    So, how long can you play dwarf fortress on it until you get fps death?

    • Stromko says:

      I don’t think this CPU would do anything for Dwarf Fortress; it would just have 9 extra cores not doing anything. Possibly 9 and a half idle cores, depending on how much the game relies on multithreading.

      To the best of my knowledge Dwarf Fortress is still a single-threaded game, so just about all the advancements in CPU ‘power’ over the last many years have been moot.

      • Mungrul says:

        Yeah, it’s 64bit now, but still not multi-threaded. And it’s the main thing that stops me firing up DF when I get the hankering. Framerate death just happens far too quickly with the increased migrant spam of recent versions.

    • xrror says:

      Yeah, Dwarf Fortress needs like a 20GHz dual-core, the second core just being for the underlying OS to run on (or have a DOS version =D). I’m no programmer, but I just imagine DF being a branch-statement torture test for processors ;p

  15. grundus says:

    I personally think the mess Intel has made of this is down to them being reluctantly dragged into a marketing war by AMD, because the i9 range is completely pointless – to use the Porsche analogy, it’s like a Cayman with 911 GT2 performance. They already have a gaming CPU in the 7700K, and they already have Xeons with plenty more than 10 cores and support for ECC RAM, which makes those a better choice for productivity and high-end not-gaming stuff, so who exactly needs this i9 range?

    CPUs have stagnated in the last 10 years, but there’s a chicken/egg situation anyway; if there was enough software to justify having more than 4 cores with hyperthreading, that would be the norm… But there isn’t. Console performance sets the requirements for 90% of games out there and games are far and away the most demanding thing that a normal person would use a PC for, so if an i7 is more than good enough to keep up with consoles, why should Intel pour millions or billions of dollars into improving CPUs if they don’t actually have to?

    They clearly are leading the industry and I don’t doubt they’ve got all kinds of prototypes and tech we don’t know about that would blow our tiny minds, but we don’t need it which is why we don’t have it. But they can’t let AMD have bigger numbers than they do, so they were like “Oh ALRIGHT” and now we have the i9 range.

    • Chromatose says:

      There’s plenty of software that benefits from higher core counts just now, especially in the creative industries. I think the prevailing assumption has been that anybody who wants to run a workstation for music, video editing, 3D work, streaming etc would either stump up for a Xeon or i7x, just put up with four cores, or buy a Mac.

      • grundus says:

        Yeah, that’s what I’m saying. Xeon for productivity, 7700K for top-of-the-range gaming; between the two is irrelevant. Anyone who really can constantly max out 4+4 cores should already have a Xeon; I don’t think anyone (except perhaps clueless hedge fund managers or super hardcore tech geeks who just like toys without necessarily needing a valid use to justify the huge expense) was waiting for this launch to buy something.

        • Chromatose says:

          Sorry, I should have probably been more specific that I don’t agree with that conclusion and why.

          In short, although I doubt creative pros are going to stop buying midrange Xeons or Mac Pros any time soon, a lot of creative software that really benefits from having as many cores as possible is actually affordable to the point of being mainstream now. You can get almost all the software you need for a modest digital content creation setup for just a few hundred pounds and/or a subscription fee. A decade ago that was practically unheard of.
          It makes no financial sense from that perspective to go for a Xeon system, or even an i7 Extreme, because it’s simply going to be out of a lot of peoples’ budgets. AMD’s push for greater core counts at relatively modest prices makes absolute sense from this perspective – sure, it’s not going to smoke the competition in terms of gaming performance, but for streamers, modders and indie developers, the mainstreaming of greater core counts is a huge benefit.

  16. Colthor says:

    Terrible review, didn’t even say how it tastes.

  17. MajorLag says:

    It seems to me, as a casual observer, that Intel is pretty frightened right now. AMD gave the world the backwards compatible 64-bit processor it wanted while Intel fumbled around trying to leverage its market domination to push a radical new architecture. They recovered with the Core line and brought about the age of multicore, winning back dominance for a long time but failing to do much further innovating.

    Meanwhile, mobile took off in a big way and low-power ARM got huge. Intel made some attempts at this market but largely failed due to, as I understand it, never really getting the whole “low-power” part. Now even Microsoft is talking about putting Windows on ARM, and it’s got Intel so scared that they’ve made vague, probably completely groundless, threats of legal action over adding an x86/64 instruction set mode to ARM processors.

    Then AMD puts out Ryzen and it has, for the most part, caught up with Core, leaving Intel in the position of having to justify their price gouging and give everyone a reason to believe they’re still as technically superior as they were pre-Ryzen. Consequently, a bunch of rushed announcements and some processors that really aren’t very interesting or indeed even practical.

    • Kamikaze-X says:

      They (i.e. research and development) aren’t frightened at all – it’s the shareholders that are frightened and pushing Intel to do something. Both Intel and AMD are in the same position in terms of architectures reaching the end of their useful life, in terms of Moore’s law being a severe limitation, in terms of yields at smaller lithography being more wasteful, and in terms of only so much extra IPC being squeezed out with diminishing returns.

      You’re right in regards to NetBurst, but I fully remember much of the press at the time, when it was the ‘thing’ to criticise Intel, deriding Intel for sticking together two dual cores to make the eponymous Q6600, which is considered the classic quad core from that period. Turn it around to when AMD were selling core-disabled processors because they couldn’t get good enough yields, and somehow this was ‘good guy AMD’ when it was anything but.

      Intel didn’t ‘get’ mobile because they released their chips as x86. It wasn’t to do with ‘not getting low power’. There’s a reason most low-power laptops and convertibles are Atom based. The Intel Atom phones were hobbled because they ran ARM emulation layers to run Android properly, which required extra processing power.

      In regards to Intel being ‘scared’ because one of the world’s largest bully-boy tech companies (MS) wants to use its patents to add x86/64 to ARM: they are fully within their rights to protect their IP. You realise that even AMD are paying Intel a license to use x86/64 instruction sets, right? Because, you know, they invented it with HP.

      Intel is not some sort of bad guy. They’re a business, much like AMD, and they have more leverage. Switch positions, and I’m pretty sure AMD wouldn’t be giving away processors to all and sundry.

      • MajorLag says:

        You may be right about some of that, though I sincerely doubt that Atom Android did much, if anything, special to work on x86, because x86 is an official compile target. You can run Android using Intel virtualization; hell, you can install some versions of Android on any old PC. Linux kernel, lots of open source and Java in the userland: nothing about that makes me think ARM emulation should be at all necessary.

        Also, AMD invented x64. On this I am pretty certain. They have some sort of cross-licensing deal with Intel that allows Intel to use it, not the other way around. Though honestly I wouldn’t have thought an instruction set would be copyrightable or patentable, but I thought the same about APIs and some stupid judge said they were after Oracle appealed, so there you go.

  18. The Fuzz says:

    I just hope this means prices on laptops go down. My last purchase was in 2014, with an i7, and it’s been fine, but it seems bizarre to buy a new laptop with similar specs at the same price so many years later.

    • GrumpyCatFace says:

      This, very much.

      Many gamers don’t have the space or desire to play on a desktop machine. I get nearly the same performance from a good laptop, but the prices only seem to go higher.

  19. Chorltonwheelie says:

    What a strange little hissy fit.
    I’m still not buying an AMD chip because they’re not as good as Intel’s.
    Dry your eyes mate.

    • Michael Anson says:

      Spoken like someone who likes throwing money away. AMD has the best power-per-pound value.

  20. Ragnar says:

    But if you won’t do a proper review, how will we know whether the integrated GPU is slightly less crap than last year’s model?