Intel’s 18-core CPU and, er, other exciting stuff

As I was saying, an 18-core CPU is obviously irrelevant for PC gaming. Admittedly, I was speaking then of AMD’s then-staggering 16-core Threadripper CPU. Two weeks later, Threadripper is already ancient news. It’s been comprehensively gazumped by a new 18-core CPU from Intel and suddenly the PC hardware landscape looks a little potty. I know I’ve been bleating for literally years about Intel’s sandbagging and how we needed AMD to spice things up. But this is a bit ridiculous. Be careful what you ask for…

First and foremost, this is a developing situation. Just two weeks ago, it seemed Intel was all set to unleash its first 12-core desktop CPU. Now the ante has been upped to fully 18 cores in an official announcement ahead of the Computex tech fest in Taiwan.

But the announcement was unusual in that it included some fully fleshed-out details, including the full model names of the new chips along with core and thread counts, but excluded other critical items such as clockspeeds and power consumption ratings. It’s pure speculation on my part, but this feels to me like a very last-minute response to AMD’s recent CPU roll-outs.

Anywho, the short version of this rather unusual story is that Intel is wheeling out no fewer than nine new processors for its equally new high-end platform, the latter comprising, again, a new socket in the form of LGA2066 and yet more newness courtesy of the X299 chipset.

Topping things off is the Core i9 (yup, the i9 family is new, too) 7980XE with 18 cores and 36 threads, but no known clockspeeds for now. It’s yours for $1,999, or very likely a painfully similar figure preceded by a pound sign, and is almost certainly irrelevant to all of us.

Ryzen 9 begat Core i9

Intel, of course, has long had the ability to fire out this kind of CPU. It has offered Xeon CPUs with almost countless cores for several generations. But it has taken AMD to finally force its hand. How many of these CPUs get sold into the high-end desktop market, I’m not sure. My sense is that chips like the 7980XE are as much or possibly more about ensuring bragging rights for Intel as actually generating sales revenue.

The i9 line-up also includes a range of nearly-but-not-quite-as-exotic 16, 14, 12 and 10-core chips, the last of which seems almost accessible by comparison at $999. But it’s really the new Core i7 models that are gamer-relevant. The new eight-core Core i7-7820X clocks in at $599 and has very healthy clockspeeds of 3.6GHz base, 4.3GHz Turbo and 4.5GHz Turbo 3.0 (that’s a special Turbo mode that actively determines which cores are capable of running at the highest stable frequency and steers demanding threads onto them).

If you can afford it, that does rather seem to offer a pretty comprehensive combo of single-threaded frequency and multi-core parallelism that should make light work of pretty much any game for a few years to come.
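As a purely illustrative aside, the favoured-core idea behind that Turbo 3.0 mode can be sketched in a few lines of Python. The real mechanism lives in silicon and drivers, and the per-core frequency ceilings below are invented numbers for a hypothetical eight-core chip, not real 7820X bins:

```python
# Illustrative sketch of the Turbo Boost Max 3.0 idea: each die has a
# couple of "favoured" cores fused with a higher stable frequency
# ceiling, and the scheduler steers the most demanding threads to them.

def pick_favoured_cores(max_stable_ghz, count=2):
    """Return the IDs of the cores with the highest stable ceilings."""
    ranked = sorted(max_stable_ghz, key=max_stable_ghz.get, reverse=True)
    return ranked[:count]

# Invented per-core ceilings: most cores top out at the ordinary
# 4.3GHz Turbo, while two happen to bin higher at 4.5GHz.
ceilings = {0: 4.3, 1: 4.5, 2: 4.3, 3: 4.3, 4: 4.5, 5: 4.3, 6: 4.3, 7: 4.3}
print(pick_favoured_cores(ceilings))  # the two 4.5GHz cores: [1, 4]
```

On the real chip the favoured cores are identified at manufacture and exposed via a driver; the point is simply that not all cores on a die are created equal.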

But I’d say it’s the new Core i7-7800X that’s most interesting. It’s not exactly cheap at $389, but it is Intel’s most affordable six-core CPU yet and offers a decent Turbo speed of 4GHz. Certainly, it would be a tough choice between the 7800X and the lower end of AMD’s eight-core Ryzen 7 family. AMD still offers better value, to be sure. But Intel is at least more competitive at that rough price point.

I count 20 cores. Like I said, a developing situation…

As for the oddball Core i7-7740X and Core i5-7640X, which will sit on the bottom two rungs of Intel’s new LGA2066 range, I’m not sure what to make of them. My understanding is that they are the existing Kaby Lake quad-core CPU dies, as seen in the Core i7-7700K and Core i5-7600K for the mainstream LGA1151 socket, but rewired for LGA2066 and with the integrated graphics disabled.

At $339 and $242 respectively, they’re at price parity with their LGA1151 cousins – at least at pre-announcement pricing for the existing pair – though they offer very slightly higher base clocks but not higher Turbo speeds.

The only way I can see to make sense of them is as gateway CPUs for the LGA2066 platform. In other words, they help keep the overall up-front cost of upgrading to an LGA2066 PC under control and allow a future upgrade path to those big core count CPUs.

As things stand right now, then, these latest developments from Intel and AMD arguably don’t have much direct impact on the choices most gamers must make when configuring a new PC. All the 10-core-and-beyond models are simply too much money for far too little gaming relevance.

Ermegerd, so many pins… er, pads!

Even Intel’s new six and eight-core CPUs are probably too pricey for the majority. Meanwhile, the new Intel quad-core models come with some expensive baggage in terms of requiring an X299 motherboard.

In other words, the real-world choice remains that of Intel’s existing Core i5 and Core i7 chips for the LGA1151 socket, which offer the safest bet for ensuring good gaming performance in this very here-and-now, or AMD’s Ryzen 5 and Ryzen 7s, which are clearly better value and might well prove more future proof thanks to offering more affordable six and eight-core options.

Like I said, the situation is developing fast. I fully expect to see Intel add a six-core chip for its mainstream LGA1151 socket later this year and I’m hoping that will be part of a further re-jigging of prices that will see six cores slot in where Intel’s best quad-core chips currently reside in the price lists and those in turn become yet more accessible.

Things are moving in the right direction, then. But I’d be holding out just a little longer to see how things develop later this year before pulling the trigger on that new gaming rig.

38 Comments

  1. causticnl says:

    If I may, the talk Linuxtech gave was very informative, and shows what an absolute mess the new chipset and socket currently are.

    • Buuurr says:

      It is always fun to see the cattle flock to the sound of the farmer. One who is not of the herd cannot help but be fascinated by the blind obedience and trust.

      In short, the chips are not released yet. Intel has almost always NOT been a disappointment in its releases. There is no evidence that this will be any different. What we have are fanboys with empty pockets crying yet again.

      • Phasma Felis says:

        I don’t really have a dog in this fight, but wow do you sound like a pompous jerk. Do you use the word “sheeple” unironically?

        In my experience, the only people who rush to call people fanboys are other fanboys.

        • Buuurr says:

          Sorry. More or less tired of people just jumping on the wagon. I am betting on Intel in this latest race. It isn’t released yet and people are saying they are switching to AMD because of this one guy’s opinion? Seriously? Sheep is the word for sheep.

          As for fanboy, yes… and no. If AMD comes out with some tech that, for my money, is better than what Intel has released or is about to release, I go with AMD. If not, Intel. I am a fanboy. A fanboy of performance.

      • Blasg says:

        Yep, definitely a jerk. Pompous or not

      • milonz says:

        Things are simple: the first Ryzen samples have already been benchmarked.
        I’m a gamer AND a professional systems engineer AND a 3D designer (as a hobby).

        – For gaming, higher frequencies are better; the number of cores is not important above 4.
        – For servers, 3D rendering and video work, only the number of cores is important (the more, the better); frequency is secondary.

        If I have to choose a gaming rig, which could possibly do 3D rendering and video, AMD Ryzen now seems largely the best real-life performance/price proposition. For pure gaming, any low-end quad-core i7 will fit. Even my 8-year-old i7 860 fits perfectly well with my GTX 1060; I’ve never felt the need to upgrade my CPU, as a gamer.
        If I have to choose a processor for my home servers, I’ll go for an Intel Core i7 or i9, because they are cheaper than Xeons, have all the virtualization instruction sets required for VMware ESXi (VT-x, VT-d), and are far less energy-hungry and noisy than real professional servers (I live in a flat in Paris; I don’t have a 1000m² country house with a dedicated datacenter in the basement).
        If I have to choose a processor for professional servers, in my job, I’ll choose Intel Xeons (they are way overpriced, but eh, that’s the professional world!).
        If I have to choose a processor for a professional rendering farm, I’ll choose Intel Xeons OR, more probably, AMD Ryzen, because what matters is server density per square foot and TCO (total cost of ownership) per core: every job has a budget, and your job is to squeeze the most performance out of a given budget. Doubling the performance for ten times the price will get you fired for gross professional misconduct, unless there’s some specific need for Xeons.
        For the price of one Intel Xeon server, I could probably buy 4 or 5 Ryzen servers (Supermicro already has plans to integrate Ryzen CPUs for very specific markets, like 3D rendering farms).

        Buuurr, as a professional or even as a PC enthusiast/gamer, you shouldn’t rely on reputation or vague assumptions.
        The only thing that matters is whether it fits the job and how much it costs.
        It’s a binary, very tangible choice.

        For all the rest, the computer industry has a proper term: FUD (Fear, Uncertainty and Doubt).
        IBM, Microsoft and Intel have used these psychological manipulation techniques for years and have become masters at them.

      • milonz says:

        You can be a grandpa buying some Mac Pro, because you want to be sure, because FUD.
        As a French philosopher said, if you want to be sure and feel reassured, you should stick to farming turnips.

        • milonz says:

          As a gamer, my budget is not unlimited – I’m not a millionaire – so every dollar/euro I can save on a good-enough CPU is another dollar I can spend on other components (GPUs, larger SSDs for my games, new multi-TB disks for my NAS, etc.).
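milonz’s cost-per-core argument above can be made concrete with a tiny worked example. All prices and core counts below are invented for illustration only, not real quotes:

```python
# Hypothetical worked example of the TCO-per-core point: for a fixed
# render-farm budget, cheaper boxes can simply buy you more cores.

def cores_for_budget(budget, server_price, cores_per_server):
    """How many cores a fixed budget buys, whole servers only."""
    servers = budget // server_price
    return servers * cores_per_server

BUDGET = 50_000  # made-up render-farm budget, in dollars

xeon_cores  = cores_for_budget(BUDGET, 10_000, 24)  # pricey Xeon box
ryzen_cores = cores_for_budget(BUDGET, 2_500, 8)    # cheap Ryzen box

print(xeon_cores, ryzen_cores)  # the cheaper boxes win on core count
```

Whether that trade is worth it depends on the workload, of course: rendering scales with cores, so density per dollar dominates; a single-socket feature requirement (say, specific virtualization support) can flip the answer back.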

  2. Tyrmot says:

    My next chip will be an AMD (for the first time)

    • Czrly says:

      I’d hate to say that my next chip will be AMD but, right now, I could not buy another Intel one. I bought an i7 7700k a few weeks ago and it is truly junk. The speeds are fine but the quality-control is rubbish and the thing is impossible to cool – even at stock clocks and voltages.

      Quite simply, you can put a great water block on top of it and all you have done is waste your money – the heat simply doesn’t get from the silicon to the copper fast enough because the bean-counters at Intel have cut costs in the manufacturing process of the heat-spreader and whatever gunk lies between it and the die.

      28 degrees idle to 100 degrees at load in under 2 seconds – not overclocked! The chip is throttled at 100. Cut the load and the temperature falls almost instantly to 30 again showing that the external cooling is just fine.

      • Person of Interest says:

        I’m aware of the inconsistent voltage/power consumption on the Kaby Lake K-series, but if your processor spikes to 100C in two seconds, there is almost certainly something wrong with your cooling mount and/or motherboard voltage settings. The drop back to 30C is not strong evidence that the cooler is mounted correctly.

  3. brucethemoose says:

    Xeons work in desktop motherboards, but y’all already know that.

    What most people don’t know is that single socket Xeons (previously branded as E5-1xxx) are generally overclockable.

    For example, when the i7-4960X was supposedly the baddest kid on the block, you could buy an 8C E5-1680v2 and overclock it. It even had better thermal properties, as it used a huge 12-core die with 4 cores disabled (whereas the 4960X used a small 6-core die).

    So, while it will probably be a weird OEM SKU us mere mortals can’t buy (as was the case with Haswell/Broadwell), an unlocked 24 core+ Skylake-e monster could exist.

  4. Banks says:

    Despite the slower clocks and the high dependence on faster RAM, Zen CPUs are outstanding. IMO, they are so good that building with Intel right now is a mistake.

    Intel could have reacted to this in many ways, but I think they have done it in the worst possible way.

  5. sandineyes says:

    If I were in a position to make a new build, the $599 i7-7820X would seem to be the best of the line-up for the price. The cheaper Skylake-X parts lack the fancy new Turbo Boost, and the clock rates are not great if you are in it for gaming. The $999 part seems to just have more cores and PCIe lanes.

    I’m not sure what the purpose of the Kaby Lake-X parts is, except perhaps to offer an entry point into the platform. You don’t get the four-channel memory, nor the increased PCIe lanes, nor Turbo Boost 3.0, nor the AVX-512 instructions. Seems like it would be better to wait for Coffee Lake.

  6. Don Reba says:

    I would really like to see a Threadripper vs Skylake-X benchmark making the most of available AVX instructions.

  7. ravenshrike says:

    Here’s the problem for Intel. The boards for X299 are probably going to start at around 200 dollars, whereas you can get the 6-core Ryzen part and an overclockable motherboard for all of 320 bucks, 300 if you’re willing to go small form factor. Versus that, figure on almost double for the 6-core Skylake-X and mobo.

    • Buuurr says:

      The problem you are citing has never been a problem for Intel. They hold 80% of the market share for processors. They have held this share for a good 15-20 years now. AMD has always been the cheap alternative, as is the case now. The status quo has in no shape or form changed. How could this be defined as a problem for Intel? From a business standpoint it is business as usual.

      • phuzz says:

        The Athlon64 was the cheap alternative sure, but it was also faster and cooler than anything Intel had at the same time.

        Another plus for AMD in my book is that I can buy mid-range CPUs without integrated graphics. I’ve got a graphics card, I don’t need to waste money on integrated graphics I’m never going to use (except when my GPU broke, I admit it was handy then).

        • Buuurr says:

          “phuzz says:
          The Athlon64 was the cheap alternative sure, but it was also faster and cooler than anything Intel had at the same time.
          Another plus for AMD in my book is that I can buy mid-range CPUs without integrated graphics. I’ve got a graphics card, I don’t need to waste money on integrated graphics I’m never going to use (except when my GPU broke, I admit it was handy then).”

          Yeah, so… we agree that AMD is the cheaper alternative? I had an Athlon64. It was a great chip. It was up there for me as one of the most budget/performance-friendly CPUs of all time. The thing is that they stayed right there. Back in the early 2000s. For a long, long, long time. Phenom? Failure. And they stuck with it from the Core2Duo days right up until recently. Many of my friends run these because there was no other alternative until this year.

          I’m not down on AMD. I had many builds in the late 90s (my first build was an AMD with an Asus-branded Nvidia 4400) and early 2000s that had the AMD badge. I just left it there, though. AMD wanted to stay back there. I don’t think there is any arguing that. They have lagged for a long time.

          People bitch and moan about Intel being the giant and not giving us the potential of what it could. My take on it is this: AMD didn’t bother to give it any competition. If AMD didn’t give it a run for its money, what incentive has Intel had to plunk money into R&D when it can coast? Sure… the spirit of competition and all that, but Intel is a business and easy money is still money. AMD’s lazy line-up is just as much to blame for the lag in tech.

      • GettCouped says:

        You are incorrect. The AMD design is actually a big win for business. AMD is using infinity fabric to link multiple clusters together to make a coherent processor with near perfect scalability.

        What does that mean? They basically link four 8-core R7s together to make a 32-core, 64-thread monster that performs better than anything Intel has, at a cheaper price, with better TDP, at incredible yields.

        Intel is in some serious trouble. Watch this video and learn something.
        link to youtu.be

  8. racccoon says:

    Actually this is brilliant news for PC players & users, as we had a massive slump many years ago where nothing was happening and we were lumbered with tech that we thought would never evolve much. This has shaken off that sleep and made things greater than before, and the future of the PC is still alive as long as there is competitiveness and technological advancement.
    I love Intel for their dedicated hard work.
    I love Nvidia for their dedicated hard work.
    all the rest are nags just helping the top two go faster.
    thanks nags..
    p.s. boooo to consoles lol

  9. Tuidjy says:

    I know a few definitions for carpet bagging, and I cannot see any way any of them can match your use in the third sentence.

    It clearly has nothing to do with luggage or the American South. It cannot have anything to do with newcomers, because Intel has been around forever, and was founded explicitly to do what it has always done, i.e. it did not move or switch targets for an underhanded gain. And finally, I even looked up the British meaning, which I knew was different from the American ones, and I cannot see how Intel is trying to demutualize any company.

    So how are Intel carpet bagging? Arguably, someone could think that the processor scene needs some carpetbaggers, but Intel cannot play that role.

    • ravenshrike says:

      Pretty sure he meant sandbagging.

      • ooshp says:

        Pretty sure he meant Intel is acting like a steak stuffed with oysters.

      • Jeremy Laird says:

        I did and you may claim your £5. Or something.

        Would it help if I said I’ve been under a lot of strain recently? I mean, I haven’t, but would it help if I said I have?! :D

  10. robotslave says:

    You know, there might be a few budget-constrained game developers reading your little web site here.

    And for them, this news is anything but irrelevant (particularly if they’re keen on Agile-type methodologies with tight build/test cycles).

    Give it a little time, and the impact might even make it all the way to the people who play the games those readers build…

  11. braven5 says:

    Seems all the major people in the tech community, like Linus and Jay2cents among many others, are very upset at Intel and backing AMD for the high-end productivity stuff; this guy kinda covers the technical reasons for it link to youtube.com

  12. Dudeface says:

    RE: the Intel die shot – if you look closely you’ll notice the 2nd-in-from-the-left ‘core’ in both the top and bottom rows is actually different. So possibly the MCC die does top out at 18 cores?

  13. The Almighty Moo says:

    I’ve been watching these with bated breath – holding off for a new computer for some time now, and just as I got in a place to do it, Jeremy started recommending holding off. Story of my life, but it does give me something to look forward to.

    Just say the word Mr. Laird!

  14. aircool says:

    My PC is 5 years old (albeit with a GTX 970) and still does the job, but there’s no doubt I should be looking to get a new one.

    Unfortunately, there haven’t been any significant improvements to CPUs (still got an i5-3750K) to justify a whole new mobo/CPU/RAM upgrade. I thought that Intel might have something to offer this year (tick-tock and all that), but there’s nothing that justifies the cost.

    So, it looks like another year of my GFX card being throttled by the CPU in many games (despite those games being perfectly playable on high settings).

    One thing that does confuse me though is the i5 vs i7 debate. Some people say that hyper-threading is important for games, others say that it’s not worth the extra money. That’s probably another reason why I’m holding off: I don’t want to get an i5 only to find out I need an i7 (or an i7 only to find it’s surplus to requirements).

    • Soapeh says:

      Yep, I’m in a similar boat. I’ve had an OC’d 2500k @ 4.4GHz for the last 5 years and it’s showing signs of age when paired with my 980 trying to run on a G-Sync monitor. At high refresh rates the Sandy Bridge is seemingly bottlenecking performance, but I suppose I’ll have to hold out longer to see what these new platforms will offer.

      I had eyed up the 7700k for the crazy single-thread performance at 4.5GHz but now it seems wise to check out the competition between Intel and AMD.

  15. tormeh says:

    I recently underclocked my CPU (FX 8350) to 1GHz just to see what cores games would use. Surprisingly enough, games remained playable, but with significant stutter. Anyway, many AA games never needed more than 4 cores even at that clock. Even among AAA games it was rare to find one that could use more than 6. If you buy an 8-core, your 2017 games will never really have any threads competing for a core. Sure, there will probably be more than one thread per core, but no more than 8 of those threads will use significant processor time.

    6 cores seems to me to be the sweet spot right now.
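For anyone wanting to repeat tormeh’s experiment, here is a rough, Linux-only sketch: snapshot the per-core “cpuN” lines from /proc/stat before and after a gaming session and compute how busy each core was in between. The field layout follows the proc(5) man page; the jiffy counts in the example are made up for illustration:

```python
# Compare two snapshots of one per-core "cpuN" line from /proc/stat
# and compute the fraction of time that core spent doing real work.
# Fields after the name are jiffy counters; index 3 is idle, 4 iowait.

def core_busy_fraction(before, after):
    """Busy fraction for one core between two /proc/stat samples."""
    b, a = (list(map(int, line.split()[1:])) for line in (before, after))
    delta = [x - y for x, y in zip(a, b)]
    total = sum(delta)
    idle = delta[3] + delta[4]  # idle + iowait jiffies
    return (total - idle) / total if total else 0.0

# Made-up jiffy counts: this core was busy for 80 of 100 ticks.
print(core_busy_fraction("cpu0 10 0 10 80 0 0 0 0",
                         "cpu0 60 0 40 100 0 0 0 0"))  # 0.8
```

Sampling every core this way while a game runs gives a quick count of how many cores ever do significant work, which is essentially the observation behind the six-core sweet-spot claim.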