AMD Ryzen 7 2700 / 2700X review: A tense showdown with Intel’s Coffee Lake Core i7s

Ryzen 7 2700X

AMD’s 2nd Gen Ryzen+ CPUs have put on a pretty impressive show so far, from the entry-level Ryzen 3 2200G and Ryzen 5 2400G with integrated Radeon Vega graphics right up to the mid-range Ryzen 5 2600 and 2600X – which for my money are better buys than Intel’s current crop of Core i5 chips. Now it’s time to look at AMD’s pair of flagship processors for 2018, the Ryzen 7 2700 and its souped-up counterpart, the 2700X.

With eight cores and 16 threads each, these top-end CPUs are AMD’s answer to Intel’s fancy 8th Gen Core i7 Coffee Lake chips, most notably the Core i7-8700 and its unlocked, overclockable sibling, the Core i7-8700K. Can AMD pull off that coveted hat-trick of CPU brilliance? The answer would appear to be…sort of, just about, but also not quite.

From a money-saving point of view, there are a lot of things AMD get right – and goodness knows we all could do with a bit of that these days given the current price hikes of today’s best graphics cards, RAM and literally everything else involved with building a new PC.

Firstly, they’ve had the good sense to include a cooler – and a fairly substantial one at that – in the box with each chip, which is something last year’s Ryzen 7 1700X buyers sadly missed out on. With the Ryzen 7 2700, you get the RGB Wraith Spire cooler, while the 2700X nets you the even beefier, direct-contact heat pipe-equipped Wraith Prism (pictured below), whose USB RGB header gives you per-LED lighting control over its illuminated ring and translucent fan blades (just make sure you download the somewhat hidden control software for it from Cooler Master before you begin).

Ryzen 7 2700X Wraith Prism cooler

You don’t get any of that when buying Coffee Lake, and neither do you get the chance to reuse your current motherboard to save yourself even more cash. For while Intel’s 8th Gen Coffee Lake chips still use the LGA 1151 socket, they’re only compatible with 300-series chipset motherboards, which means you have to upgrade your motherboard at the same time as buying a new CPU.

AMD, on the other hand, have pledged to support their AM4 socket motherboards until at least 2020, and all Ryzen+ chips, including the Ryzen 7 2700 and 2700X, will work with every pre-existing AM4 board that’s currently on sale. Existing AM4 owners will likely need to perform a BIOS update before the new chips will slot straight in (unless they fancy upgrading to one of the new X470 motherboards, which have Ryzen+ support straight out of the box), but it’s a welcome gesture nonetheless.

There’s also the rather nice bonus that both the Ryzen 7 2700 and 2700X currently cost less than their predecessors did when they first came out a year ago, saving you $30 on the 2700 and $70 on the 2700X. You’re still looking at laying out considerably more than the pair of 2nd Gen Ryzen 5s, of course, but even at £257 / $292 for the 2700 and £283 / $325 for the 2700X, they’re still cheaper than buying a Core i7, as the 8700 currently costs £269 / $302, while the 8700K will set you back £320 / $350.

Already, then, the Ryzen 7 2700 and 2700X are looking pretty tempting, and that’s without having even opened the box. So let’s get down to the nitty gritty of performance.

On paper, you’d be forgiven for thinking the 2700 and 2700X are just slightly enhanced versions of their 1700 and 1700X predecessors. It would certainly look that way from glancing at their respective base and boost clock speeds (see the table below for a full summary), but most of Ryzen+’s biggest new features aren’t really visible from their various spec sheets.

CPU | No. of cores / threads | Base clock speed | Boost clock speed | Thermal Design Power (TDP)
Ryzen 7 1700 | 8/16 | 3.0GHz | 3.7GHz | 65W
Ryzen 7 2700 | 8/16 | 3.2GHz | 4.1GHz | 65W
Ryzen 7 1700X | 8/16 | 3.4GHz | 3.8GHz | 95W
Ryzen 7 2700X | 8/16 | 3.7GHz | 4.3GHz | 105W

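To put those spec sheet differences into numbers, the generational clock bumps work out at around 7-9% on base clocks and 11-13% on boost clocks. Here’s a quick back-of-the-envelope sketch in Python that computes them from the table above (just arithmetic – nothing here is official AMD tooling):

```python
# Percentage uplift in base and boost clocks between each 2nd Gen Ryzen 7
# and its 1st Gen predecessor, using the figures from the table above.
clocks_ghz = {
    "Ryzen 7 1700":  (3.0, 3.7),   # (base, boost)
    "Ryzen 7 2700":  (3.2, 4.1),
    "Ryzen 7 1700X": (3.4, 3.8),
    "Ryzen 7 2700X": (3.7, 4.3),
}

def uplift(new: float, old: float) -> float:
    return (new - old) / old * 100.0

for old_chip, new_chip in [("Ryzen 7 1700", "Ryzen 7 2700"),
                           ("Ryzen 7 1700X", "Ryzen 7 2700X")]:
    old_base, old_boost = clocks_ghz[old_chip]
    new_base, new_boost = clocks_ghz[new_chip]
    print(f"{new_chip} vs {old_chip}: "
          f"base +{uplift(new_base, old_base):.0f}%, "
          f"boost +{uplift(new_boost, old_boost):.0f}%")
```
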
Instead, what Ryzen+ really brings to the table is accessibility, letting you get the best speeds out of your CPU wherever possible without even the vaguest whiff of having to tinker around in your motherboard’s BIOS settings and start overclocking manually. It’s all down to AMD’s new and improved Precision Boost 2 and XFR 2 (or Extended Frequency Range 2) tech.

Provided there’s enough thermal headroom on offer, Precision Boost 2 runs each core as fast as possible whenever it can, giving you better multitasking performance when the CPU isn’t completely maxed out, such as playing and streaming games simultaneously or editing photos while browsing the web.

XFR 2, meanwhile, will keep boosting the clock speed up to 100MHz over the processor’s maximum limit when temperatures allow. Admittedly, you’ll probably need a beefier cooling system than the stock coolers that come in the box to really see the benefits of XFR 2, but there’s no denying that together these two bits of tech make both Ryzen 7 chips a lot more user-friendly than previous generations of AMD CPU. Indeed, if you do start overclocking them manually, both PB2 and XFR 2 get turned off, which can actually result in worse performance unless you overclock each chip close to its maximum boost clock speed.
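
If you fancy watching Precision Boost 2 and XFR 2 do their thing, one easy way is to log per-core clock speeds while something demanding is running. Here’s a minimal sketch using Python’s psutil library (my choice of tool, not anything AMD supply – and note that per-core frequency reporting depends on your OS; most Linux systems expose it, while Windows often only reports a single package-wide figure):

```python
# Log per-core clock speeds and load once a second to watch boost behaviour.
# Assumes the psutil package is installed (pip install psutil) and that the
# OS exposes per-core frequencies.
import psutil

def watch_clocks(samples: int = 10, interval_s: float = 1.0) -> None:
    for _ in range(samples):
        # cpu_percent() blocks for interval_s, so it also paces the loop
        load = psutil.cpu_percent(interval=interval_s, percpu=True)
        freqs = psutil.cpu_freq(percpu=True)
        print("  ".join(
            f"c{i}: {f.current:4.0f}MHz ({l:3.0f}%)"
            for i, (f, l) in enumerate(zip(freqs, load))
        ))

if __name__ == "__main__":
    watch_clocks()
```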

Ryzen 7 2700 pins

As such, I’ve tested both chips as they come out of the box, pairing them with an Asus ROG Strix X470-F Gaming motherboard, 16GB of G.Skill Sniper X RAM, an Nvidia GeForce GTX 1070Ti graphics card and a Samsung 850 Evo SSD. Starting with Cinebench R15, which uses Maxon’s real-world Cinema 4D engine to render a complex, photorealistic scene of 3D orbs and baubles replete with anti-aliasing, lighting and other CPU-draining effects, both Ryzen 7 chips put in fairly respectable performances. Sort of.

In terms of single core zippiness, the Ryzen 7 2700 doesn’t fare particularly well. Its Cinebench score was just two points ahead of the Ryzen 5 2600X, for example, which by extension means it’s neither as fast as Intel’s Core i5-8600K nor as fast as either of its Core i7 rivals.

In a way, it’s not surprising given the 2700’s lower base clock speed of 3.2GHz (the 2600X starts at 3.6GHz), but even the Ryzen 7 2700X (which has a base clock of 3.7GHz) only managed a 5% improvement over its non-X 2700 counterpart – and even that still isn’t as good as what I got out of the Core i5-8600K.

Ryzen 7 Cinebench single core results

Single core performance, then, isn’t Ryzen+’s strong point, but historically this has always been an area where AMD have suffered compared to Intel. Instead, the real benefits come when you turn to AMD’s multicore performance, which in turn has often tended to be Intel’s main weakness.

The Ryzen 7 2700, for instance, sailed past everything but the 2700X in Cinebench’s multicore test, offering a 14% boost in speed over the 2600X, a massive 32% improvement over the Core i5-8600K, and a rise of around 9% compared to the Core i7-8700K. The 2700X was even further ahead, coming in 13% faster than the regular 2700, and a sizable 21% quicker than the Core i7-8700K.

That 13% jump over the Ryzen 7 2700 is significant, too, as it makes the 2700X feel like a proper upgrade compared to the situation between AMD’s two Ryzen 5 chips. Here, the 2600X only offers a mere 5% bump over the regular 2600 – and that’s in both single and multicore performance. When both X variants cost around £20-25 / $30 more than their non-X counterparts, your money’s definitely going a lot further with the 2700X than it is with the 2600X.

Ryzen 7 Cinebench multicore results
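
If you want to run the same percentage comparisons on your own benchmark numbers, the maths is simply the relative difference between two scores. A tiny sketch, using made-up placeholder scores rather than the actual results behind the charts:

```python
# How much faster one benchmark score is than another, as a percentage.
# The scores below are hypothetical placeholders, not the review's actual
# Cinebench numbers - swap in your own results.
def percent_faster(score_a: float, score_b: float) -> float:
    return (score_a - score_b) / score_b * 100.0

if __name__ == "__main__":
    score_2700x, score_8700k = 1780.0, 1470.0   # placeholder multicore scores
    print(f"2700X lead over 8700K: {percent_faster(score_2700x, score_8700k):.0f}%")
```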

I saw similar results in Geekbench 4, too, although here the Ryzen 7 2700 actually came in just behind the Ryzen 5 2600X on single core performance rather than just ahead of it. Still, the difference in multicore performance played out much the same way, with the 2700X emerging victorious over its Intel rivals once again.

When it comes to playing games, however, there seems to be much less in it, regardless of whether you go for one of the Ryzen 7 chips, a Ryzen 5, or one of Intel’s lot. Of course, your CPU isn’t exactly the most critical component when it comes to gaming, but it’s still important for certain physics effects and particle bits and pieces when there’s a lot happening onscreen.

Indeed, when I put them through Rise of the Tomb Raider’s internal benchmark, which sees Lara stand on top of the snowy mountain from the beginning of the game, gaze in wonder at the rushing waterfalls from the early-on ruins in Syria, and do a camera sweep through the misty haze of the late-game’s demanding geothermal valley, both Ryzen 7s spat out almost identical frame rates to the Ryzen 5s on Very High, regardless of whether I ran them at 1920×1080 or 2560×1440.

The Ryzen 7s were ever so slightly ahead at 1920×1080, but we’re only talking a jump from 62fps to 64fps compared to the bang-on overall average of 40fps they produced at 2560×1440. Even their individual scores across all three benchmarks at 1080p were pretty much the same, and it was only really the 2700X that showed any vague signs of superiority – and even then you’re still only looking at a margin of one or two frames across every test. When you’re already running that close to 60fps anyway, this isn’t something you’re actually going to notice in practice unless you’ve got a frame rate counter going in the corner of your screen.
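
To put that couple of frames into perspective, it helps to think in frame times rather than frame rates – the gap between 62fps and 64fps works out to around half a millisecond per frame. A quick sketch of the arithmetic:

```python
# Convert average frame rates into per-frame render times (in milliseconds)
# to show how small a 2fps gap is when you're already near 60fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

ryzen5_fps, ryzen7_fps = 62.0, 64.0   # 1080p averages quoted above
print(f"{ryzen5_fps:.0f}fps = {frame_time_ms(ryzen5_fps):.2f}ms per frame")
print(f"{ryzen7_fps:.0f}fps = {frame_time_ms(ryzen7_fps):.2f}ms per frame")
print(f"Difference: {frame_time_ms(ryzen5_fps) - frame_time_ms(ryzen7_fps):.2f}ms")
```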

Ryzen 7 2700

The Intel Core i7 chips, meanwhile, scooted ahead slightly at both resolutions, but at 1440p you’re once again looking at an improvement of just a couple of frames. At 1080p, you might get roughly another 5-10fps, but again, unless you’ve got a high refresh rate monitor that does more than 60Hz, that kind of improvement isn’t something you’re likely to notice with the naked eye, even with a high-end graphics card.

Even if you don’t have a GTX 1070Ti at your disposal, though, is another potential 10fps really worth paying the extra £40 / $25 you’ll need to spend to get the Core i7-8700K, not to mention the additional cash you’ll need to shell out for a cooler as well? Personally, I wouldn’t say it was, especially if you’re in the UK where the price difference is more pronounced than it is in the US, but there are cases to be made in Intel’s favour.

For instance, if you’re largely going to be using your PC for gaming and nothing else, then Intel’s superior single core performance and ever so slightly better gaming chops may well be enough for you to tip the scales in the direction of the 8700K, regardless of all the extra cost involved. After all, it’s not like a chip with this kind of power won’t have enough oomph to handle simultaneous streaming and other multitasking gubbins effectively, so even if it isn’t quite as good as the Ryzen 7s on paper, it will still do one heck of a decent job at it.

Then again, if you’re more of a creative type who regularly deals with editing photos and videos as well as all the streaming paraphernalia while playing games, then I’d definitely err toward the 2700X. It’s arguably a much better value proposition than the regular 2700 for day-to-day computing (and definitely better value than either of the Core i7s in general), and you get a much fancier cooler out of it, too.

Really, though, both AMD’s Ryzen 7 and Intel’s Core i7 processors are a bit overkill for your average gaming PC, and you’d probably be perfectly happy opting for the significantly cheaper £169 / $200 Ryzen 5 2600 instead, which, in my books, is a much better buy than both the 2600X and the Core i5-8600K. Still, if you’ve got the cash to spend and are adamant about having the best of the best, then the Ryzen 7 2700X certainly makes a very compelling case against Intel’s Core i7-8700K. It all just depends on what type of PC you’re looking to build, and whether you’d rather save a bit of cash in the process.

28 Comments

  1. Sakkura says:

    Good point about monitor refresh rate – an 8700K can be well worth it for a 144Hz display and a beefy GPU, but on a 60Hz monitor its advantage really shrinks a lot (let alone if you stream your gameplay or do video editing etc).

    • GurtTractor says:

      I happen to have a 144Hz monitor, paired with a 2700x and a GTX 1080. In any properly competitive or high speed games that might need the most FPS, the chip can easily provide the full 144 (overwatch for example, Dirt Rally, Rocket League etc). In some less optimised and more intense AAA games I might get maybe 10-15% less frames than if I had an 8700k, but I’m still basically always over the magic 80FPS mark, and mostly over 100. I think it’s only really the edge cases where I might actually notice a difference between either chip. That coupled with the general uptake in multi core utilisation from APIs and software, makes the 2700x a pretty sure bet for years to come IMO. But horses for courses.

      • Sakkura says:

        I actually have a 144Hz Freesync monitor myself, with a 2700X on the way (if Komplett ever actually ships the order, it keeps getting postponed). So I can’t exactly disagree. :)

      • Cooe says:

        1080p or 1440p? For the latter there’s literally next to nothing in the i7-8700K’s favor, a handful of fps at most [with a chart-topping GTX 1080 Ti for ex, the gap is just ≈3% on average & outright negligible for cards weaker than a 1080].

        Heck, well implemented DX12/Vulkan titles [most esp native ones ala Doom or Forza] that can use Ryzen’s 4 extra threads for GPU draw calls [or titles threaded beyond 6c/12t], tend to put up better all important 1% & .1% fps minimums than the i7’s. With this effect being greatest when one’s GPU bound [as that’s literally DX12/Vulkan’s “thing”; multi-threading GPU draw calls across spare CPU threads to increase GPU utilization]. But even when you aren’t, this effect will still be present as long as the game ever has moments of heavier GPU than CPU load, and heavily utilizes 12 or less threads by itself (meaning Ryzen has more spare CPU headroom). This means that even at 1080p/144Hz with a top-end card, the Ryzen 2700X often matches or gets near the i7’s minimums in said games, even if notably behind as far as overall average for said title.

        (I’ve got a 2700X with a 1440p 144Hz IPS w/ an R9 Fury X [aka a GTX 980 Ti/1070 class GPU], aka fast enough to push the display more than satisfactorily [with adaptive sync ofc], but “slow” enough to be very rarely CPU bound. Thus, as far as gaming is concerned, I get all of the 2700X’s gaming advantages w/o any of its weaknesses [as tiny as they may be at this res]).

  2. Tholesund says:

    Some quick googling shows that G.Skill’s Sniper X memory sticks are available in various speeds ranging from 2400 MHz all the way up to 3600 MHz. Ryzen CPUs greatly benefit from being paired with fast memory, so it would be good if the speed of the memory used in the review was mentioned somewhere.

    • televizor says:

      Based on what I’ve seen on tomshardware, they recommended the FlareX RAM that sat at 3200MHz. They OCed it very easily to 3400, which resulted in minor boosts to FPS. SniperX is a bit newer and it’s also touting full Ryzen compatibility.
      So if you’re looking for RAM with max compatibility for your Ryzen 2700x build, like I am, get either of the two, clocked at 3200. Anything over that and you’re seeing diminishing returns, and the RAM’s expensive as it is.

    • Tyrmot says:

      True. I’ve also seen a lot of evidence that the timings of the RAM make a big difference as well, so I think those should be given also. The 8700K also overclocks pretty easily on the other hand, so that should be given as a benchmark as well.

      Still, I can’t help but feel that while the 2700X is very similar to the 8700K in games right now, those extra 2 cores are going to prove their worth down the line compared to the 6 intel are offering. And the possibility of replacing your CPU without a motherboard change as well (something you will never get with intel) is what tipped the scales for AMD for me this time around.

  3. mitrovarr says:

    I recently built a 2700x system to replace my aging i7 960. Seems nice, but game speeds seem basically identical. I carried over my video card (970 gtx) – I guess I must have been GPU limited.

    • Frosty Grin says:

      It’s pretty consistent with results on the newer processors. Most games still perform fine on fast quad cores. But you’re buying a CPU for the future – because it can serve you for 3+ years. Kinda like your i7-960. If you bought a Core i3 back then, you’d have to upgrade it sooner.

      • mitrovarr says:

        Yeah, I’m happy with the upgrade. It wasn’t just for better gaming performance (I mean, the old desktop was 9 years old!) I’m genuinely surprised I wasn’t cpu limited with that 960, though.

  4. sosolidshoe says:

    I don’t think I’ve used the cooler provided with a CPU since the one that came stock in the RM my mum looted from her university job back in the 90’s – why on earth would you consider the doubtless marked-up and still distinctly average air cooler being tacked on to the purchase without giving the customer a choice a point in these things’ favour?

    The vast majority of people building a PC are either already going to own a better quality custom cooler, or were planning to buy one with the upcoming build.

    I’d rather keep my Kraken X60 and save the 20 extra quid they’re doubtless using the daft LED-garbo cooler to justify adding on the CPU pricetag.

    • Sakkura says:

      The included coolers with the Ryzen CPUs are MUCH, MUCH better than what has been normal. You no longer have to plan on buying a better cooler.

      Of course many people will already have a good cooler, but sometimes that won’t be compatible with AM4. And some people won’t have a good cooler on hand.

      • sosolidshoe says:

        They’re better than the complete garbage coolers that were supplied with CPUs in the past, and I’m sure if someone’s buying a Dell or whatever with one of these in it it’ll do just fine for them, but if you’re an enthusiast buying one of these to put in a gaming rig you’re building(or upgrading one you’ve already built), the odds the stock cooler doesn’t end up discarded in a junk drawer or just the bin are pretty small. Certainly small enough that it should be considered an at best situational benefit, rather than a genuine positive as the article portrays it.

        • matt198992 says:

          The stock cooler actually helped me! One of the reasons I held out to buy the 2700x over the 1700x was because it came with a cooler. I have a custom water loop going on and can’t afford the rest of the parts for the CPU, so I’m using the remarkably good stock cooler for now.
          Just because YOU can’t make use of it doesn’t mean it’s useless.

          • sosolidshoe says:

            And just because a spectacularly small minority of people actually do have a use for it doesn’t make it a selling point for everyone else for whom it’s just dead weight that costs them money needlessly.

    • Cederic says:

      That confused me too. Especially suggested as a cost saving measure.

  5. Kittim says:

    You know what?

    I don’t care about these results.
    Not to say I don’t appreciate the effort in obtaining them or anything. Thank you for the time, Katharine Castle.

    My dismissal comes from Intel’s attitude.
    They said they would issue a patch for my CPU.
    Then they said they won’t.

    That’s my decision made on all future builds.
    Intel can go away for not being thorough in patching the CPUs susceptible to Spectre/Meltdown. It’s not the user’s fault that Intel screwed all CPUs built from ~2011 onward.

    If Intel can’t be arsed to patch their CPUs, I can’t be arsed to buy Intel.

    AMD all the way from here onward.
    AMD seem less prone to exploits.
    AMD seem to want to be more transparent.

    Unless Intel pull a quantum CPU out of their bum and offer it for £500, I don’t care at all.

    • mitrovarr says:

      Yeah, if Intel had offered a patch for my i7 960, I’d have waited a year, and then bought whatever Intel proc had the bugs fixed in hardware. But if they won’t fix their own mistakes, I refuse to let them benefit from it. So the next upgrade was going to be AMD (thankfully they are doing really good right now, so I didn’t have to give up much).

    • DeepFried says:

      Frankly even if Intel did patch the microcode for your CPU the chances of your mobo manufacturer issuing a BIOS update with it in are practically zero on a machine that old, so it would be a complete waste of time.

      Don’t blame Intel for your machine being obsolete.

      • mitrovarr says:

        It would have been a show of good faith. If the motherboard maker hadn’t made a bios update after they issued a patch, I would have held it against them instead of Intel.

      • Bent Wooden Spoon says:

        Microcode updates are applied at the OS level, they have nothing to do with BIOS.

        • DeepFried says:

          9. Is my device protected after I’ve applied the Windows security updates Microsoft released on January 3, 2018?

          To get all available protections for your device(s) against the three vulnerabilities described in this advisory, you must install the security updates for Windows and apply microcode updates provided by your hardware OEM.

          If your OEM does not provide a microcode update, or if you are unable to apply it, the Windows security updates released on January 3, 2018 alone address:

          CVE-2017-5753 – Bounds check bypass
          CVE-2017-5754 – Rogue data cache load
          To address CVE-2017-5715 – Branch target injection, you must apply a microcode update in conjunction with the Windows security update. Any questions regarding microcode updates must be directed to your OEM. Systems without updated microcode remain vulnerable to information disclosure as described in FAQ 8: What is the scope of the vulnerabilities?

          link to portal.msrc.microsoft.com

          link to support.microsoft.com

      • Don Reba says:

        Having to make new motherboards for every generation of Intel’s CPUs makes them harder to support. This can definitely be held against Intel.

  6. tekknik says:

    Still, the difference in multicore performance played out much the same way, with the 2700X emerging victorious over its Intel rivals once again.

    While technically true, it looks way worse for AMD when you look at an aggregate score in geekbench and also consider the core count. The 2700x has 8 physical and 16 logical cores while the 8700k has 6 physical and 12 logical. Yet looking at the geekbench scores for these both in multithreading performance you see the 2700x get a score of around 29k and the 8700k gets a score of around 26k. The 2700x only scoring 10% faster despite having 2/4 more cores. Combine that with the increased power usage over the 8700k and many would say the 8700k comes out on top.

    • Sakkura says:

      105W vs. 95W in TDP is a pretty small difference. Even more importantly, in many workloads the 2700X uses less power than the 8700K because it can run lower clocks/voltages and so on.

      link to tomshardware.com

      3 different load scenarios, the 2700X uses less power than the 8700K in every single one. Granted, it does use more power at idle… one whole watt extra.

      • tekknik says:

        That’s definitely a plus, and I was only looking at TDP instead of real world usage. Still I wish AMD bumped the clock rate a bit or something to push multicore to beat the 8700k by more but here’s hoping to AMDs next gen procs. Despite what it may sound like I am rooting for them.

  7. Dandadandan says:

    Built a 1700X system last year and cleaned myself out. Wouldn’t be bothered to upgrade for this gen really, but a bet with a mate involving kicking cigarettes has a 2700X as the prize for me if I make it to 100 days fag free… I buy it for him should I fail.

    93.5 days to go!

    • fleet hassle says:

      Hold fast. The cravings come in waves, and every time you make it through one without breaking down, you will get stronger. For me they peaked at about a week in, then slowly diminished. Occasionally a strong urge to smoke would come on, especially if out drinking; you may want to figure out what your best distraction strategy is and have it ready. Good luck!
