AMD Ryzen 3 2200G review: The Vega CPU with 1080p gaming chops

AMD Ryzen 3 2200G

Graphics card prices continue to outrage and frustrate almost every PC person on the planet. No one likes spending more than they have to in order to play the newest, shiniest games, but the current cost of GPUs is almost enough to make you want to throw your PC out the window and turn tail to join the console brigade. It’s that bad.

Before you do that, though, you’ll be pleased to hear there’s some very good news to be found in AMD’s recently released Ryzen Vega CPUs. Thanks to their built-in Radeon Vega graphics – Vega being the same name given to AMD’s top-end GPUs like the Radeon RX Vega 64 and Radeon RX Vega 56 – both the quad-core 3.5GHz AMD Ryzen 3 2200G on test today and the quad-core 3.6GHz Ryzen 5 2400G (which you’ll be hearing more about later this week) offer a surprisingly decent stab at 1080p gaming without the need for dropping hundreds of pounds on a dedicated card.

Now, let’s not get ahead of ourselves. Despite sharing the same Vega name as AMD’s premium GPUs, these integrated chips are very much aimed at budget builders and small form factor PCs. You’re not going to be using them for 4K or 1440p gaming, and even at 1080p you’ll probably have to drop the graphics right down to their most basic settings in most modern releases.

But if you’re looking to build or buy a basic PC for your kids, or just want something you can play Rocket League and undemanding esports-y games like CS:GO and Dota 2 on without breaking the bank, then the Ryzen 3 2200G is a great place to start – particularly when this entry-level chip costs just £84 / $105. That’s a heck of a lot cheaper than buying an Nvidia GeForce GTX 1050Ti, for example, which is currently our recommended best graphics card for smooth gaming at 1080p under £200.

AMD Ryzen 3 2200G size

Even better, the 2200G will happily slot into any existing AM4 motherboard, and it comes with its own AMD Wraith Stealth cooler with a pre-applied layer of thermal paste in the box, making it incredibly easy to slot in and get going.

For the purposes of this review, AMD provided me with a Gigabyte GA-AB350N Gaming Wi-Fi motherboard, along with 16GB of G.Skill Flare X DDR4 RAM. Admittedly, first impressions weren’t brilliant, as an issue with the motherboard’s Wi-Fi kept making the whole thing crash. As a result, you’d do well to update your motherboard’s BIOS before installing the Ryzen 3 2200G, as AMD have acknowledged that some AM4 motherboards won’t support it straight out of the box.

Wi-Fi issues aside, though, the Ryzen 3 2200G certainly felt pleasingly nippy in day-to-day use. AMD have said it’s really designed to replace the Ryzen 3 1200 as they start to usher in the rest of their new 2000-series Ryzen CPUs, but paired with 16GB of RAM I found the 2200G exceeded the speed of its intended predecessor quite comfortably when I put it through Geekbench 4’s benchmarking suite.

Here, the 2200G managed a score of 4016 in the single core test and 11428 in its multicore test. Going by Geekbench’s own benchmarking charts, that actually puts the 2200G more on par with AMD’s Ryzen 5 1400 / 1500X CPUs as well as Intel’s 7th Gen Core i5-7400, which isn’t bad considering the 2200G’s significantly smaller outlay.

AMD Ryzen 3 2200G side

I was suitably impressed with the Ryzen 3 2200G’s gaming performance as well. Some games will naturally be beyond its modest, entry-level ambitions – don’t expect to start playing Final Fantasy XV on it, for instance – but when I took it for a spin with a couple of today’s better optimised 3D games, I was pleasantly surprised by some of the results.

In Wolfenstein II: The New Colossus, for instance, rattling through reams of Nazis felt as smooth as butter on Low at 1920×1080, and even Medium felt wonderfully stable. Admittedly, Steam’s in-game overlay was throwing a wobbly at time of testing, so I wasn’t able to get an accurate reading on either setting’s average number of frames. However, both felt like they were easily punching above 30fps, with Low almost certainly hitting the full 60fps.

Hitman also put in a reasonable 40fps average on Low at 1080p. Admittedly, I had to drop down to 1280×720 to get Agent 47 strolling at a smooth 60fps on the Parisian catwalks, but nudging the settings up to High at 720p still produced a perfectly playable 45-odd fps in even the busiest of crowded bar scenes.

AMD Ryzen 3 2200G complete motherboard

Another game that typically performs well in benchmarks is Doom. Sadly, this was beyond the Ryzen 3 2200G’s means, as I wasn’t even able to squeeze out a steady 25fps on Low at 1280×720. Yes, the 2200G can run it, but attempting to play a game as fast-paced as this at anything less than 30fps is really a disservice to the sacred art of demon skull crushing.

The 2200G also began to reach its limits in The Witcher III, as 1080p on Low only garnered a rough, choppy average of 25-30fps. Fortunately, you can still get Geralt slicing and dicing at 60fps on Low at 1280×720, and even a decent 40-50fps on Medium at 720p. Likewise, those after slightly higher resolutions can expect to see around 35-40fps at 1600×900, albeit still on Low.

The Ryzen 3 2200G isn’t a complete replacement for a dedicated graphics card, then, and probably won’t provide a huge amount of future-proofing for big releases further down the line. Still, being able to run something as recent as Wolfenstein II is pretty impressive stuff for an integrated graphics chip, and anyone whose game library largely consists of older shooters and RPGs will be absolutely laughing. Skyrim at 60fps on High at 1080p, you say? The 2200G can do that and more, as 2560×1440 monitor owners should also be able to enjoy some 60fps Fus-Ro-Dah action on Low as well.

On balance, it’s probably worth spending the extra £45 / $55-odd and getting the 2200G’s more capable sibling, the Ryzen 5 2400G, but those looking to keep costs down certainly won’t be disappointed. Whether you’re after a cheap system for the kids or you’ve simply got to the point where you really need to upgrade your PC but don’t have the money for a dedicated graphics card, the Ryzen 3 2200G is both a decent stop-gap while you wait for graphics card prices to come back down and a great little CPU in its own right. It comes recommended.

25 Comments

  1. Carra says:

    I’m wondering who the target audience is for these integrated GPUs. Most PC gamers will have a GPU.

    My Intel CPU has an integrated GPU. Never used it as I have a GeForce 1060 besides it.

    • Artist says:

      What a smart piece of wisdom! So, if it doesn’t target gamers… who might the target be then? I mean – are there even targets other than gamers out there? Anybody? Who the heck uses all those integrated Intel GPUs that have driven every game developer mad for years? What’s this insanity about?

      • DeepFried says:

        There’s basically no one but gamers building home PCs these days; everyone else has a laptop, if they have a computer at all and not just a tablet/phone.

        • Jievo says:

          I thoroughly disagree. I know plenty of people who do the odd bit of gaming but are by no means “gamers”, who play on desktop PCs for genre or convenience reasons. Most people who do a bit of light gaming aren’t going to shell out on a console, and consoles are highly restrictive in what genres and tastes are catered to. There is still plenty of market for low-mid tier gaming-capable PCs, even I have a high-mid tier pc and am quite happy with doing lots of my gaming at 720p, and I’m definitely a pc gamer (though I hate the word and the connotations). There are many, many reasons people like a desktop pc, and it’s a tragedy that they are becoming rarer in favour of laptops and tablets – innovations like this have loads of potential.

    • pack.wolf says:

      There is a world of difference between an Intel iGPU and this thing, both in terms of performance and compatibility with games.
      According to the Steam hardware survey, 10% of Steam users have a 750 Ti. I’m pretty sure this CPU could play in the same ballpark. So that’s a few hundred thousand people to start with.
      It’s also nice to have an iGPU when using PCIe passthrough to a Windows VM to get native graphics performance, while running Linux as the host- and everything-except-gaming-OS on a second monitor connected to the iGPU. I do admit that I’m a niche audience though ;)

      • keithzg says:

        Ooh, nice; what virtualization setup are you running?

        (Hopefully you’ll actually notice this, I know for myself for whatever reason email notifications haven’t worked for literally years.)

        • pack.wolf says:

          I run an OVMF KVM machine using libvirt-daemon-system on Debian. It gets its own graphics card (Nvidia 970), hence also its own sound via HDMI, and a PCIe-USB3.0 card for hotplugging.
          The most annoying part of getting it running was that Nvidia purposely broke their driver twice by explicitly checking for the KVM CPUID bit and the hypervisor name. Current versions of libvirt can instruct KVM to hide/rename these though, and with all the other HyperV bits it can tell Windows about, there’s no funky behavior, as Windows is still made aware of being virtualized.
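          The hypervisor-hiding tweak described above boils down to a couple of elements in the guest’s libvirt domain XML – a minimal sketch, assuming a reasonably recent libvirt/QEMU (the vendor_id value is an arbitrary placeholder):

          ```xml
          <!-- Fragment of the guest's libvirt domain XML (edit via `virsh edit`).
               <hidden state='on'/> stops QEMU advertising the KVM CPUID bit, and
               a custom hyperv vendor_id masks the hypervisor name, so the Nvidia
               driver in the guest no longer bails out on detecting a VM. -->
          <features>
            <hyperv>
              <relaxed state='on'/>
              <vapic state='on'/>
              <vendor_id state='on' value='anything'/>
            </hyperv>
            <kvm>
              <hidden state='on'/>
            </kvm>
          </features>
          ```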
          And yes, mail notifications are still broken :D

    • tekknik says:

      link to en.m.wikipedia.org

      Writing code to utilize the GPU is/will be easier on AMD CPUs as they start realizing more of their HSA dream.

    • ColonelFlanders says:

      Well for one, the Ryzen integrated graphics, even on the cheaper Ryzen 3, is absolutely blowing the Coffee Lake i7 out of the water. For example, Shadow of Mordor runs on Ultra at about 25fps on the Ryzen 3, while the i7 manages about 10. So it’s a step in the right direction, especially since you’re getting CPU and graphics for a paltry 87 quid.

      Which leads me to the answer to your question – target audience. My guess would be people who want to game on PC but can’t afford to, or maybe PC gamers who have a decent rig but want to build a secondary one for their kid so they don’t have to share screen time.

      All in all this is a step in a very positive direction on AMD’s part, and if this chip is a success and sells well, we’ll only be seeing huge improvements in both performance and price in years to come. I’ll not be buying one as I don’t need it, but I can’t help but root for them – if nothing else but to keep Intel and Nvidia on their toes.

    • MiniMatt says:

      A 2400G looks to have a 65 watt TDP.

      65 watts for the CPU and GPU combined, capable of running 1080p, means a functionally silent and extremely tiny gaming PC is within easy reach. And relatively cheap too.

      If your gaming is of the Stellaris & Opus Magnum variety (my most played games this month) then this may be all the GPU chops you need.

      • blur says:

        And what that in turn means to me is that this particular chip seems like the perfect solution for HTPCs. In that design space, where compact & silent rule while hardware transcoding and the occasional 3D game are the taxing jobs, this might be just the ticket. I’ve been thinking of something to replace the AMD A4 and low-profile GTX 750 (I know that the A-series APUs have graphics capabilities, but it turns out they’re dreadful) I’ve been rocking, and this might allow me to downsize even further!

    • keithzg says:

      This precise chip is what I used in a PC build at work last week. In modern PCs there are a lot of hidden bottlenecks that GPUs are required for; Windows itself even runs rather painfully slow and stuttery if you don’t have much GPU oomph, so if, for instance, you want to run a number of virtual machines for testing purposes, you’re kind of out of luck unless you have a decent amount of video performance overhead to work with. This chip is very competitive as a single price and power draw for both CPU and GPU, and at the low price point I was aiming the build at, there’s basically no competition right now.

    • Jievo says:

      I’m currently tasked with building my old man a PC for office work, music and a little gaming. The world is full of people who just want to play the odd bit of Civilization; with a chip like this I think all his needs would be more than catered to, and at a really tempting price. There is definitely a market for things like this – PC gaming is not just the domain of graphics enthusiasts. There are a LOT of people who game on PC because they like genres that aren’t represented on consoles (strategy, for example), or because they need to own a PC anyway and don’t do enough gaming to justify a console purchase.

    • PseudoKnight says:

      There’s a huge market for this in the desktop space. It’s great for an HTPC, an entry-level gaming PC (not everyone needs to play the latest modern 3D games), and generally for a cheap system that provides a better performance profile for common tasks on a high resolution display. And why onboard video when you have a dedicated card? For backup. When my previous video card gave out, I had bought a motherboard with onboard video (for this very reason) that allowed me to continue to use my machine while I shopped around for my next card and waited for it to ship. There are also ways of utilizing these new chips in parallel with your dedicated card (though not on the same tasks, as that tends to be worse).

      • PseudoKnight says:

        Also worth mentioning that because of the inflated GPU prices right now, these chips are a great way to get into PC gaming now while waiting for prices to stabilize again.

    • megaboz says:

      @Carra
      Um, users that want a decent gaming box, but refuse to pay borderline extortionist prices for graphics cards, maybe?

      I can’t bring myself to pay more than MSRP for hardware, and the price of even a 1060 is crazy-go-nuts these days. I (foolishly, perhaps) decided to make a new Windows machine for games this year, and went with the 2400G model. So far, I’m extremely impressed with the performance. I’m getting smooth gameplay on near full settings at 1080p in games like Prey and Echo (note: I turn off AA, motion blur, and depth-of-field). WoW at 1080p on the “10” preset option gets 40-60fps even in raids. Witcher 3 takes a hit, though — I can manage right around 30fps, with mid-20s during combat.

      I don’t consider myself “hardcore,” or anything, but this is a _very_ nice stopgap until GPUs become affordable again.

  2. Chorltonwheelie says:

    If it can’t run something as well optimised as Doom, it’s a little disingenuous to claim it has “1080p gaming chops”.
    Wishful thinking won’t create the serious contender for Intel/Nvidia that we all want.

    • ColonelFlanders says:

      Give it time. These chips are going to enable us to buy an *almost* gaming PC for around £200. That’s just amazing value, and if these processors are a success then we’ll only see the situation get better, especially if this mining bullshit goes on much longer.

    • Sakkura says:

      This IS a serious contender against Intel, and Nvidia has no answer to it whatsoever (unless you want to recompile all your games for ARM).

      This crushes Intel’s competing efforts, and Intel knows it. They are so aware of AMD’s GPU capabilities that they even made a deal to put an AMD GPU on a module with an Intel CPU (with iGPU) just recently. That was with a larger standalone AMD GPU, but AMD is still in the lead at the iGPU level.

      For reference, Ryzen 3 2200G gets 41FPS at 720p in Witcher 3, while an Intel Core i5-8400 (with HD Graphics 630) gets 11FPS. Guess which chip costs more… Yeah, the Core i5-8400 costs nearly TWICE as much.

    • truck says:

      Doom runs fine with these chips when using Vulkan; with OpenGL there seems to be a glitch and barely any processing power is used.

  3. keithzg says:

    Admittedly, Steam’s in-game overlay was throwing a wobbly at time of testing, so I wasn’t able to get an accurate reading on either setting’s average number of frames

    FRAPS to the rescue, perhaps? Is FRAPS even still around? I’ve increasingly distanced myself from Windows over the years (particularly having to maintain such systems at work, which makes it a lot less fun to do at home) and I’m realizing now I don’t even know if FRAPS still exists as a software product, particularly in a world where its central feature can also be provided by PC gaming’s biggest distribution method itself. I’d imagine there’s still a niche for it, though.

  4. Michael Anson says:

    I have the big brother for this chip, the 2400G, installed in a relatively decent personal server rig. The system pulls bonus duty as a couch-gaming PC for same-screen multiplayer and games that my wife and I might want to experience that only allow a single player, and for these roles it has proved more than adequate. The decision to use the chip was entirely budget-based, as the investment in memory and storage space left little room for a discrete graphics card, and the 2400G is quite simply the best bang for the buck not just in APUs, but in CPUs with low-end discrete cards.

    In all, no regrets on this system.

    Edit: I should add a caveat that the monster cooler that came with the chip was DOA, but monster coolers are a dime a dozen these days and the replacement of it still didn’t impact the economics of the decision.

  5. racccoon says:

    Miners are our problem:
    The problem is that some motherboard companies have now come up with boards with 19 graphics card connections for miners, which is going to blow the market to oblivion when people work it out.
    What needs to be done is for graphics card companies to make separate cards specifically for miners. Even then that’s not the whole answer, but it’s a start on our problem.
    I have heard there is going to be a motherboard with slots for approximately 45 graphics cards for mining. WTF!
    In doing all this research I also found that the 19-card board burns 3000 watts of power!!
    So by my calculations, the approximately-45-slot board is going to use 7000 watts of power!! Imagine a warehouse full! Friggin’ nuts! They are all nuts.
    I also heard they only make 700 bucks a month after costs with a 19-slot board layout; that’s a total waste of time and money, bad accounting, and bad for our environment.
    The man who invented this mining thing needs to be locked up, as he’s created a major environmental impact just for greed, in such a pathetic, stupid way, for something that is nothing!

    • DeepFried says:

      You mean blockchain/bitcoin mining? It’s basically just transaction accounting, just like any bank would do. People get paid to do the accounts in bitcoin; that’s what bitcoin mining is, accounting.

  6. Jernau Gurgeh says:

    The Witcher 3 running at 30+fps at 1600×900 with the integrated graphics of a sub-£100 65W CPU? That’s really not something to be sneezed at. Actually pretty goshdarn impressive if you ask me. Which you didn’t, but here I am anyway.