CES 2018: AMD unveils new Ryzen CPUs and Nvidia takes gaming to the big screen


The yearly tech fest that is the Consumer Electronics Show (CES) doesn’t officially start until tomorrow, but both AMD and Nvidia kicked things off early this weekend, detailing what’s in store for the rest of the year regarding their latest graphics cards, CPUs, and… “big format gaming displays”? Read on for a potted summary of all the big important bits you might have missed.

Starting in the AMD camp, we finally saw the announcement of the first Ryzen desktop processors with built-in Vega graphics, a glimpse of Ryzen's second gen desktop CPUs that are due to launch this April, plus a full line-up of AMD's Ryzen Mobile chips that will be going into forthcoming laptops. Graphics news was a little light, all told, suggesting its new Navi-based cards are still a way off from an official unveiling, but AMD did say it will be introducing its first 7 nanometer (nm) Vega GPU, aimed at machine learning.

Going back to that first announcement for a moment, AMD said its new Ryzen-Vega processors – the quad-core 3.6GHz Ryzen 5 2400G and quad-core 3.5GHz Ryzen 3 2200G, to be precise – would have the highest performance graphics engine in a desktop CPU. That's a promising claim considering they'll cost just $169 and $99 respectively when they launch on February 12th.

AMD touts a score of around 5000 in 3DMark 11's performance benchmark for the Ryzen 5 2400G, easily beating the 3100-odd figure produced by Intel's integrated graphics on the Core i7-5775C in the same test, so it should theoretically provide a competent gaming experience at 1080p without the need for a discrete graphics card. Throw in FreeSync support and the fact they'll fit into all AM4 motherboards, and the G-chips could suddenly become quite a potent combo for budget system builders and those after teeny tiny gaming PCs, such as the one below that was shown off during AMD's press conference.

Phenomenal cosmic power, itty-bitty living space

Then, just two months later in April, AMD will release its second gen Ryzen chips, Ryzen+. While it doesn't look like these 2000-series CPUs will deliver a huge upgrade over the current gen, they should be more power-efficient thanks to their 12nm manufacturing process, and Precision Boost 2 support means they'll also be better equipped to regulate their turbo clock frequencies under load. They won't require a complete system rebuild, either, as they'll run on current 300-series chipsets and slot into every AM4 motherboard.

That said, AMD will also be introducing a new X470 chipset that's supposed to be fully optimised for Ryzen+, with new boards using less power and providing better overclocking opportunities than their X370 counterparts.

As for AMD graphics news, most of it was about machine learning gubbins for cars that will mean little for PC gaming, so anyone hoping we might see Vega replacements for the RX 500 series will be sorely disappointed.

Still, at least it was a bit more exciting than what Nvidia had to show for itself, which was also largely about cars. Indeed, gaming only got a modest bit of stage time this year, but the two main announcements were these: 'big format gaming displays' (BFGDs) are coming this year, and Nvidia's cloud-based GeForce Now service, which was first announced for PCs and Macs at last year's CES, is now finally available as a free beta for most Windows users in Europe and North America.


Sticking with GeForce Now, well, er… now, this should (finally) be a huge boon for those looking to play games on their ageing laptops without sinking thousands of pounds/dollars into a separate machine for gaming. With it you'll be able to play all your games from Steam and Uplay via the cloud without the need for powerful discrete graphics. Games can be streamed at up to 1080p and 120fps, all driver and patch updates are installed automatically, and cloud saves are enabled for cross-platform play.

Nvidia's BFGDs, on the other hand, will need to remain firmly at home. These giant 65in screens will be packed with all the latest tech, including a 4K resolution, a 120Hz refresh rate, HDR support, Nvidia's variable refresh rate G-Sync tech and its very own built-in Nvidia Shield streaming machine.

All the displays, which are being made by Acer, Asus and HP for now, will have a peak brightness of 1000 nits (cd/m²) and will cover the full professional DCI-P3 colour gamut, so they should look pretty damn nice when they launch this summer. With an Nvidia Shield onboard as well, you'll also be able to stream your PC games to the display, play the latest Android games, and get built-in Netflix, Amazon Prime Video and all manner of other entertainment apps for when you decide you need a break from all that big-screen gaming.

No word yet on how much these things will actually cost, but when your typical 65in TV usually costs in the region of £1000-1500, I wouldn’t be surprised if they end up being north of two grand when they finally hit stores. I’ll keep you posted.


  1. shaydeeadi says:

    Nvidia potentially setting a trend on big, high quality gaming monitors is welcome news indeed. Might make them more affordable overall as some of them are crazy expensive compared to TV costs.

  2. Blowfeld81 says:

    I am not sure about those gigantic monitors. You will have to sit quite far from them, I guess, as otherwise you will miss information at the border of the screen unless your eyes move quite far.

    I would be more comfortable with screens around 40–50 inch with the mentioned features, without bleeding or glow or G-Sync flickering, and at a proper price below 2k Euro. ;)

    • Solidstate89 says:

      Those monitors come with built-in nVidia Shield capabilities. It seems to me they're being sold as “monitors” because they have no built-in scalers or tuners like a TV would. They would rely entirely on hooking up something like an HTPC, especially since nothing else supports G-Sync except an nVidia graphics card.

  3. Carr0t says:

    40-50 inch is still way too big for me. My living room has a 37″ TV in as anything bigger just looks oversized in there. The office is smaller, and my monitors are currently 24″. I’m planning an upgrade to 27″, but I wouldn’t want to go any larger than that. I just want 4K 120+Hz to come out in that range too, as I’d want any new monitor to last me a good 10+ years.

  4. mattevansc3 says:

    Unless games start allowing you to set the UI based on screen size, not resolution, I won't be buying into it.

    I've tried gaming on my TV and the UI, especially written words, never scales for how far away I'm likely to sit from the TV, so it just comes across as stretched and blurry.

    • juan_h says:

      Some games are better than others in this respect. Anything action-y or meant to be played with a controller is probably going to be fine. In my experience, the games that really suffer on big screens are strategy games and PC-centric RPGs. At 1080p on a 42″ television, Crusader Kings II is an eye-straining, headache-inducing experience. I have to drop the resolution to 1360×768 in order for it to be endurable. I would like to drop the resolution even further, but if I did there wouldn't be enough screen space for all the things that I want to see simultaneously. There are mods for CK2 that will let you change the font but none that make the font bigger. I am forced to assume that doing so would break the UI.

  5. GrumpyCatFace says:

    Yes, I should definitely purchase a new CPU, before they fix the Meltdown/Spectre issue, to be sure that I can purchase another one after the fix.

    This makes good sense to me.

  6. phuzz says:

    I’ve got a 24″ monitor (well, two, and one’s an ultrawide so it’s actually closer to 27″) and that’s quite big enough for me at desktop distances. Higher resolution would be nice.
    I’d rather have smaller pixels close to my face rather than big ones in a massive TV on the other side of the room.

  7. fish99 says:

    I could go up to 30″ for a desktop monitor (currently using 24″), but beyond that is just a bit silly, unless you’re sitting across the room, in which case a TV would be cheaper.

    I’m sure some people will like them though.

  8. Imperialist says:

    While i own both AMD and Nvidia products (which i have both praise and condemnation for), i will say it sure looks like AMD has a case of “We are actually innovating and doing something that people will like/use, and it will be affordable by average citizens who dont dwell on mountaintops of marble”.
    Then Nvidia comes around and is like “we are making something that is probably going to be so expensive, only 3 people will ever use it, but we sure are great right”.
    AMD has been on the ball these past few years, and their Ryzen and Vega products show promise that may need a bit of refinement.

  9. Fade2Gray says:

    My takeaway from Nvidia is that it’s still safe to upgrade to a current GTX card.