Week In Tech: The Bifurcatin' PC, Nvidia Spoils AMD's Party
Computers, Computers.
With AMD making noise lately with new(ish) graphics cards and the threat of console-derived gaming domination courtesy of Mantle, the inevitable has happened. Nvidia has hit back. Predictably there's a new and pointlessly pricey graphics chipset to take on AMD's mighty Radeon R9 290X. Of more interest to us mere financial mortals is a range of broader technologies and updates, one of which is alleged to deliver the smoothest gaming mankind has ever seen. Meanwhile, is there a worrying new trend in the PC's technical development? Certainly, there are early signs that a split in the hitherto relatively happy community that is the PC platform itself is becoming a realistic threat...
The chasm erupts. Well, possibly.
Firstly, that bifurcating PC thing. Dunno about you, but it seems to me that many of the major developments in PC tech of late are pulling in opposite directions.
First up, we've got AMD's Mantle threatening to make Nvidia GPUs second-class citizens when it comes to playing console ports. Just typical GPU-war stuff that will amount to nothing? Maybe. But what about the Steam Box and SteamOS?
As things stand, from a GPU perspective that's the opposite of Mantle and very much favours Nvidia thanks to the latter's generally perceived performance advantage in Linux. Then there's the whole Windows versus Linux pissing contest that's implicit in SteamOS. What if the next Half-Life title is SteamOS only? That's what I'd do if I was Valve and wanted to encourage SteamOS adoption.
OK, things like dual-booting PCs might mitigate some of this. But there's still a risk that configuring a PC could get seriously tricky in future. Are console ports your bag? How badly do you want to play that SteamOS-only title? Whatever, the implication is that whatever PC you go for might involve some serious compromises. That can't be good. It's bad enough having to buy multiple consoles if you're a committed multi-platformer. But multiple PCs, too? Anyway, just food for thought.
Nvidia's spoiler
So, Nvidia's new graphics. That'll be the GeForce GTX 780 Ti. What is it, exactly? We know it's out on 7th November for $699 which means £500 and up in Blighty. We know it exists as a spoiler for AMD's new uber GPU, the Radeon R9 290X. We know it will be pitched above the GTX 780.
We can probably assume it will be faster than the 290X; that's the only reason for it to exist. Depending on how you read Nvidia's bumf, it should be faster than Titan as a pure gaming card. The latest rumours suggest it may even be the full GK110 chip unleashed, with all of a scarcely comprehensible 2,880 shaders enabled. If so, where that leaves Titan is a tricky question. At this stage I'm not entirely clear whether Titan lives on.
GK110 in all its 2,880 shader glory?
The 780 Ti almost definitely won't give you full-speed FP64 double-precision performance as per Titan. It will likely be hobbled to 1/24th speed like the 780. Which doesn't actually matter for games and would allow Titan to remain as a sort of GPGPU enthusiast card. And, as above, the 780 Ti will be very expensive. So not hugely relevant in itself, though it does give us a little trickle-down goodness as the 780 and 770 boards are now going to be a bit cheaper. Already we're seeing sub-£400 780s and near-£200 770s. That's very good news.
Big, green smoothie
Still, arguably the more interesting bits to come out of Nvidia's recent announcements (which we touched on earlier) kick off with something called G-Sync. The idea here is to maximise 3D rendering and in turn gaming smoothness by fully syncing the GPU with the display.
This is possible to a degree already courtesy of v-sync settings in-game or via the graphics driver. But v-sync is a very blunt tool and only really works at certain, shall we say, refresh rate steppings.
Chips with everything: You'll need both an Nvidia GPU and a monitor with a special chip to enjoy the alleged buttery smoothness of G-Sync.
Let's say your monitor is running at 60Hz. You can sync at 60Hz and if your GPU is capable of producing at least 60 frames per second at all times in whatever game you're running, everything is pretty much golden.
However, should it drop below 60 frames per second, you get problems, including stuttering with v-sync enabled or stuttering and tearing with it switched off. It's a bit more complicated than that, but the bottom line is that the mismatch inherent in a display with a fixed refresh rate being fed a variable frame rate by the GPU means stuttering is usually going to creep in.
The solution is to dynamically match the display refresh to the GPU's output. And that's exactly what G-Sync is. Unfortunately, to achieve that requires hardware built into the display. In other words, you'll need a new monitor, though the slightly better news is that G-Sync works with any Nvidia GPU from the 650 Ti upwards.
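To put rough numbers on that mismatch, here's a toy simulation (my own illustrative sketch, nothing official from Nvidia, and the figures are made up for the example) of a fixed 60Hz panel being fed a steady 50 frames per second with v-sync on. Every so often no new frame is ready at refresh time, so the previous frame gets shown twice and you perceive a stutter. A variable-refresh scheme in the G-Sync mould instead waits for each frame, so every frame is shown exactly once:

```python
# Toy model: a fixed 60Hz panel (one refresh every ~16.7ms) fed frames
# that each take 20ms to render, i.e. a steady 50fps. Illustrative only.
REFRESH_MS = 1000 / 60   # fixed display refresh interval
RENDER_MS = 20.0         # time the GPU needs to render one frame

def vsync_frames(num_refreshes=12):
    """Index of the newest completed frame at each refresh tick. A
    repeated index means the panel re-showed an old frame: visible stutter."""
    return [int(i * REFRESH_MS // RENDER_MS) for i in range(1, num_refreshes + 1)]

frames = vsync_frames()
repeats = sum(1 for a, b in zip(frames, frames[1:]) if a == b)
print("frame shown at each 60Hz refresh:", frames)
print("repeated (stuttered) frames:", repeats)

# A G-Sync-style panel refreshes the instant each frame is finished, so
# the presentation interval simply tracks the render time: a uniform
# 20ms, every frame shown once, with no repeats and no tearing.
print("variable-refresh interval: every frame once, %.0fms apart" % RENDER_MS)
```

Run it and you'll see one frame shown twice within just a dozen refreshes; with a nastier, genuinely variable frame rate the repeats land irregularly, which is exactly the judder G-Sync sets out to kill.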
Putting the technicalities to one side, the question is whether the quest for smoother frame rates is worth all this effort. Personally, I reckon the requirement for a new monitor is approaching deal-breaker territory.
G-Sync's need for proprietary hardware is a definite downer.
I absolutely get the attraction of super-smooth gaming. I'm a big fan of 120Hz-plus monitors for just that reason. But I also think if you have a beefy GPU, the benefit of G-Sync is going to be pretty marginal. You're already getting pretty darn smooth rendering.
Where it might well help is lower down the performance scale where budgets are tighter. But then that proposition is compromised by the need to buy a new screen. Whatever, G-Sync isn't an unambiguous win for gamers.
Well, not if we're talking flat-panel displays. Shift the context to virtual reality kit like the Oculus Rift and G-Sync could be a killer feature. Maintaining the illusion of being fully immersed in a VR world is precarious stuff and visible frame rate stuttering is the VR equivalent of popping the proverbial red pill. Suddenly, the illusion evaporates.
Some other stuff...
The remainder of the noise coming out of Nvidia is of relatively niche interest for now. Nvidia has brought all its game-streaming tech – so, both local streaming from your PC to your Nvidia Shield, not that you likely have a Shield, and cloud-based streaming – under a single brand called GameStream. There's also a new Console Mode for the Shield which allows for 1080p streaming to an HDTV but requires a wired end-to-end connection to achieve it. At this stage, the attraction of this somewhat eludes me.
This is wot the ShadowPlay control interface looks like. Er, that's it!
For all you game-capture and machinima buffs, there's also a beta release of GeForce ShadowPlay. It's basically an in-game video capture feature that uses hardware encoding to reduce the overhead to nearly nothing. The idea is that it means you can capture whenever you fancy and without any worries re borking your frame rates. As far as I'm aware, it's a feature that's simply switched on and available for free, so it can only be a good thing.