Last week we caught an early glimpse of Nvidia’s latest and greatest GPU design, known as Maxwell. We’ll have to wait a while to see what impact it has on true gaming PCs, but the sheer power efficiency of the new architecture certainly looks promising. Anywho, the Maxwell launch event was a chance to hook up with Nvidia and quiz them on a subject that’s been vexing me of late, namely the rise of proprietary gaming tech – well, mainly graphics – for the PC. What with Mantle and HSA from AMD, and G-Sync, 3D Vision and Shield-tethered game streaming from Nvidia, it feels like gaming hardware is becoming increasingly partisan. So what gives? Tom Petersen, Nvidia’s Director of Technical Marketing for GeForce, gave me the lowdown.
Right from the get go, I’ve not much liked the cut of Nvidia GameStream’s jib. You need an Nvidia GPU in the host PC, fair enough. But not only do you need Nvidia tech in the client, it’s limited to just one Nvidia-produced device, the Shield handheld console. Yuck.
That’s a pity, because there’s some definite goodness in Nvidia GPUs as regards reducing streaming latency, in the form of ShadowPlay and the on-die hardware that enables it. If you’ve bought an Nvidia GPU, shouldn’t you be free to choose the client device? What with Valve recently releasing its much more open streaming tech in beta form, you are. And that leaves Nvidia’s locked-down end-to-end solution looking pretty cynical.
“Right now our streaming tech is end-to-end with Shield and a host PC based on Kepler,” says Petersen, “but Shield isn’t a requirement for playback, so who knows what might happen.”
Petersen reckons the current pure-Nvidia-end-to-end setup is actually justified. “We aim to guarantee an experience and the most important characteristic for streaming is latency. We’re using our encoder and decoder technologies to minimise latency. If we don’t control both ends of that – the host and the client – it’s harder to guarantee the experience.”
Nvidia here, Nvidia there, Nvidia bloody everywhere
There’s logic in what Petersen says, for sure. But I still think we should be able to decide just how much we want to invest in Nvidia’s efforts to reduce latency. Take Petersen’s logic to its extreme and Nvidia would only be selling Titan Blacks, because anything less is literally sub-optimal for gaming.
And what about G-Sync, another end-to-end Nvidia technology? Couldn’t Nvidia open out compatibility on the GPU side and still make money flogging G-Sync boards to monitor makers?
“Firstly, G-Sync is not a gift to humanity,” says Petersen. “But, of course, there’s a natural tension. If we’re doing G-Sync and it only works with our GPUs, isn’t that a barrier to adoption? I totally get that. But we’re carrying the ball with G-Sync. We’ve spent a tonne of money investing in it and developing it and we’re now building the modules with partners. If we don’t get any return for that, we’re less likely to do something like this again.”
“Generally, there are two ways to think about this. Are we trying to develop a business for monitor technology or are we trying to make it so that our GPUs are differentiated versus our competitors? It’s the latter.”
Refreshingly honest, then, and frankly hard to argue with. With that in mind, my remaining objections are cost and choice. G-Sync monitors are looking expensive and they limit your choice. Petersen says six major monitor makers, including Asus, Viewsonic, BenQ and Philips, are on board, so hopefully the choice part of the equation will improve.
As for cost, in so many words Petersen suggested the monitor makers see G-Sync as a nice opportunity to make some margin and that means opening high with pricing and adjusting down if the market won’t swallow it.
Nvidia G-Sync: Seriously smooth at just 40fps
He also said the added cost of G-Sync for a monitor that’s had its scaler board swapped out for a G-Sync board (rather than adding G-Sync and keeping the existing board) is around $50. That sounds like a pretty big addition to the bill of materials to me, and that’s a shame because G-Sync is a seriously nice technology. I had another chance to see it in action that day, and the way it makes gaming at circa 40fps look super slick is very impressive and could actually save you money. No longer do you need a £500 video card kicking out 100fps-plus for perfectly smooth gaming.
While I had Petersen’s ear, I couldn’t help but poke him with the prospect of Mantle, AMD’s very own proprietary tech. What’s his take on Mantle and the promise of dramatically reduced CPU overheads in-game?
“The idea of reducing the overhead in abstraction layers throughout the software stack – that’s great. But there’s a trade-off between higher-level descriptions or layers and ease of use. Mantle as a concept sounds pretty good, but I don’t think the delivered experience so far has been that great. Microsoft may or may not view it as something to look at. If Microsoft does something with Mantle, we’d certainly support it.”
Anyway, the overall momentum at the moment definitely feels like it’s towards proprietary tech at the cost of open standards. Want Mantle and some general console-port goodness? You need AMD and GCN. Fancy G-Sync or low-latency game streaming? It has to be Nvidia end-to-end. Is that the way Petersen sees it?
“We love gaming and we’re always looking for ways to make our platform better but it has to make business sense. And I don’t think that’s incompatible [with proprietary technology]. People who value what we’re doing are happy to buy our products. If they don’t value what we’re doing enough, they’re not going to buy them.”
Ultra low-latency game streaming? Yes, please. Nvidia Shield? No thanks
That’s not to say there’s any stubbornness involved. “Shield is an example where we did a technology that’s compelling but we’ve changed the pricing as we figure out where it fits – we’re changing and we’re learning. The same thing is going to happen with G-Sync. If it turns out that we’re doing something stupid with our pricing or policy that’s preventing adoption, we’ll adapt.”
Overall, Petersen makes a pretty good case for Nvidia’s broader strategy when it comes to proprietary technologies like G-Sync or GameStream. Put simply, these technologies have to make money. It’s easy to get antsy about the steps companies like Nvidia take to make sure the things they invest in actually pay off. But when a guy like Petersen gives you straight, honest answers and doesn’t pretend that technological lock-in and end-to-end solutions are purely about looking after the customer, I tend to have more respect for what Nvidia is doing.
If there is a problem, it probably comes down to a lack of competition. With only two big players remaining in each of the core PC component markets – graphics and CPUs – market distortions are inevitable. But we should still be bloody grateful for what competition there is. Things could be an awful lot worse.