Week In Tech: The Bifurcatin’ PC, Nvidia Spoils AMD’s Party

By Jeremy Laird on November 1st, 2013 at 3:00 pm.

With AMD making noise lately with new(ish) graphics cards and the threat of console-derived gaming domination courtesy of Mantle, the inevitable has happened. Nvidia has hit back. Predictably there’s a new and pointlessly pricey graphics chipset to take on AMD’s mighty Radeon R9 290X. Of more interest to us mere financial mortals is a range of broader technologies and updates, one of which is alleged to deliver the smoothest gaming mankind has ever seen. Meanwhile, is there a worrying new trend in the PC’s technical development? Certainly, there are early signs that a split in the hitherto relatively happy community that is the PC platform itself is becoming a realistic threat…

The chasm erupts. Well, possibly.
Firstly, that bifurcating PC thing. Dunno about you, but it seems to me that many of the major developments in PC tech of late are pulling in opposite directions.

First up, we’ve got AMD’s Mantle threatening to make Nvidia GPUs second-class citizens when it comes to playing console ports. Just typical GPU-war stuff that will amount to nothing? Maybe. But what about the Steam Box and SteamOS?

As things stand, from a GPU perspective that’s the opposite of Mantle and very much favours Nvidia thanks to the latter’s generally perceived performance advantage in Linux. Then there’s the whole Windows versus Linux pissing contest that’s implicit in SteamOS. What if the next Half-Life title is SteamOS only? That’s what I’d do if I was Valve and wanted to encourage SteamOS adoption.

OK, things like dual-booting PCs might mitigate some of this. But there’s still a risk that configuring a PC could get seriously tricky in future. Are console ports your bag? How badly do you want to play that SteamOS-only title? Whatever, the implication is that whichever PC you go for might involve some serious compromises. That can’t be good. It’s bad enough having to buy multiple consoles if you’re a committed multi-platformer. But multiple PCs, too? Anyway, just food for thought.

Nvidia’s spoiler
So, Nvidia’s new graphics. That’ll be the GeForce GTX 780 Ti. What is it, exactly? We know it’s out on 7th November for $699 which means £500 and up in Blighty. We know it exists as a spoiler for AMD’s new uber GPU, the Radeon R9 290X. We know it will be pitched above the GTX 780.

We can probably assume it will be faster than the 290X; it’s the only reason for it to exist. Depending on how you read Nvidia’s bumpf, it should be faster than Titan as a pure gaming card. The latest rumours suggest it may even be the full GK110 chip unleashed, and thus unlock a scarcely comprehensible 2,880 shaders. If so, where that leaves Titan is a tricky question. At this stage I’m not entirely clear whether Titan lives on.

GK110 in all its 2,880 shader glory?

The 780 Ti almost certainly won’t give you full-speed FP64 double-precision performance as per Titan. It will likely be hobbled to 1/24th speed like the 780. That doesn’t actually matter for games and would allow Titan to remain as a sort of GPGPU enthusiast card. And, as above, the 780 Ti will be very expensive. So it’s not hugely relevant in itself, though it does give us a little trickle-down goodness as the 780 and 770 boards are now going to be a bit cheaper. Already we’re seeing sub-£400 780s and near-£200 770s. That’s very good news.
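To put a rough number on what that 1/24 ratio means, here’s some hypothetical back-of-envelope arithmetic. The shader count and the 1/24 ratio are from above; the clock speed is an assumption purely for illustration, and the 1/3-rate figure reflects Titan’s unhobbled FP64 mode as I understand it:

```python
# Back-of-envelope GFLOPS maths for a fully enabled GK110 part.
shaders = 2880        # full GK110 shader count (per the rumours above)
clock_ghz = 0.875     # ASSUMED boost clock, illustrative only
ops_per_clock = 2     # one fused multiply-add counts as 2 FLOPs

fp32_gflops = shaders * clock_ghz * ops_per_clock
fp64_hobbled = fp32_gflops / 24   # 780-style hobbled double precision
fp64_full = fp32_gflops / 3       # Titan-style unhobbled FP64

print(f"FP32: {fp32_gflops:.0f} GFLOPS")
print(f"FP64 hobbled to 1/24: {fp64_hobbled:.0f} GFLOPS")
print(f"FP64 at full 1/3 rate: {fp64_full:.0f} GFLOPS")
```

The gap between those last two figures is why games (all FP32) don’t care, while GPGPU enthusiasts very much do.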

Big, green smoothie
Still, arguably the more interesting bits to come out of Nvidia’s recent announcements (which we touched on earlier) kick off with something called G-Sync. The idea here is to maximise 3D rendering and in turn gaming smoothness by fully syncing the GPU with the display.

This is possible to a degree already courtesy of v-sync settings in-game or via the graphics driver. But v-sync is a very blunt tool and only really works at certain, shall we say, refresh rate steppings.
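To put numbers on those steppings: with strict double-buffered v-sync a frame can only appear on a refresh boundary, so the frame rates a 60Hz panel can actually show collapse to 60 divided by a whole number. This is my own simplified model (it ignores triple buffering), but it makes the point:

```python
import math

REFRESH_HZ = 60.0

def vsync_rate(gpu_fps):
    """Effective displayed frame rate under strict double-buffered
    v-sync: each frame occupies a whole number of refresh intervals."""
    refreshes_per_frame = math.ceil(REFRESH_HZ / gpu_fps)
    return REFRESH_HZ / refreshes_per_frame

# The only steady rates a 60Hz panel can show under v-sync:
print([REFRESH_HZ / n for n in range(1, 5)])   # [60.0, 30.0, 20.0, 15.0]

# A GPU managing 59fps gets snapped all the way down to 30:
for fps in (60, 59, 45, 31, 30):
    print(fps, "->", vsync_rate(fps))
```

Hence “blunt tool”: miss 60fps by a single frame and v-sync dumps you straight to 30.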

Chips with everything: You’ll need both an Nvidia GPU and a monitor with a special chip to enjoy the alleged buttery smoothness of G-Sync.

Let’s say your monitor is running at 60Hz. You can sync at 60Hz and if your GPU is capable of producing at least 60 frames per second at all times in whatever game you’re running, everything is pretty much golden.

However, should it drop below 60 frames per second, you get problems: stuttering with v-sync enabled, or stuttering and tearing with it switched off. It’s a bit more complicated than that, but the bottom line is that the mismatch inherent in a fixed-refresh display being fed a variable frame rate by the GPU means stuttering is usually going to creep in.

The solution is to dynamically match the display refresh to the GPU’s output. And that’s exactly what G-Sync is. Unfortunately, achieving that requires hardware built into the display. In other words, you’ll need a new monitor, though the slightly better news is that G-Sync works with any Nvidia GPU from the 650 Ti upwards.
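Here’s a rough sketch of why that dynamic matching matters, again using a toy model of my own (it ignores real driver queuing and triple buffering): under v-sync, a frame that just misses a refresh waits a full extra 16.7ms, while an adaptive display simply refreshes the moment the frame is ready.

```python
import math

REFRESH_MS = 1000 / 60   # 60Hz panel: one refresh every ~16.7ms

def present_times_vsync(frame_ms):
    """Toy double-buffered v-sync: each frame starts rendering when the
    previous one is presented, and can only be shown on the next
    refresh boundary after it finishes."""
    times, present = [], 0.0
    for ms in frame_ms:
        done = present + ms
        present = math.ceil(done / REFRESH_MS) * REFRESH_MS
        times.append(present)
    return times

def present_times_adaptive(frame_ms):
    """Adaptive sync: the panel refreshes the instant a frame is ready."""
    times, t = [], 0.0
    for ms in frame_ms:
        t += ms
        times.append(t)
    return times

def gaps(times):
    """Intervals between successive on-screen frames, in ms."""
    return [round(b - a, 1) for a, b in zip([0.0] + times, times)]

# A GPU hovering around 60fps but wobbling either side of 16.7ms:
frames = [15.0, 18.0, 15.0, 18.0, 15.0]
print(gaps(present_times_vsync(frames)))     # juddery 16.7 / 33.3 alternation
print(gaps(present_times_adaptive(frames)))  # the GPU's actual cadence
```

The fixed-refresh panel turns a gentle frame-time wobble into alternating short and double-length frames, which is exactly the stutter G-Sync is designed to kill.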

Putting the technicalities to one side, the question is whether the quest for smoother frame rates is worth all this effort. Personally, I reckon the requirement for a new monitor is approaching deal-breaker territory.

G-Sync’s need for proprietary hardware is a definite downer.

I absolutely get the attraction of super-smooth gaming. I’m a big fan of 120Hz-plus monitors for just that reason. But I also think if you have a beefy GPU, the benefit of G-Sync is going to be pretty marginal. You’re already getting pretty darn smooth rendering.

Where it might well help is lower down the performance scale, where budgets are tighter. But then that proposition is compromised by the need to buy a new screen. Whatever, G-Sync isn’t an unambiguous win for gamers.

Well, not if we’re talking flat-panel displays. Shift the context to virtual reality kit like the Oculus Rift and G-Sync could be a killer feature. Maintaining the illusion of being fully immersed in a VR world is precarious stuff and visible frame rate stuttering is the VR equivalent of popping the proverbial red pill. Suddenly, the illusion evaporates.

Some other stuff…
The remainder of the noise coming out of Nvidia is of relatively niche interest for now. Nvidia has brought all its game-streaming tech (both local streaming from your PC to your Nvidia Shield, not that you likely have a Shield, and cloud-based streaming) under a single brand called GameStream. They’ve also added a new mode for the Shield which allows for 1080p streaming to an HDTV but requires a wired end-to-end connection to achieve it. At this stage, the attraction of this somewhat eludes me.

This is wot the ShadowPlay control interface looks like. Er, that’s it!

For all you game-capture and machinima buffs, there’s also a beta release of GeForce ShadowPlay. It’s basically an in-game video capture feature that uses hardware encoding to reduce the overhead to nearly nothing. The idea is that it means you can capture whenever you fancy and without any worries re borking your frame rates. As far as I’m aware, it’s a feature that’s switched on and available for free, so can only be a good thing.




  1. FriendlyFire says:

    The one thing I’ll say is that I’m very curious about G-sync. One wouldn’t expect such a massive difference on beefy cards in normal circumstances, but apparently even then the impact is very noticeable (according to people who saw it in action).

    All that remains to be seen is whether the impact is the same in a (non-NVIDIA) controlled environment.

  2. Loiosh says:

    I’m curious if Mantle will be embraced by anyone except DICE, or left to die like Glide and other proprietary 3d architectures.

    G-Sync, at least, is something developers do not need to code for, for whatever good it can provide. I know that it is something I am interested in, but I’m a big vsync fan. The price of it is that I will not be able to stream when using G-Sync, since none of my HDMI splitters can handle odd framerates.

    • Lenderz says:

      Should be embraced by most EA games using Frostbite (lots of them) as it’s going to be baked into the engine.

    • whexican says:

      Aren’t they “forced” to use it since ATI has their gfx cards in all next gen console systems? Won’t devs be using the API regardless?

      • FriendlyFire says:

        Neither. The PS4 runs a variant of OpenGL as far as I know and the Xbone is using “DirectX 11.X”, which is a fancy way of saying a variant on DirectX 11. Neither uses or even supports Mantle as far as I’m aware.

        • top8cat says:

          From my understanding Mantle is baked into newer architecture and is apparently compatible with OpenGL and DirectX, as well as easy to port to and from. So if it’s the magic water they claim, there’s no reason why developers wouldn’t support it. Unless it’s more of a hassle than EA and AMD are suggesting, but if it also works… well, that’s another thing altogether. No real reason to see AMD’s Mantle as anything but hot air until December, when it will live or die.

          If it works, AMD have absolutely nothing to worry about; it’ll be Nvidia that will have to convince developers to use their proprietary API (let’s face it, they’d rather crash and burn than use Mantle), which will be a constant uphill battle with AMD dominating the most important battleground, consoles. Nvidia’s ’54% PC share’ or not.

      • SuicideKing says:

        Apparently not. Microsoft publicly stated that Mantle isn’t available on the Xbone, and AMD later confirmed that the consoles weren’t using Mantle.

    • SuicideKing says:

      That’s my impression too, Sweeney and Carmack were pretty pessimistic about it.

      So Unreal and idTech don’t support it, and I don’t think Source 2 will either. Not sure about CryEngine.

    • Geebs says:

      Given the amount of toys leaving the pram in console-land re: the Xbone’s lousy, downscaled performance and the PS4’s pretty poor, marginally less downscaled performance, I’m not exactly bullish on the whole DICE+AMD = amazing!!! concept

  3. babajikibooti says:

    Epitome of paid journalism!

    • Solidstate89 says:

      If you think this is what paid journalism looks like (much less the epitome of it) you’ve literally never seen it before.

    • Ricc says:

      I’m not even sure which of the companies you are accusing the author of being biased towards. That should tell you something.

    • remon says:

      Fully agree. This is one huge Nvidia ad. Three mentions of AMD’s latest card, a card that’s already been released but not covered in any RPS post. One of those mentions is to say that it will probably be slower than the 780 Ti, a card that hasn’t been released yet.

      No mention, of course, of the 290X’s performance being close to, and at higher resolutions better than, Titan’s.

      G-sync again, which has been covered before in its own article. Same with Shadowplay. Either paid, or well, I don’t know what else…

      • stoopiduk says:

        Could be poor PR on AMD’s part, could be that Jeremy was asked to write a piece just as the Nvidia announcement came out, and this was the most concise way of summing up the offerings from both parties, could be that he has a personal preference for Nvidia components and feels less comfortable waxing lyrical about AMD.

        Could be that you are far more intelligent than anyone else here, that we all blindly accept every word on RPS as gospel and that none of us have the capacity to further research hardware before making our buying decisions.

        My word, you may well have saved us all.

      • LukeNukem says:

        The article that is linked to in the first line is an article from RPS, from only three weeks ago, entirely about AMD. Where were your libelous accusations then?

      • airmikee99 says:

        The 290 wasn’t covered by RPS?


        Oh, I get it, you meant it wasn’t covered by Ronny’s Pilsner Service. Or you don’t know how to use a search engine. Which is it?

        • GamesInquirer says:

          That article mostly talks about AMD’s rebranding of old cards in line with the naming convention of their new card (7970 -> 280X), and how the new card (290X) is just the same as every other card but with more power, or something. As if we get one for something other than standards compatibility and great performance. It’s altogether negative, and nothing seen on here would imply AMD’s rise to at least parity with Nvidia’s best for a far lower price.

          This article still doesn’t mention that, despite price cuts, AMD in most gaming performance tiers is quite a bit cheaper than Nvidia; it only talks about how Nvidia just spoiled things for AMD by doing this and that. I’m not suggesting either article was paid (for all I know they could be equally positive or negative for both companies based on the current mood of the writer, assuming the same person did both related pieces), but their coverage on RPS in general seems hardly balanced because of these. Even the speculation over Nvidia’s next models is quite positive in that they will once again surpass AMD, with no talk of price, although in the previous article it was an issue that the 280X had suggested retail pricing higher than the clearing-the-shelves-for-new-stuff 7970s. I have no brand loyalty, mind; my last GPU was a GTX 285, my current is a 7970, and in 2~3 years I’ll get whatever offers the best bang for my euros as always.

      • Hahaha says:

        “No mention of course about 290X’s performance being close to, and at higher resolutions better, than Titans. ”

        You’re comparing it to a Titan; your opinion is invalid

          • neurosisxeno says:

            Now compare the power consumption, temperature, and noise to the Nvidia cards and you can see what lengths AMD had to go to just to compete. The average max load temp for a GTX 780 and Titan is in the ballpark of 78-80C; for the 290X it’s 95C, where it hits a brick wall and throttles the GPU clock substantially to keep temperatures low enough to not melt the GPU. On top of that, the fan is a pleasant 55dB under max load, which is the equivalent of your computer having a conversation with you. It takes 3-4x as much power to play back a Blu-ray with a 290X than with a GTX Titan (78W vs. <20W), and for general gaming under full load it uses 50W+ more power than a Titan. While the price-to-performance of the 290X is great, the other aspects of the card are honestly pretty underwhelming.

          • GamesInquirer says:

            Who even buys reference cards these days? I haven’t bought anything directly from AMD or Nvidia in a decade; it’s mostly been Gigabyte, MSI, etc. I haven’t even seen reference cards in quite a long while. Other companies have their own great cooling solutions, like Windforce and the like. It’s worth noting that AMD don’t let the fan spin up to anywhere near its maximum, probably because the fan is loud enough that running it flat out would make the card sound even worse. Reviewers who bothered to set the fan speed to 80% or 100% manually reported much lower temperatures, which bodes well for all the third-party cooling solutions.

            Anyway, the 290 has also been reviewed and appears to offer performance pretty close to Titan and the 290X, surpassing the 780, for just $400 or less. That will probably be the one to watch for most, similarly to the 8800GT back in its day. So yeah, the only real “lengths” I see is a poor choice of fan for the reference cards, which won’t affect the majority of buyers, even less now that it’s been a widely acknowledged issue which means more people will wait for the third parties to put their stuff out.

            AMD also announced three more developers that will use Mantle: Eidos Montreal (the last Tomb Raider and Deus Ex, and the next Thief), Oxide (Stardock studio with Civilization vets) and Cloud Imperium (Star Citizen, CryEngine), on top of EA/DICE.

    • phuzz says:

      I do not think that word means what you think it means.

    • FriendlyFire says:

      Aye. How dare the guy accept payment for his work? He should be doing all of this for free!

      This is an outrage.

    • SuicideKing says:

      Ok, not sure if spam. The account is made by an Indian (inferred from the name), after a song about weed with visualizations suggesting the writers know their LSD.

  4. frightlever says:

    Just wanted to say I love these hardware articles. Unless you keep half-aware of what’s going on it’s so easy to get completely lost in the buzzwords.

  5. realitysconcierge says:

    I kind of feel like nvidia is taking a page out of apple’s book with all of this needing consumers to exist within their ecosystem and nowhere else.

    • ZeDestructor says:

      Not really… Nvidia is just extending the DisplayPort protocol a bit (nothing VESA can’t add in natively) and Shadowplay is just FRAPS/Dxtory, but using the GPU to encode the video rather than the CPU. You could implement that yourself if you want to… but nobody has bothered yet. Maybe now someone will, and add in support for AMD and Intel chips.

      GameStream is proprietary, but the basic concept has been around for ages in the shape of VNC and thin clients (like Citrix and VMware) with a central virtualized/shared core. All Nvidia has done is make it work properly. There is literally nothing stopping people from doing the same with an open protocol, the main issue being getting the latency down.

      • realitysconcierge says:

        I stand corrected! That was very concise ZeDe

        • ZeDestructor says:

          No problem.

          Honestly, aside from G-Sync (that actually requires some hefty re-engineering work for the controllers), I’m rather disappointed the rest of the industry did nothing to take advantage of the technical capabilities… :/

      • AsamiImako says:

        It’s not exactly as simple as “using the GPU to encode rather than CPU.” Kepler cards literally have hardware dedicated to this exact purpose, which is how it doesn’t really impact game performance either. Sure you could write an H.264 encoder that uses CUDA or OpenCL instead of the CPU, but that still wouldn’t achieve exactly what ShadowPlay does. That said, I do hope they open up access to the hardware encoder so that other programs like OpenBroadcaster or XSplit or Evolve can use it for their streaming/video recording tech.

        • SuicideKing says:

          AMD has a similar DSP for audio, and I believe they also have a video encoder with GCN 1.1.

          Heck, even Intel has QuickSync.

        • ZeDestructor says:

          Technically the encoder is part of the GPU. I was simplifying for brevity.

          In any case: “Like Quick Sync, NVEnc is currently exposed through a proprietary API, though Nvidia does have plans to provide access to NVEnc through CUDA.”, from http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-16.html

          There just hasn’t been any interest so far from the community in building an NVENC powered streaming tool. Hopefully this will change, cos I’d quite like similar support under Linux…

      • jrodman says:

        I disagree with the summary of Shadowplay. One of the tricky things is efficiently acquiring the stream of screen update information. When the access is being made in the video card driver or on the video card itself, this is much more achievable than when you have to find some way to hook into the operating system. The latter requires either copying all the video data back off the card, or simulating the card’s actions in some space-brain software thing.

        So I think it’s genuinely more difficult by far for third parties to achieve what they’ve achieved, partly because of Nvidia’s longstanding non-openness of their hardware interfaces.

        That said, it looks neat, and I can’t see why anyone would have problems with it.

  6. Retro says:

    Friday is my favourite RPS day

  7. slerbal says:

    ShadowPlay is great – I’ve already been using it lots in Arma3 and Outlast and compared to FRAPS there is no noticeable FPS hit – it is a genuinely useful and helpful release for my needs :)

    • nrvsNRG says:

      Agreed. Shadowplay is awesome and the best thing from Nvidia I’ve seen in ages that I’m actually gonna use.

    • neolith says:

      I have to disagree.
      Shadowplay is an incredibly good idea, but its implementation is more than flawed.

      The Shadowplay interface is wired to the IMHO rather useless Geforce Experience panel and it offers almost no settings whatsoever. Sure, you can set things to max quality, but you still don’t know what this means for the h264 settings internally – so you are most likely bound to re-encode.

      The software is also not capable of rescaling. This again means re-encoding afterwards.

      Shadowplay doesn’t work outside of fullscreen games – not for windowed titles, not for fullscreen windows and not for standard desktop applications. It also doesn’t record all sounds your system outputs, so trying to comment your gameplay is a problem.

      The encoding leaves much to be desired. Videos made with Shadowplay have the same problems as some made with CUDA based encoders: there’s encoding errors every few seconds (in most cases the usual boxy elements of wrong colors) and dark colors (especially dark blue and dark brown) get pixelated way too much.
      Every software encoder produces better quality.

      While the running encoder does not slow down the framerate by much (at least on my 760 and on high quality settings), the produced videos still stutter noticeably. This is of course most easily seen in fast games like shooters, driving games and beat ’em ups. When I watched my videos frame by frame I found that the encoder sometimes puts the same rendered frame in consecutive frames of the video, increasing the stuttering even though the game remained smooth while playing.

      The recordings sometimes show errors that seem to be based on the 3d geometry in the game. I could reproduce these errors in Batman:AA the easiest: move the camera rather close to Batman’s back, make a sudden movement that makes the cape move and Shadowplay renders the whole cape pitch black for a single frame. This might be the game’s fault, but the error is not reproducible with other recording software.

      The 4GB limit on Win7 machines is disappointing.

      I know that the software is probably not finished, but in its current state it is nothing more than a fun gimmick with tech problems. :(

      • nrvsNRG says:

        Only tried it on PoE so far and that looked great, and sounded good, (including TeamSpeak).

      • droid says:

        Another thing not awesome about ShadowPlay is that manual mode turns itself off after ~10 minutes. I’m guessing this is related to a file size limit, but why doesn’t it immediately open a new file?

  8. IshtarGate says:

    G-Sync is actually old news at this point. It’s a shame because for all the good writing, I was expecting this article to expound more on how all these proprietary technologies are tearing PC apart right as it is picking up momentum like never before. A golden goose case if I ever saw one.

  9. BLACKOUT-MK2 says:

    Another beautiful graphics card that I can’t afford XD

  10. amateurviking says:

    Must say I was, until reading this, thinking about trading up the 560 Ti for something shinier. Think I will wait for the time being and see how things shake out. Not too much need for an upgrade as I’m still using a 1680×1050 monitor. This may change soon though.

    • nrvsNRG says:

      760s are well cheap right now (from Amazon anyway), and a lot of the 770 brands have dropped, but apparently not from EVGA (teh best). But yeah, absolutely no point in upgrading to any of that unless you have at least 1080p.

  11. lautalocos says:

    well, probably by next year, when a lot of the new generation games are coming, i will buy an nvidia VGA and a new processor. for now, my little i3 and my HD 7770 will be fine.

  12. ZeDestructor says:

    “very much favours Nvidia thanks to the latter’s generally perceived performance advantage in Linux” There isn’t much performance difference between AMD and nvidia… assuming you can make AMD’s drivers work reliably. Been there, done that, sticking to nvidia for the foreseeable future.

    “At this stage I’m not entirely clear if Titan lives on.” Titan has always been about the full FP64 compute. It will remain as the poor man’s Quadro K6000 (those go for 5k USD, or regional equivalents) or be mildly upgraded to having a fully-enabled core. Not much else though.

  13. Viroso says:

    I don’t think there’ll be SteamOS only games. Wouldn’t those at least be Linux only games? I don’t think anyone would ever do that, not even Valve. They have a ton of users who don’t have SteamOS, simply because it doesn’t exist yet, and then they’d release a super anticipated game and arbitrarily lock it to their own OS, which can be run on any machine, not just a Steam Machine? Doesn’t make sense.

    Also, all of this console port talk from AMD, I think we’ll see minimal improvements and probably not for all of their graphics cards. We’ll only be getting a GPU that’s kinda like the ones in a console, meanwhile everything else won’t match.

    But most important of all is that console ports have never been a problem to anyone gaming on PC. Most of us don’t have a problem running console games, maybe this console compatibility will only be worth it for people buying the weaker GPUs.

    • Baines says:

      Valve is supposedly working on getting better performance out of SteamOS than vanilla Linux, so even if a game runs on Linux, it could potentially run better on SteamOS. Of course Valve could always pull a Microsoft (since it looks more and more like that’s what Valve wants to become) and just flat out make something SteamOS exclusive even if it could run on something else.

      • Viroso says:

        What’d be the point though, getting people to use a free OS that exists for games? They already get their good share of users on Windows, and one of the important things with SteamOS is to make PC gaming more accessible and more open, which goes against cutting access to the platform everyone uses.

        • jrodman says:

          Weird indies who are Linux users full time and aren’t in games for a living might make linux-only games partly for steambox. I can’t see commercial game makers doing that though.

    • DanMan says:

      I could totally see HL3 being a timed exclusive to SteamOS just to drum up news coverage and nerd rage.

    • Ross Angus says:

      I thought SteamOS would install on any (modern) x86 architecture. And would be distributed free. Is it a SteamBox only thing? Have I misunderstood the internet?

    • killias2 says:

      Yeah, people who think HL3 or whatever will be a SteamOS exclusive are, IMO, entirely missing the point of SteamOS. SteamOS isn’t there to be a new market. It’s there to -expand- Steam. It’s not there to replace or undercut Steam. It is Steam. It’s just a new way to get Steam and a new niche for Steam to compete in.

      This isn’t like Windows vs. Xbox 360 or something, as people seem to think. It’s more like Windows on desktops vs. Windows on tablets. It’s an attempt to take Steam and expand its reach.

      • Viroso says:

        Yeah. I think a lot of people might be too excited with SteamOS and Steam Machine, like it’s going to be this major player that’s going to go face to face against Microsoft and Sony on all possible fronts.

    • ayprof says:

      My guess is that Valve will release it early for SteamOS/Linux, then wait a month or so to release the Windows port. People desperate to play the game will make it happen; those who wait will be “uncool” for a short while and resent it enough to give SteamOS a shot the next time something comes out for SteamOS early. At the very least, if Valve does it once, expect it to happen again.

      • Viroso says:

        I think it’s unlikely. For one, Valve will just get people installing SteamOS; they won’t make money from that. Games sell a lot during a small window of time, and there’s lots of marketing investment to drum up the hype and get a big release.

        Releasing on SteamOS doesn’t mean selling a Steam Machine, but it means fewer sales and angry users. People would get angry, no doubt. The release would only hit a small fraction of players, because I think most people would not want to install a new OS. They’d restrict their market, a lot, for the initial release, which is where the bulk of the sales happen. Of course the game would sell after it came to Windows, but then they’ll have lost all that marketing excitement, they’ll have gained angry users, and there’d probably be pirate copies floating about by the time it came to Windows.

        I think it’s unlikely not only for those reasons but because that kinda goes against what Valve usually does.

  14. whexican says:

    Don’t forget that the 780 GHz edition is a thing and should be coming out shortly to replace current 780 units with a 10% to 15% gain and hopefully another reduction in price.


  15. rock_paper_shotgun says:

    Poorly written. Are we talking about a software split? Ummmm… what can actually be done on the GPU has always been a big problem for game developers on PCs, even when using DirectX or OpenGL. The hardware is just too diverse and a constantly moving target. Are we talking about operating systems? Make up your mind :) The argument against Linux has always been that the user base is too small. Shifting to AMD-only development in Mantle reduces your potential customer pool to a small fraction of the existing base of all PC users, so the graphics had better be mind-shatteringly awesome. So awesome that some people fall into comas watching the footage. It seems unlikely until Mantle becomes something that lets you develop the same code across all consoles and PCs powered by AMD (which it isn’t right now).

    SteamOS is a long game. It is a fight that they might “win”… over the next decade. As another poster said, there might be a timed release, but there is no way that Valve would entirely exclude Windows users from the next “3” in Valve’s arsenal.

    Totally disagree with the gsync “deal breaker” perspective. When people see how good it looks, they will buy it once it is reasonably priced. There is no reason not to.

    • SuicideKing says:

      Pretty much agree with all three points you make.

      Mantle is undesirable (Glide 2.0), and it’s not used in consoles currently.

      SteamOS or Machines aren’t limited to Nvidia hardware, and iirc Valve and AMD both said that final hardware will have stuff from both GPU and CPU vendors. We’re free to install SteamOS on any PC anyway.

      The monitor-side chipset for G-sync isn’t that expensive ($100) and costs should come down quickly. Plus, it may be released as a DIY kit. And, unlike Mantle, this can actually be made open to other companies (or they could develop their own open standard).

      • jrodman says:

        Glide was pretty great though, because it came about before any other reasonable API was available. Sure, you could use OpenGL on an SGI system, but on a PC there was nothing viable, and Glide was good enough and effective enough and easy to get started with. It was in the right place at the right time.

        In the long term, Glide didn’t make sense. Other card vendors couldn’t build to it for a variety of reasons (legal, architectural, etc) and it eventually died completely with its owning company.

        NOW, there are already two entrenched, viable, maintained, cross-hardware apis. I can’t see how something vendor-specific can get anywhere. Maybe it could work if AMD becomes first-mover on the new trend in cross-hardware APIs, but that doesn’t seem to be the angle.

  16. Don Reba says:

    What if the next Half-Life title is SteamOS only? That’s what I’d do if I was Valve and wanted to encourage SteamOS adoption.

    Good thing you are not Valve. Is there any chance of you becoming Valve in future? If so, what can we do to prevent it?

  17. Jason Moyer says:

    I guess I’d have to see it in action, but G-Sync seems like an expensive/complicated solution to a problem that can be fixed by just using frame-limiting software with vsync disabled.

    • SuicideKing says:

      Um, no, that’ll still tear if the post-limiting frame rate isn’t a multiple or factor of the monitor’s refresh rate.
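The arithmetic behind that tearing objection can be sketched in a few lines of Python. This is a toy model for illustration only: the `tears` helper and its parameters are invented here, and real tearing also depends on where in the scanout a swap lands.

```python
def tears(limit_fps, refresh_hz=60, frames=100):
    """Toy model: without vsync, buffer swap i happens at i/limit_fps
    seconds; it avoids a visible tear only if that instant falls exactly
    on a refresh boundary, i.e. i * refresh_hz is a multiple of limit_fps."""
    return any((i * refresh_hz) % limit_fps != 0 for i in range(1, frames + 1))

print(tears(45))  # True  – a 45 fps cap on a 60 Hz panel still tears
print(tears(30))  # False – 30 fps lines up with every other refresh
```

By this reckoning, only caps that divide (or are multiples of) the refresh rate keep every swap on a refresh boundary, which is the point being made above.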

  18. Megakoresh says:

    I have a feeling this G-sync thing will eventually come down to the price of those monitors.

  19. Perjoss says:

    I know I’m not the only one that read that headline as ‘The Beefcurtain PC’

  20. alsoran says:

    Well, whatever they are doing, AMD or Nvidia, I can’t afford it anyway.

  21. SuicideKing says:

    AnandTech reported that the G-Sync controller for the monitor may be released as a DIY kit, and since it only replaces the “scaler” (don’t really know what that is, can only guess) in the monitor, it shouldn’t be too much of an issue.

    The kit will initially cost around $100 – a cheap alternative to getting a $200+ GPU for pumping out smooth frames.

    Anyway, G-Sync or an open alternative does stand a chance of being standardized across the industry, and that would be far more beneficial than Mantle or 4K.

  22. frymaster says:

    “if the next Half-Life title is SteamOS only? That’s what I’d do if I was Valve and wanted to encourage SteamOS adoption”

    You’ve got it backwards. They aren’t selling games as a way to drive SteamOS adoption; they’re pushing SteamOS as a way to sell more games, especially because they’re wary of MS possibly locking the platform down like they did with Metro apps.

    Furthermore, that’d also cut out all the non-SteamOS Linux users – and even if we change it to “Linux exclusive” rather than SteamOS exclusive, that’s not much better. If anything, Valve want to encourage developer adoption, which they can best do by releasing a cross-platform game and being very smug about how much easier the Linux-specific code was to write.

  23. Predatoro01 says:

    Sooo Nvidia and Linux is a thing now?
    Doesn’t sound like it here: http://www.ubergizmo.com/2012/06/linus-torvald-says-fuck-you-nvidia-for-not-supporting-linux/

  24. rsf says:

    Jeremy Laird: “[SteamOS/SteamBox] ..very much favours Nvidia thanks to the latter’s generally perceived performance advantage in Linux”

    Driver support is the main issue behind that generally perceived performance advantage. ATI’s current OpenGL drivers have a lot of issues. This is just down to lack of interest, which will change quickly.

    Valve has said previously : “We’ve been working with NVIDIA, AMD, and Intel to improve graphic driver performance on Linux.”

    NVIDIA and AMD appear to have increased their pace of work since the SteamOS announcement.

    It’s interesting that Valve managed to get better FPS in Left 4 Dead on Linux/OpenGL than on Windows/DirectX.

    Jeremy Laird: “OK, things like dual-booting PCs might mitigate some of this.”

    There are already Linux implementations on virtual machines, coLinux, which runs the Windows and Linux kernels side by side, Linux OSes that run off CDs with no installation required, and so on. The point is that if people find it useful, support for simple installations of SteamOS and other flavours of Linux on your normal Windows PC will be created from the Linux side of things. That’s because it’s open source :)

    Some interested parties might even throw a kickstarter to hire programmers to get things done faster, since it’s a trendy thing with games.

    I’m surprised Valve aren’t seeking crowdfunding to make SteamOS even more user friendly, or to provide user-friendly replacements for the various Windows programs people might miss. Valve should aim a crowdfunding drive at game companies and the smaller hardware companies at a minimum – those have a clear interest in a Linux alternative that frees them from Microsoft’s non-benevolent antics.

    A couple of hundred million would be a tiny sum for the gaming industry as a whole, but it would go a long way towards making SteamOS and useful apps user friendly. If RPS interviews a Valve dev, this is something that could be suggested.

    Interesting points:
    1. Features developed for SteamOS will be ported to, and benefit, other flavours of Linux, since everything is open source. SteamOS will have a positive impact no matter what happens.

    2. SteamOS-compatible games will eventually be playable on other Linux flavours – if they aren’t from the outset.

    3. Going by Valve’s open-sourcing of the CAD docs for the Steam Box, Valve will have no objection to games working with other Linux flavours, and will actively support this.

  25. Brothabear says:

    When you said “niche” for Nvidia’s game-streaming technology, you weren’t fucking around… The damn thing was built from the ground up purely because of the Nvidia Shield they keep throwing in our faces…

  26. Hahaha says:

    “AMD’s mighty Radeon R9 290X”


  27. Trespasser in the Stereo Field says:

    Sigh…I have no money for any of this new crap. Just as well then that I spend 80% of my time exploring games I missed 20 years ago on GOG. Oh Nameless One, you shall find True Death while I wait for insane video card prices to drop!

    Edit: the comment in the article about buying a new OS to play a new Half Life game made me spit my coffee out onto my keyboard. Very funny, that!

  28. HisDivineOrder says:

    Like the rest of the intarwebz, you’re confused about G-Sync’s advantages.

    Having lots of frames above the vsync level only creates input lag. That’s why lots of people don’t run vsync at all. G-Sync removes the input lag caused by vsync, without any of the problems frame limiting seems to run into on a regular basis (mostly stutter).

    Right now, if you have an uber-powerful rig, you choose between running vsync and having input lag (and possibly stutter, even if you don’t drop below 60 fps), or going without vsync, putting up with tearing, and watching a lot of performance get wasted rendering frames you’ll never see – performance that could instead go towards more particle effects, etc.

    G-Sync makes it possible to make 60 or even 30 fps feel smooth, allowing games to target lower frame rates while pushing features and/or resolution much higher, instead of having to chase ever more fps just to approach some measure of smoothness.

    G-Sync gives you the smoothness without those concerns. It’s the thing we’ve needed on the PC forever. The only problem I see is that it’s proprietary, and I really, really wish it wasn’t. It should have been part of the DisplayPort spec from day one. DisplayPort would have been such a great place to add this for all displays.

    As a bonus, it would have been a compelling reason to choose DisplayPort over HDMI, and it would have given us the superior DisplayPort spec for all our HDTVs instead of HDMI, the crappiest version of DVI ever made.
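The vsync-versus-variable-refresh latency difference described in that comment can be sketched as a toy model in a few lines of Python. Illustrative only: the `display_time` helper is invented for this example, and real display pipelines add buffering and scanout time on top.

```python
import math

def display_time(render_done, refresh_hz=60, gsync=False):
    """Toy model of when a frame finished at `render_done` seconds reaches
    the screen: with a variable-refresh display (G-Sync) the panel refreshes
    on demand, so the frame goes out immediately; with vsync the frame waits
    for the next fixed refresh boundary, and that wait is the extra input lag."""
    if gsync:
        return render_done
    refresh = 1.0 / refresh_hz
    return math.ceil(render_done / refresh) * refresh

frame = 0.020  # a frame that finished rendering at the 20 ms mark
print(display_time(frame, gsync=True))  # shown at 20 ms
print(display_time(frame))              # vsync holds it until ~33.3 ms
```

The same model shows why targeting 30 or 45 fps feels fine with variable refresh: every frame is shown the moment it is ready, rather than waiting up to a full refresh interval.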
