PhysX-on-a-GeForce: Next Week

By Alec Meer on August 8th, 2008 at 9:46 am.

You may remember Kieron doing science a few weeks back about NVIDIA’s CUDA system – clever trickery that allows a GPU to perform processing feats other than pixel-pushing. There are plenty of real-world algorithm-crunching applications for it, but of most interest to gamers is that it can make your GeForce 8, 9 or 200-series card behave like a PhysX board. NVIDIA bought out PhysX makers Ageia a while back, and we’re soon to see the fruits of such money-labours.
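
For the technically curious, CUDA code is essentially C that the card runs across thousands of threads at once. A toy sketch of our own devising – emphatically not NVIDIA’s actual PhysX code – might look something like this:

    // Toy CUDA kernel: one GPU thread nudges one particle along, and
    // thousands of threads run it simultaneously. Physics, not pixels.
    __global__ void integrate(float3 *pos, const float3 *vel,
                              float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's particle
        if (i < n) {
            pos[i].x += vel[i].x * dt;  // simple Euler step
            pos[i].y += vel[i].y * dt;
            pos[i].z += vel[i].z * dt;
        }
    }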

The big question is to what extent simulating cratesplosion will slow down the graphics rendering. We’ll get to find out next week, with the release of the GeForce Experience Pack.

Blues has word on the contents, which is mostly old PhysX stuff but retuned for CUDA:

Warmonger–full free game! Destroy walls, floors, and whole buildings to open up new paths or close existing ones. Destructive power is more than eye candy here–it’s a tactical weapon in this ground-breaking action game.

Unreal Tournament 3 PhysX Mod Pack–includes three maps with amazing effects that fundamentally change the gameplay (requires commercial version of Unreal Tournament 3)

A sneak peek at the upcoming Nurien social networking service, based on the Unreal Engine 3

A sneak peek at the upcoming game Metal Knight Zero

All new NVIDIA “The Great Kulu” tech demo that showcases the use of PhysX soft bodies in a real game play environment

All new NVIDIA “Fluid” tech demo–a simulation of realistic fluid effects with a variety of liquids

In conjunction with the release of the GeForce Experience Pack, we will also be releasing new WHQL-certified drivers that enable PhysX acceleration for all GeForce 8, 9, and GTX 200 Series GPUs. This new driver also adds support for PhysX-accelerated features in the commercially available game, Ghost Recon Advanced Warfighter 2.

Tellingly, no CellFactor – the free, not-very-good FPS originally used to promote PhysX. It’s been withdrawn from download for a while now, though the official site’s still running, and has talk of a forthcoming CellFactor: Ignition. Apparently that’s a full-blown, Unreal Engine 3-based retail FPS for PC and console, but perhaps it’ll be CUDA-friendly.

You can actually already grab the UT3 maps from here, but it’s the upcoming v177.39 Forceware driver that makes the magic happen if you don’t have a PhysX board.

I’m in two minds about CUDA PhysX. If it works well, it’s unquestionably taking the once-niche PhysX to a dramatically larger audience, which could mean more is made of it than the iffy frills in b-list games it amounted to in its first incarnation. It does, however, still seem hugely unlikely that decent games will be built around hardware PhysX support. Developers don’t want to leave anyone who doesn’t own a GeForce 8, 9 or 200 out in the cold, after all. Which may mean PhysX remains a gimmicky luxury, as before. Then again, having NVIDIA’s weight behind PhysX could make all the difference.


27 Comments

  1. heliocentric says:

    It feels a little like bad timing. I was all aboard the Nvidia train until benches of the new ATI cards were in the wild.

    If Nvidia open up the tech to ATI, this could be a heavy mark against the Intel future and ensure that graphics cards remain relevant. But I don’t see that happening.

  2. Kanakotka says:

    The curious thing is, why only the 8 or 9 series? In the leaked beta drivers for this, my GeForce 7900 GTX works just fine… It feels like we’re being screwed over here.

  3. Dexton says:

    It is my understanding that Nvidia have opened up the tech for ATI; there is a geek somewhere who has managed to get it working with some ATI cards and has been offered full support by Nvidia and ATI to implement PhysX on the rest of ATI’s hardware. The Intel competition is probably an important factor in Nvidia’s cooperation.

  4. subedii says:

    Maybe in a graphics card generation or two. Right now the games that can really make use of the physics enhancements (like Crysis) still tend to be pretty heavy on the graphics processing to begin with.

    I think that at the moment developers are beginning to cool off on pushing the graphical envelope for a while (this is helped by the fact that we’ve pretty much reached the level of graphical capability seen in the current generation consoles, and devs aren’t really likely to push it much further until the next console generation).

    I think it’ll be a while before we really start seeing games take advantage of this. But it’ll be awesome when devs can finally start to properly take advantage of it. Fully deformable environments lend themselves to a lot of creative possibilities. Then there are subtler things like particle effects and simulations.

    I remember listening to a recent GFW podcast about Stalker: Clear Sky, and how you initially had to spot some anomalies by watching the movement of fog in and around them. I thought that was brilliant, but in the end they still have to keep the “anomaly indicator” beeping in the background and the screen washout effects, since not everyone’s going to have the capability to see it.

    So yeah, it’s good that PhysX stuff is becoming more prominent, but I’m guessing we won’t see anything really making use of it just yet.

  5. heliocentric says:

    One awesomeness of this is that a multi-card setup can have an old GPU as a dedicated physics card. So multi-carding without SLI driver nonsense.

  6. redrain85 says:

    The only way to truly gain improved physics support without a framerate drop is to include the PhysX PPU chip on every new video card. That would mean a real slam dunk in performance. They own the technology now, so why not use it?

    I know it would add to the cost and complexity of nVidia’s cards. But when using the GPU to do physics calculations, the performance hit will only be unnoticeable in games that aren’t demanding. And since when does a game that wants full-blown physics not also want full-blown eye candy?

  7. James G says:

    I was reading an interesting article recently that described how many labs etc. are now using graphics cards for their computations, rather than CPUs. Apparently the graphics architecture is better designed for performing lots of simultaneous but near identical calculations, and with the latest DX10 architecture these functions are more programmable. This makes them ideal for some of the data processing tasks which are made up of lots of repetitive, but time consuming, calculations.

    Well I found it interesting.
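
    To give a flavour, the sort of repetitive calculation those labs offload looks roughly like this in CUDA – a toy sketch of my own, not anyone’s real code:

        // Toy CUDA kernel: the same tiny calculation applied to every
        // element of a big data set, one GPU thread per element.
        __global__ void scale_and_offset(float *data, float gain,
                                         float offset, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique element index
            if (i < n)
                data[i] = data[i] * gain + offset;  // identical work per item
        }

        // Host side: cover n elements with 256-thread blocks, e.g.
        // scale_and_offset<<<(n + 255) / 256, 256>>>(d_data, 2.0f, 1.0f, n);

    A CPU chews through those one (or a few) at a time; a GeForce 8 runs thousands of such threads at once, which is the whole trick.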

  8. runab0ut says:

    The question is… future game support. With Havok powering the physics in most triple-A titles, I don’t see much benefit in having the PhysX-enabled driver on my system.

  9. Ben Abraham says:

    @James G

    That IS interesting!

  10. Nallen says:

    I’m confused.

    Do I need a new card or is this adding the PhysX stuff to my existing 8800GTX?

    If so I presume I’ll take a kick to the FPS ballbag, and I can’t see me enjoying that!

  11. Monkfish says:

    @Nallen:

    It’ll work on your existing 8800GTX.

    Also, performance-wise, the size of the hit depends on whether the game uses PhysX extensively or not. In games where lots of PhysX effects are used, GPU acceleration noticeably increases the framerate.

    Take a look at this article at FiringSquad – particularly the benchmarks for the PhysX-laden Unreal Tournament 3 levels. You’ll see that an 8800GTS runs at only 5 fps without Nvidia GPU PhysX, and 20 fps with it enabled.

  12. Vanderdecken says:

    So on my existing 8800GTS, a variable amount of the GPU time will be devoted to physics rather than graphics? Does it just lump it all together or allocate a certain amount of power? What kind of performance hit would it make, if any? Does the benefit of accelerating the physics away from the CPU outweigh the deficit of taking available (graphics) processing power away from the GPU?

  13. Sucram says:

    If there is enough PhysX physics (eugghh) in a game to make it heavily CPU bound then shifting the physics load to the GPU will significantly improve performance.

    If you were GPU bound anyway, then clearly having the GPU do another task is not going to improve matters. Don’t know if NVIDIA’s drivers are smart enough to switch between CPU and GPU physics depending on respective load.

    NGOHQ claimed they were porting PhysX support (meaning CUDA??) to ATi cards and NVIDIA said they support this. There isn’t any hard evidence for this though, so I remain sceptical of this happening in the near future. AMD have made a deal with Intel over Havok physics but have been rather quiet about it.

  14. Monkfish says:

    @Vanderdecken:

    The benefit of accelerating physics certainly can outweigh the cost to graphics processing. A good example would be in Warmonger, which is practically unplayable with GPU PhysX turned off. Run it with GPU PhysX switched on and it becomes an entirely different beast – smooth, playable with shedloads of particles being thrown around. It’s a pity the game itself wasn’t a little better, but, hey, it’s free.

    I’ve seen it for myself on my 8800GTS, as I’ve been playing around with the latest official beta Nvidia drivers (177.79) along with the leaked PhysX 8.07.18 driver. I thought it would all be a gimmick, but I’m actually impressed by what Nvidia has achieved so far.

  15. Tom says:

    As Monkfish said: I’ve been playing with the betas and it’s nothing but a performance improvement.

    You can get the VERY latest Forceware drivers (177.79) and just google PhysX 8.07.18; it’s all good fun.

    List of supported games here: http://www.nzone.com/object/nzone_physxgames_home.html
    Interestingly, Mass Effect’s in there, although it doesn’t appear to make the blindest bit of difference.

    I think the reason Cellfactor’s not mentioned is because it isn’t 100% operable on all OSes.
    This will prolly become widely known once the PhysX pack is released.
    I know it doesn’t work for shit on Vista x64, for example. Errors on launch. Asks for .NET 2.0, which is built into Vista, so I don’t know if it’s specifically an x64 problem or Vista in general. I have played Cellfactor though, on Vista x64, so I’m thinking this is a post-SP1 problem. Installing .NET 3.5 (which includes SPs for 2.0 and 3.0) doesn’t help at all.
    Other OS users have said it works fine.

    Looking forward to STALKER CS! To get some idea of how STALKER’s smoke effects will look, check out Warmonger – it has some really quite nice smoke effects and is actually kinda fun now that it’s not running like a heap of stoned sloth.

  16. Gap Gen says:

    Yeah, I heard talks by a couple of people doing stellar cluster research using GPUs for gravity calculations. I thought that you needed double precision for that sort of thing (which graphics cards are traditionally bad at) but apparently you can get away with single precision for a lot of things.
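
    For the curious, the inner loop of that sort of gravity calculation looks roughly like this in single precision – a toy sketch of my own, not anything from those talks:

        // Toy CUDA kernel: brute-force gravitational acceleration on body i
        // from all n bodies, in single precision (all a GeForce 8 or 9 can
        // do natively).
        __global__ void pairwise_accel(const float3 *pos, float3 *acc,
                                       int n, float eps2)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            float ax = 0.0f, ay = 0.0f, az = 0.0f;
            for (int j = 0; j < n; ++j) {  // O(n^2) brute force
                float dx = pos[j].x - pos[i].x;
                float dy = pos[j].y - pos[i].y;
                float dz = pos[j].z - pos[i].z;
                // Softening term eps2 keeps r away from zero; float only
                // carries ~7 significant digits, which turns out to be enough.
                float inv_r  = rsqrtf(dx*dx + dy*dy + dz*dz + eps2);
                float inv_r3 = inv_r * inv_r * inv_r;
                ax += dx * inv_r3;  // G and masses folded out for simplicity
                ay += dy * inv_r3;
                az += dz * inv_r3;
            }
            acc[i] = make_float3(ax, ay, az);
        }

    The new GTX 200 chips can do doubles in hardware too, just a lot more slowly, so people stick to float where they can.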

  17. Tom says:

    Why is it I can never get tags correct?! Rather annoying. Should I insert tags as I type, or after the main body of text is written?

  18. Chemix says:

    I’ve seen one of these 200 series cards in Best Buy from BFG for $350, sweet looking deal, too bad I couldn’t bring myself to spend the cash, though it’s partly because I don’t know if it would cooperate with my power supply.

  19. Vanderdecken says:

    @Chemix: drop along to the PC Gamer (UK, natch, http://www.computerandvideogames.com/sites/pcgamer/) forums, hit up the tech folder and we’ll see if a little advice is forthcoming.

    @Monkfish: ta for the info, I’ll be following this story with great interest as a result. Might be time for my 8800GTS to get stressed out for the first time since the Crysis Demo.

  20. feffrey says:

    I hope 64-bit drivers come out at the same time too. I hate waiting longer for drivers.

  21. KC says:

    The 32-bit PhysX drivers work fine on 64-bit Vista. The Nvidia drivers in question are actually downloadable through their beta page, and yes, they have 64-bit versions.

    It’s CellFactor that crashes with the .NET 2.0 error; it’s not a .NET problem, it’s actually something else.

  22. KindredPhantom says:

    Sounds like it’ll be interesting.

  23. MaxMcG says:

    I thought ATI were working on their own version of it. I seem to remember something about the new cards being able to run it, once the drivers were done.

    Maybe it’s just in my head.

  24. Rook says:

    ATi/AMD and physics is all very confusing. ATi were working with Havok to use GPU physics, but that kinda died a quiet death and Intel bought out Havok. Now AMD (having bought ATi) is working with Havok on CPU physics.

    I think there’s a group who are trying to get PhysX running on ATi GPUs that now have some semi-official backing from ATi though.

  25. DSX says:

    My 7900 GTX runs games superbly… I’m smelling Nvidia taking a page from MS’s DX10 shenanigans with Vista to sell more cards.

  26. malkav11 says:

    I’m kind of annoyed that the cutoff is the next generation of cards past what I actually have, but I’m definitely excited about the potential of this. The thing is, they’re supporting three generations of card already with this, so I wouldn’t expect it to take more than another generation or so for games to start making serious use of the PhysX capabilities. And I wouldn’t be surprised to see them as a luxury item in more games starting now – after all, a significant proportion of gamers can’t do DX10, but DX10 support is still included in many games coming out now. And more people will be able to do PhysX with this out than can do DX10.

  27. Jonathan says:

    I’m excited, if only because I’ll finally get to play the PhysX Labs in Crazy Machines. Crazy slowdown in that if you’re not careful.