I’ve got an Ageia PhysX card sat around somewhere, a piece of hardware about which I wasn’t entirely complimentary a while back. I don’t use it because a) there’s yet to be a PhysX-enabled game which I’ve wanted to play for any reasonable length of time and b) it’s one more furiously spinning fan in a PC that I can already hear humming like the wrath of Skynet, even when I’m on the next floor of the house.

But maybe, I thought, Unreal Tournament 3 would be the game to change all that. Eyecandy junkie that I am, I was quite looking forward to testing the explodability of the PhysX-only bonus maps. Tech know-all types Bit-Tech have kind of talked me out of it before I’ve even got a copy of the game, with one page of their typically graphics-centric review of UT3 putting the boot particularly hard into the PhysX stuff. The screenshots sure look good, but it sounds very much as though my feelings about earlier games with PhysX support tending to end up looking like an explosion in a polystyrene factory aren’t going to change any time soon. Worse still, it seems the super-duper maps run at under 10 fps. That’s an ultra-whoops for Ageia, surely. Oh well. I may still be able to summon the energy to remove my PhysX card from its dusty anti-static bag once I’ve got hold of UT3, but don’t bank on it.

So, what now for this increasingly beleaguered hardware experiment? Can it possibly recover from even its most high-profile implementation turning out to be a little bit tragic? And does anyone else reading this have a PhysX board?


  1. The_B says:

    Make your own see-saw out of a brick and a plank of wood. Take your PhysX board and put it at one end of the see-saw. Then, from some sort of slightly higher platform, jump onto the other end of the see-saw, making sure you shout “PHYSSSSIIIICCCCSSS!” as you do so.

    Hours of fun.

  2. Thiefsie says:

    It should have died on the drawing board. Multi cores are the way to go…. Anyone who actually bought one of these deserves all the laughter they get. How many games actually ACTUALLY use it? 3? GRAW, UT3…. and that free one they are making? Hmmm

    multi gpu? (expensive ouch)

  3. Mike says:

    We gave this a whirl in the office. The levels are reasonably impressive, particularly the one with a tornado ripping through it. There were no performance issues on Dave’s rig.

    It remains a novelty, but if more developers offer worthwhile PhysX content (which I think these maps are) and the price of the hardware continues to drop like a stone, Ageia could just hang on.

  4. mrkstphnsn says:

    I bought one to play GRAW2 with and I don’t really think about it much, to be honest. The game was average and I’m not going to buy UT3, as twitch FPSes don’t really do anything for me anymore. I really don’t see them taking off whilst all they do is make stuff explode into matchsticks. Sure, it was nice the first time, but after that … it’s just eye candy.

  5. Mario Granger says:

    The problem with PhysX is that it doesn’t seem to have any real utility.

    GPUs accelerate 3D graphics, without which you would have to rely on a software renderer (which looks dreadful, or if it doesn’t, it’s eating up quite a few CPU cycles). There is an obvious need for a GPU, and you will notice a significant difference in overall game quality from the lack of one.

    Whereas a world without PhysX… is the one we are in now. I certainly never feel the need to call out for physics acceleration when I’m playing Crysis or TF2, because those games have great-feeling physics models that ALREADY run very smoothly.

    Ageia should have just come up with a very powerful chipset and made some inroads into getting it onto nVidia’s and ATI’s boards.

  6. Mario Granger says:

    Thinking on it further, they are kind of in a hole they will never get out of. Because say there IS a killer app for PhysX one of these days. I still wouldn’t consider getting one, because if I’ve played so many games with very good physics models already, why should I believe that all of a sudden Ageia’s tech is necessary to facilitate the same thing?

  7. roryok says:

    I really liked the idea Havok and ATI/AMD were throwing around a while back. Basically, buy a new graphics card, take your old one, stick it in another PCI Express slot and let it handle all the physics!

    I haven’t heard anything of it for a while, but if they could get that going it would be awesome. A lot of games use Havok, and this way it even encourages recycling!

  8. Rhygadon says:

    Another data point: City of Villains uses PhysX code for destructible objects in the (gloriously destructibility-oriented) “Mayhem” missions. If you have the card, the game will use it; if not, it’s all done in software. There’s an in-game slider for physics quality, and the highest setting is labeled “recommended for PhysX PPU only,” but as long as you have a reasonably fast CPU you can max it without seeing much of a performance hit.
    The interesting thing is that the game appears to use a “baked-in” version of Ageia’s software physics, which can’t be (or at least, isn’t) updated as Ageia improves their code. But if you go to Ageia’s website and install their latest driver, even if you don’t have a PhysX card, it turns out that the game will automatically hand off the physics to that external driver. This results in substantially increased performance on the “high physics” setting.
    On the CoV forums, players who did have the card reported no unique visual benefits and only marginal performance gains. This suggests that — at least for the simple flying-debris and floating-leaves effects used in that game — Ageia’s most significant contribution is their code, not their hardware.

  9. malkav11 says:

    It’s not the most terrible idea in the world. Physics is, I think, the new technology that is changing (or will change) how games play the most, and really elaborate physics (not the crate-pushing and floppy corpses we’ve got at the moment) will surely require all kinds of additional processing power. Handing that off to a dedicated device makes sense.

    On the other hand, it doesn’t seem like there’s really any purpose for it right at the moment; it’s priced way too high for something that actually winds up decreasing performance more than increasing it, and it only does anything at all in a few games.

  10. Mo says:

    (disclaimer: please note that while I’m pretty sure I know what I’m on about, I could in fact be full of shit)

    I’ve ranted about this far too many times, but I guess there’s no harm in doing it once more …

    Physics cards are absolutely useless.

    Why? To your everyday gamer, they seem like a great idea … it’ll do to physics what graphics accelerators did to graphics, right? Wrong. There’s a fundamental flaw with the physics card: physics isn’t a parallel process.

    The way graphics accelerators work is as follows: a game updates its logic. It then tells the graphics card, “RENDER STUFF!” And away the gfx card goes. While the game is being rendered on the GPU, the CPU is free to go off and start updating the logic again. Graphics and logic are parallel processes. That is, one does not depend on the other.

    Physics, on the other hand, is *COMPLETELY* dependent on logic and vice versa. When the logic says something like …
    if ( gordon hits explosive barrel ) then kill gordon
    … it needs to wait for the physics card to finish updating before it can find out the result. Thus any speed gains are given up.

    Notice that the only real use you see of the PhysX card is with debris and special FX … you know, stuff the logic of the game doesn’t really care about. That can definitely be made into a parallel process, but anything meaningful to gameplay? Not likely.
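    That serial dependency can be sketched in a few lines of Python (names and numbers invented purely for illustration; this is a stand-in for the barrel example above, not real engine code):

    ```python
    # Toy version of the barrel example: gameplay logic can't branch on
    # "did the barrel hit Gordon?" until the physics step has returned,
    # so the two stages run back to back rather than in parallel.

    def physics_step(world):
        # collision resolution: did the barrel reach Gordon this tick?
        world["gordon_hit"] = world["barrel_velocity"] > 10.0
        return world

    def logic_step(world):
        # this branch is the serial dependency: it needs the physics
        # result before it can decide anything
        if world["gordon_hit"]:
            world["gordon_alive"] = False
        return world

    world = {"barrel_velocity": 12.0, "gordon_hit": False, "gordon_alive": True}
    world = logic_step(physics_step(world))  # physics must finish first
    ```

    Offloading `physics_step` to a separate card doesn’t help here, because `logic_step` still has to sit and wait for its answer.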

  11. Lu-Tze says:

    @Mo: I disagree that it can’t be a parallel process; however, it depends on the speed of feedback you require.

    You can, potentially, run a game like so…

    Game Loop #1
    Game Loop #2 // Render Results Game Loop #1 // PhysX Results From Conditions At End of #1
    Game Loop #3 // Render Results Game Loop #2 // PhysX Results From Conditions At End of Game Loop #2

    …and so on. However, you can’t respond to the PhysX results with your game loop until one frame after, so there is an inherent delay. It’s far from perfect (and certainly I’d hate working on a game that ran like that) but it is possible.

    I suspect that the PhysX card could make that single serial process run faster though, the PPU using the CPU as additional threads and solving the stuff it can do faster itself… but any more so than if it ran entirely in software and you spent the money on another dual core? That’s for someone with more hardware and time than me to investigate :P
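    A minimal sketch of that pipelined loop (single-threaded Python standing in for the CPU/PPU overlap; all names invented): each tick’s logic only sees the physics results computed from the previous tick, which is exactly the one-frame delay described above.

    ```python
    # Each iteration "kicks off" physics for the current tick and
    # consumes the results that were kicked off last tick.

    def simulate(ticks):
        frames = []
        pending = None  # physics results still "in flight" on the PPU
        for tick in range(1, ticks + 1):
            visible = pending                    # results from tick - 1 (None on tick 1)
            frames.append((tick, visible))
            pending = f"physics of tick {tick}"  # runs alongside rendering
        return frames

    frames = simulate(3)
    # frames[2] is (3, "physics of tick 2"): tick 3 reacts to tick 2's physics
    ```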

  12. Lu-Tze says:

    Oh, also: that assumes that you actually want to interact with physics in a meaningful way. Physics can, however, be used in a parallel process.

    In a hypothetical example, let’s create a game from the movie Twister. The Twister itself needs to be able to tear the world apart into minute parts and send them flying through the air and into the path of your car. Small incidental things (fence posts, footballs, bricks) just bounce off your car or skitter away from the wheels; they serve no game purpose, only acting as aesthetic debris. Should the Twister catch up with you, then you see your car get whisked off into it.

    None of this impacts the game flow in any way; the renderer runs off the results of the PhysX, and the car is just a big block being pushed around by the game telling PhysX to shift it a bit in that direction.

  13. Mo says:

    You can, potentially, run a game like so…

    Game Loop #1
    Game Loop #2 // Render Results Game Loop #1 // PhysX Results From Conditions At End of #1
    Game Loop #3 // Render Results Game Loop #2 // PhysX Results From Conditions At End of Game Loop #2

    True, but remember that the player can possibly change the state of the world within that one tick. Consider the following scenario:

    a box is pushed towards the player in tick 1
    the player moves towards the box in tick 2
    the player is in the box in tick 3

    The third tick is all wrong because of the lag in physics.
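    A sketch of that interpenetration bug (1D positions, every number invented): with one tick of lag, the player’s movement is checked against the box’s stale position, and by the third tick the two overlap.

    ```python
    # Tick-by-tick walkthrough of the stale-physics scenario.
    box, player = 10.0, 0.0

    # Tick 1: physics starts pushing the box toward the player;
    # the result won't be visible to game logic until next tick.
    box_next = box - 6.0

    # Tick 2: logic still sees the box at 10.0, so moving to 5.0 looks safe.
    stale_box = box
    if stale_box > 5.0:
        player = 5.0
    box = box_next  # the lagged physics result finally lands: box at 4.0

    # Tick 3: player (5.0) and box (4.0, width ~2.0) now interpenetrate.
    overlap = abs(player - box) < 2.0
    ```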

    I suspect that the PhysX card could make that single Serial process run faster though

    I doubt it. Apart from the overhead of transferring data over the bus AND back (gfx cards only transfer to, not back), the physics card isn’t doing anything a CPU couldn’t do just as well … and the CPU has the added benefit of direct access to RAM (or at least, faster access).

    Your last point (about the Twister game) is spot on, though. I mentioned this in my original post … the PhysX card works great for debris and special FX, but I imagine it would be a bit useless for “proper” gameplay-changing physics.

    (apologies in advance for boring the crap out of everyone!)

  14. Marti says:

    Hey Alec

    Can you check if your card was on? You should be getting an average of 30-35 FPS on these maps. Sometimes if you don’t check the card ON in your profile, it will be turned off when you get into the level. Pain in the ass, I know, but that’s just the way it is set up. It happened to me here in the office as well. I’d really love it if you could look at the levels again, if you please. We took great care in developing these maps, and I’m proud of the work we’ve put into them… I truly want to make sure you get the right experience with the hardware on.

    –Marti – AGEIA PR Gal

  15. Alec Meer says:

    Thanks Marti, but I still haven’t picked up UT3 myself – this was purely a link to Bit-tech’s report.
