Music Is My Hot PhysX

By Kieron Gillen on June 20th, 2008 at 2:38 pm.

[Image caption: This is from Natural Motion, which is a PhysX motion-thingy]
And because I like my shout-out to lovefoxx strap for this piece so much, I’m going to re-use it here. The most influential man in British videogames journalism, Eurogamer’s Tom Bramwell, showed the sheer extent of his influentialitude by making me trot off and actually cover a hardware event. Me! In this case, NVIDIA talking about their near-future plans involving – basically – allowing you to turn your 3D card into a PhysX card via software and even repurposing your old 3D card as a PhysX one when you upgrade. Apparently. They could have told me computers would be able to produce milk and I’d have been equally credulous. Go read here. There are jokes and local colour.


36 Comments

  1. Rook says:

    AMD recently announced they were partnering with the now Intel-owned Havok: http://www.custompc.co.uk/news/602763/amd-announces-physics-partnership-with-havok.html
    So it’ll be interesting to see what comes out on top.

  2. leth says:

    Just so you know, these drivers are already leaked :)

    http://forums.guru3d.com/showthread.php?t=264880

    Seems that with a modded .inf file, the 9xxx cards all support PhysX now.

    Can’t wait to try Mass Effect with this new driver :)

  3. Al3xand3r says:

    Or let’s all finally abandon the physics processing unit mindset, since developers are starting to have plenty of CPU cores to utilise for such purposes and shouldn’t need additional hardware anyway. It’s already too much for our pockets, ya know; let’s not give this type of thing any more popularity, even if in this particular case it could save money…

    Leth, do you really have two of those cards, so you can sacrifice one for physics-only purposes?

  4. Ross says:

    Is this the NVIDIA event that Tricia Helfer was speaking at? Sorry to reinforce the stereotype, but if so, that would be reason enough to cover a hardware event.

  5. The Sombrero Kid says:

    @Al3xand3r the idea is that you use some of its power, not the lot. A PhysX card is nowhere near as powerful as a 9800 GX2: the GX2 has 1.5 thousand million transistors, a PhysX card has 100 million, so you wouldn’t need a whole 9800 GX2 to do the job of a PhysX card, just a wee bit of it.

  6. Taxman says:

    Meh, nothing will happen with GPU physics. All it’s good for right now is tech demos.

    Games makers target the consoles first these days and aren’t going to spend money on PC GPU physics unless it can be done for nothing.

    Not everyone has a second PCI-e slot spare or will want to trade GPU performance for physics.

    The kind of games they demo, like Backbreaker, aren’t even popular on the PC.

    CUDA is tied to Nvidia hardware, so only a small selection of developers will ever target it. The cross-company GP-GPU effort is centred on OpenCL, which all the big names are signed up to, so until that is set in stone don’t expect to see much of anything. Nvidia is of course doing its level best to muddy the waters with PR so as to build a mindset that GP-GPU = CUDA.

  7. Kieron Gillen says:

    Ross: It wasn’t, alas. Though oddly I was at a convention with a load of Battlestar people recently.

    KG

  8. Kevlmess says:

    @Al3xand3r:
    The traditional CPU isn’t the best tool for complex vector calculations such as physics. GPUs, on the other hand, are made especially for that stuff, meaning that they can deliver a lot more physics for the buck/pound/euro.

    The funny thing is that the early GPUs were actually general purpose DSP chips harnessed to do 3D math. And now they’ve got this whole new idea that GPUs could be used for general purpose DSP, too! The mind boggles.
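
    For anyone wondering what that looks like in practice, here’s a minimal, hypothetical sketch of the sort of embarrassingly parallel work being described: a made-up CUDA kernel (the struct and kernel names are invented for illustration; this is not PhysX’s actual API) that advances thousands of particles, one GPU thread each.

    ```cuda
    // Hypothetical illustration only: a trivially data-parallel physics step.
    // One GPU thread advances one particle; "Particle" and "step_particles"
    // are made-up names for this sketch, not part of PhysX or the CUDA runtime.
    #include <cuda_runtime.h>
    #include <cstdio>

    struct Particle {
        float3 pos;
        float3 vel;
    };

    __global__ void step_particles(Particle* p, int n, float dt, float3 gravity)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        // Simple explicit Euler integration: the same few maths operations
        // applied independently to every particle, which is exactly the shape
        // of work a GPU's many stream processors are built for.
        p[i].vel.x += gravity.x * dt;
        p[i].vel.y += gravity.y * dt;
        p[i].vel.z += gravity.z * dt;
        p[i].pos.x += p[i].vel.x * dt;
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }

    int main()
    {
        const int n = 10000;                        // "a cutting-edge PC: 10,000 objects"
        Particle* d_p;
        cudaMalloc((void**)&d_p, n * sizeof(Particle));
        cudaMemset(d_p, 0, n * sizeof(Particle));

        float3 g = make_float3(0.0f, -9.81f, 0.0f);
        int threads = 256;
        int blocks = (n + threads - 1) / threads;   // enough blocks to cover all particles
        step_particles<<<blocks, threads>>>(d_p, n, 1.0f / 60.0f, g);
        cudaDeviceSynchronize();

        printf("stepped %d particles once\n", n);
        cudaFree(d_p);
        return 0;
    }
    ```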

  9. Kieron Gillen says:

    Taxman: I didn’t really go into this that much, but this tech works with anyone running on NVidia – so people can use this shit on 360. In fact, the main thing they were demoing physics stuff with – the Natural Motion stuff – is tech which debuted in GTA4. You can already do GPU-physics on a console. And, apparently, people are.

    (Which is useful for the PC, as it’s a case of scaling up the physics interaction – a 360 could handle 1,000 objects, while a cutting-edge PC could handle 10,000.)

    Of course, I’m just quoting what they’re saying. I wouldn’t dream of suggesting it’ll play out like that.

    KG

  10. Meat Circus says:

    NVIDIA and ATI are really starting to sell the GPGPU thing, and it’s mainly because they’re living in fear of the coming of LARRABEE.

    As well they might, frankly. Intel will eat Nvidia alive if they can’t get a decent GPGPU proposition bedded in by late 2009/early 2010.

  11. leth says:

    Kieron, if you have a G92 series or later Nvidia card, you can test the PhysX feature yourself, using the drivers in that thread that I linked.

  12. Meat Circus says:

    @Ross:

    You know she doesn’t look like that in real life, right? It’s all done with lasers.

    In person she’s somewhere between Ruth Kelly and Christopher Biggins. FACT.

  13. Rosti says:

    “…somewhere between Ruth Kelly and Christopher Biggins.”

    Please let me never find myself there. Ever.

  14. Jorcin says:

    Kieron – the 360’s GPU is ATI though…
    PS3 does use Nvidia, but it’s based on the GF7 series and this physics stuff is for GF8 and up.

  15. Ross says:

    @Meat Circus:

    My entire world view is shattered.

    :(

  16. eyemessiah says:

    @Leth

    *Of course* KG has a “G92 series or later Nvidia card”. He reviews PC games on the INTERNET. His computer is made of pure gold, exquisite diamonds, the finest exotic silks, the bones of Mayan God-Kings and has EVERY GRAPHICS CARD EVER MADE and they are all connected into a vast GRAPHICS ARRAY which has ALREADY finished rendering the final frames of the end sequences of games that HAVE NOT EVEN GONE INTO PRODUCTION, with both AF and AA turned up to OVER 9000!

  17. deABREU says:

    Cansei de Ser Sexy is a terrible band. You should get some better taste, mate.

  18. Andrew says:

    deABREU, I will defend to the death Kieron’s excellent taste in chirpy Brazilian electropop.

  19. Gap Gen says:

    “…but unless you’re working in economics or astrophysics modelling…”
    Well, I do astrophysics modelling, and while double-precision coprocessor cards like the ClearSpeed Advance boards are interesting (I did some work experience with them years and years ago), you need to program specifically for them, which basically means coding for them right from the start. You don’t want to do that unless you were going to anyway. So what most people do is just use vast clusters of ordinary PCs and split up the jobs that way.

    It would be cool, but you need to be willing to build the API for this stuff into your code, which not everyone is.

    Would be interesting to hear from Lu Tze on this, though.

  20. Gap Gen says:

    Also, using graphics shaders to do physics isn’t new – Plasma Pong uses it, and its website links to the relevant papers – but I guess it helps having an easy-to-use language for it.

    Also, for research, you tend to want double precision (i.e. 64-bit variables), otherwise your accuracy goes out of the window (depending on what you’re doing). The ClearSpeed board does this well apparently, but most graphics cards (I believe) are much better at single precision, which is OK for games because no-one cares about standard errors as long as it looks whizzy and not-fake.
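
    A quick, hedged illustration of that precision point (plain C, with numbers picked purely to make the effect visible): accumulate a small timestep a million times and the single-precision total drifts visibly off the true 100,000, while the double-precision total stays put to the digits shown.

    ```c
    /* Hypothetical demo of single- vs double-precision drift. The values here
       (0.1 added a million times) are chosen only to make the rounding error
       obvious; the true sum is 100000. */
    #include <stdio.h>

    int main(void)
    {
        float  sum_f = 0.0f;
        double sum_d = 0.0;
        const int steps = 1000000;

        for (int i = 0; i < steps; ++i) {
            sum_f += 0.1f;   /* single-precision accumulator: error builds up */
            sum_d += 0.1;    /* double-precision accumulator: error stays tiny */
        }

        printf("float:  %.4f\n", sum_f);   /* noticeably off 100000 */
        printf("double: %.4f\n", sum_d);   /* 100000.0000 to the digits shown */
        return 0;
    }
    ```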

  21. Al3xand3r says:

    Okay Kevlmess, then I guess we should all buy PhysX cards or spare GPUs and let games developers carry on ignoring the power of multi-core CPUs, as, ya know, there’s not that much more they use it for (and so far they haven’t used even dual core anywhere near its full potential, as multi-core processing is still in its infancy), and yet every next generation of CPUs will have yet more of them hawt cores. Great stuff. Certainly bang for my buck if half my future quad or 8 (or more) core CPU is never used to its full potential because visuals are handled by the GPUs, physics by the PPUs and sound by the SPUs, and who knows what other new feature that needs dedicated hardware they’ll come up with to milk sheep with. Meanwhile those poor CPU cores remain largely unused, as they’ll obviously be less efficient than dedicated hardware, so there’s no reason to bother exploiting them at all. I can’t wait for the future, bring it now please, I want more hardware stuff to spend money on that I don’t need!

  22. Gap Gen says:

    Sure, you can do everything on a multicore CPU, but it’s slower than having a graphics card as well, which was always the point, given that even a single core can do multithreading. Valve’s argument for using the CPU to do stuff was that in general, new bargain-basement-level computers are getting dual core CPUs but abysmal graphics cards. Their comments a couple of years ago or so were based on the markets, not on an ideal situation computationally (as, say, Crysis is). As has been said, GPUs are basically very good at certain types of calculation, so they’re ideal for speeding up things that would take a multicore CPU longer.

    That said, if you take parallel processing to its extreme (say, a thousand or more processors), then if you have even a small amount of non-parallelisable code that everything has to wait for, you spend most of your time waiting for that to complete, so the advantage of using 1000 cores over 100 diminishes. This isn’t really a massive issue in many cases at 8 cores or fewer, though.
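
    The diminishing return being described there is Amdahl’s law. A quick way to put numbers on it (p is the fraction of the work that parallelises, N the number of cores):

    ```latex
    % Amdahl's law: speedup on N cores when a fraction p of the work parallelises
    \[
      S(N) = \frac{1}{(1 - p) + \frac{p}{N}},
      \qquad
      \lim_{N \to \infty} S(N) = \frac{1}{1 - p}
    \]
    % Example: with p = 0.95, S(100) is roughly 16.8 and S(1000) only about 19.6,
    % so going from 100 cores to 1000 buys very little extra speed.
    ```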

  23. Al3xand3r says:

    Well I would imagine that as time passes multi core architecture will grow more efficient also (just as everything in tech does) so there’s less of that wait thing to do. I guess CPUs in general will also become more efficient to gain a little ground against that “not as good as dedicated hardware” point.

    I’d just like developers to start trying harder to use what they already have available better, you know?

    Obviously with computers, realistically they can never exploit a given model to its full potential as new stuff gets released all the time, but it’s a bit much to see a crawl in regards to multi-core processing and at the same time see companies push more dedicated hardware like PPUs.

    It looks counter-intuitive to me. Not to mention that so far it also offers very little, so I don’t see why people think so highly of it… I’d rather they spent more time on multi-core development before reaching for PPUs; it’s not like they exploit physics that much nowadays, all they really use it for is extra eye candy. Games that use it for more than that are usually puzzle-esque games with much simpler physics, which shouldn’t really need PPUs either if only the extra cores were used better…

    As for this Nvidia thing, to get a little more on topic: seriously, what GPU is powerful enough to run any modern game at its max potential settings and still have power left over to be used as a PPU? It’s my understanding that stuff like Crysis is impossible for current hardware (I wouldn’t know for sure, I don’t have any of the latest top-end stuff) to run at absolute max settings with a constant 60-plus fps, so where can they find power to sacrifice (except when using spare GPUs for that) for other purposes? Unless giving a boost to the physics processing can improve overall performance more than visual processing, of course, but don’t more and better physics also add to the visual load anyway? More exploding and moving and rolling stuff on screen, etc?

  24. Gap Gen says:

    Well, it’s not a case of efficiency, it’s a case of design. For example, you can have ten taxis to take kids on a school trip, or you can hire a coach. Most schools opt for the latter. So the point of that analogy was that GPUs are very good at certain physics calculations, whereas you need more £s of CPU to do as much as a GPU can. I think this is the point, anyway.

    Plus, as has been said, the physics processing needed is generally less than the graphics processing. I assume that the load balancing is done properly, otherwise games won’t work optimally, and devs can’t make their games as good.

  25. Al3xand3r says:

    Yeah, obviously, but if you already have the 10 taxis and are paying for them, why not use them instead of hiring someone extra, like the coach, while you keep paying for the taxis you don’t use on top of that? That’s the analogy I was trying to make; it’s not like we’ll be able to choose to buy single or dual core CPUs that are still supported and efficient when 8 or 16 core CPUs are the standard…

    Thanks for making it easier to get my point across…

    I just see that multi-core technology keeps advancing, but only in regards to hardware. Didn’t they recently unveil that like, infinite-core CPU or whatever (yeah, not infinite, but an absurd number)?

    Both PPU usage and multi-core usage are in their infancy in regards to how much games actually benefit from them, but I think there’s more benefit to be had in the long term from multi-core processing, given how rapidly the number of cores will grow, and yet developers aren’t pushing for it…

  26. Gap Gen says:

    Well, the reason graphics cards are good is because they’re specifically designed to do what they do (remember software graphics?), which can also apply to physics. So you do get more bang for your buck with graphics cards. The main reason physics cards didn’t pick up was because there was no market for it.

  27. SuperNashwan says:

    But does “more” physics really mean anything for game design? I can’t say I’ve played a game recently and wished for more stuff bouncing around hyper-realistically.

  28. Gap Gen says:

    Well, it’s a big thing for immersion. Euphoria’s system will presumably make things seem more realistic in terms of character actions and so on.

  29. Vivian says:

    CSS aren’t terrible, they’re great – just a bit rammed down people’s throats of late. I especially liked how many people loved Let’s Make Love etc. but had never heard of Death From Above.

    lovefoxx is a bit chubby though.

  30. Erlam says:

    I think we need to get the fuck away from the graphics/physics optimization, and focus on A.I., because honestly, what the hell is different in games now from, say, 8 years ago?

    The A.I. still kamikaze-charges you and has almost no idea how to actually use cover; but don’t worry, it looks really realistic when you grenade the NPC stuck looking at a table.

  31. Gap Gen says:

    I think the main difference is that the basic theories behind graphics and physics are well understood, but we have no idea how to make a real AI.

  32. Sucram says:

    I find the move towards divergent feature sets between ATi and NVIDIA cards disappointing.

    DX10 was meant to put an end to this (and might have, if it weren’t for the DX10.1 debacle); now both companies are talking about GPGPUs and physics, but using different APIs, with NVIDIA supporting PhysX while AMD goes with Havok.

    It’s another nightmare for developers and one which means they still can’t fully integrate fancy physics into their gameplay.

  33. Al3xand3r says:

    Doesn’t PhysX also support Havok?

  34. malkav11 says:

    Physics cards didn’t catch on because, with no install base, it mostly hasn’t been worth developer time to take advantage of them (plus they increased the amount of rendering the GPU had to do, most of the time, which resulted in little if any net improvement). And who’s going to spend $300 or whatever ridiculous sum they were priced at when they don’t actually *do* anything in most situations?

    GPU-based physics (/secondary GPU cards dedicated to physics) offer the advantage of being something everyone’s already buying, so developers can reasonably expect those capabilities to be present (or at least widely available). I’d look forward to it if it weren’t for the tiny fact that I’m currently running a pair of 7-series cards that aren’t compatible. There’s been no need to upgrade, with everything I’ve wanted to run operating swimmingly at high settings. Oblivion, Bioshock, Mass Effect, etc. Crysis I’ve been avoiding, but other than that, everything.

  35. Lu-Tze says:

    “Would be interesting to hear from Lu Tze on this, though.”

    I think this comes under the domain of “things I really shouldn’t comment on if I want to keep my job”.

    I’ll stick to stuff like “Isn’t GTAIV awesome?”.