Nvidia open-source PhysX to help AI and robotics research

De-stroy

As if trying to grow superhumanly intelligent minds inside grey boxes and wires weren't dangerous enough, now people want to teach them that the only physical impact they can have on the world is being an arsehole knocking around anything not nailed down. Nvidia have open-sourced their PhysX physics simulation engine, with the primary goal of assisting research into AI, robotics, and self-driving cars. Look forward to a generation of robots which delight in knocking items off shelves, stop to admire billowing flags, and are fascinated with making human bodies judder violently as they get stuck in the ground.

"We're doing this because physics simulation — long key to immersive games and entertainment — turns out to be more important than we ever thought," Nvidia say in their announcement.

Open-sourcing lets people tinker more deeply with the physics engine, bending it to their will in ways they couldn't with a closed SDK. Which is probably helpful for some of the uses Nvidia suggest:

  • In AI, researchers need synthetic data — artificial representations of the real world — to train data-hungry neural networks.
  • In robotics, researchers need to train robotic minds in environments that work like the real one.
  • For self-driving cars, PhysX allows vehicles to drive for millions of miles in simulators that duplicate real-world conditions.
  • In high performance computing, physics simulations are being done on ever more powerful machines with ever greater levels of fidelity.

Or you could use it for video games, I suppose.

They've whacked PhysX 3.4 up on GitHub under a BSD 3-Clause license.
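
For the curious, poking at the SDK is less intimidating than it sounds. Below is a minimal sketch of the sort of thing the 3.x API lets you build: a ground plane, a box dropped from a height, and a fixed-timestep simulation loop. It's based on the standard PhysX 3.4 setup calls, so the usual caveat applies: this is an illustrative sketch, not gospel, and you should check the actual headers on GitHub for the exact signatures in whichever version you grab.

```cpp
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Boot the SDK: foundation first, then the physics object itself.
    PxFoundation* foundation = PxCreateFoundation(PX_FOUNDATION_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene with ordinary Earth gravity and a two-thread CPU dispatcher.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // A ground plane and a box dropped from ten metres up: the
    // "knocking things off shelves" starter kit.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    scene->addActor(*PxCreatePlane(*physics, PxPlane(0.0f, 1.0f, 0.0f, 0.0f), *material));

    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                                          PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 10.0f);
    scene->addActor(*box);

    // Step five simulated seconds at 60Hz, fetching results each frame.
    for (int i = 0; i < 300; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    // Tidy up in reverse order of creation.
    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}
```

That simulate/fetchResults loop is presumably where the researchers come in: a robotics or self-driving training harness would wrap exactly this sort of stepping, resetting the scene and reading out object poses each episode as synthetic data.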

Elsewhere in physicsland, Nvidia plan to launch PhysX 4.0 later this month. They show it off with a video of robots being utter dicks (which also rather suggests: nah, don't use PhysX 3.4 for your robot research?):

Watch on YouTube
