Nvidia has launched a new version of its Isaac robot simulation engine in the Omniverse, the company’s metaverse sim for engineers.
The Omniverse started years ago as a proprietary Nvidia project called Holodeck, named after the virtual reality simulation in Star Trek. But it morphed into a more ambitious industrywide effort based on plumbing made possible by the Universal Scene Description (USD) technology Pixar developed to make its movies. Nvidia has spent years and hundreds of millions of dollars on the project, and now it’s updating its robot simulations for it.
The Omniverse takes inspiration from the sci-fi concept of the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One.
An open beta

The new Isaac simulation engine is now in open beta so companies and designers can test how their robots function in a simulated environment before committing to manufacturing them, Nvidia senior product marketing manager Gerard Andrews told VentureBeat in an interview.
Andrews showed me images and videos of robots working in a digital factory that BMW is creating as a “digital twin.” Once the factory design is done, the digital counterpart will be replicated in the real world as a physical copy. Now the Isaac-based robots will operate more realistically, thanks to newly available simulated sensors and more robust simulation.
The simulation not only creates more photorealistic environments but also streamlines synthetic data generation and domain randomization, building the ground-truth datasets needed to train robots in applications from logistics and warehouses to the factories of the future.
“Isaac Sim is going into open beta. We’ve had an early adopter program, which has reached thousands of developers in hundreds of individual companies,” Andrews said. “They tried it out and kicked the tires and gave us some good feedback. And we’re proud to take this to the market based on that feedback and a lot of enthusiasm we are seeing from these customers.”
He said Isaac Sim is a realistic simulation derived from core technologies, such as accurate physics, real-time ray tracing, path tracing, and materials that behave as they’re supposed to.
“One of the big problems you have is the sim-to-real gap, where the gap between the virtual world and the real world — if it exceeds a certain amount, then the engineers or developers just won’t use simulation,” Andrews said. “They’ll just abandon it and say it is not working.”
Andrews said the Isaac Sim running on Omniverse will be a game-changer in the utility of simulators. He added that simulation has to be good enough that it’s worth the time it takes to learn how to use the tools.
“A lot of the use cases we have around manipulation robots, navigating robots, generating synthetic data to train the AI in those robots — we have those use cases built into Isaac Sim already,” Andrews said. “And then finally, the big benefit that we get from being a part of the Omniverse platform is seamless connectivity and interoperability with all these other tools that people may be using in their 3D workloads. We can bring those assets into our simulation environment where we’re developing the robot, training the robot, or testing the robot.”
The Omniverse and Isaac
The Omniverse is the underlying foundation for Nvidia’s simulators, including the Isaac platform — which now includes several new features.
Built on the Nvidia Omniverse platform, Isaac Sim is a robotics simulation application and synthetic data generation tool. It enables roboticists to train and test their robots more efficiently by providing a realistic simulation of the robot interacting with compelling environments that can expand coverage beyond what’s possible in the real world.
This release of Isaac Sim also adds improved multi-camera support and sensor capabilities and a PTC Onshape CAD importer to make it easier to bring in 3D assets. These new features will expand the breadth of robots and environments that can be successfully modeled and deployed at every stage: from designing and developing the physical robot, to training it, to “digital twin” deployment and testing.
Developers have long seen the benefits of having a powerful simulation environment for testing and training robots. All too often, shortcomings have limited the simulators’ adoption, but Isaac Sim addresses these drawbacks, Andrews said.
Realistic simulation
I was looking at the images of Isaac robots in the press material, and I thought they were photos. But they are actually 3D-rendered images of robots in the Omniverse.
In order to deliver realistic robotics simulations, Isaac Sim leverages the Omniverse platform’s powerful technologies, including advanced graphics processing unit (GPU)-enabled physics simulation with PhysX 5, photorealism with real-time ray and path tracing, and Material Definition Language (MDL) support for physically based rendering.
Isaac Sim is built to address many of the most common robotics use cases, including manipulation, autonomous navigation, and synthetic data generation for training data. Its modular design allows users to easily customize and extend the toolset to accommodate many applications and environments.
“This image is a digital twin of BMW’s new factory that their factory planners worked on. They brought it into the Omniverse world. And the cool thing about being in Omniverse is that I can put my simulated robot right in this world and collect the training data that I’m going to use for my AI models, do my testing, do all sorts of different scenarios. And that’s kind of one of the beauties of being a part of the Omniverse platform,” Andrews said. “I’ve been challenged to come up with a catchy phrase, and I never really came up with a catchy phrase, but it’s something around the realistic robot models and the complex scenes that they’re going to operate in.”
To me, it’s kind of like designing products inside one of Pixar’s film worlds, only one that is far more realistic.
With Omniverse, Isaac Sim benefits from Omniverse Nucleus and Omniverse Connectors, enabling the collaborative building, sharing, and importing of environments and robot models in Pixar’s Universal Scene Description (USD) standard. Engineers can easily connect the robot’s brain to a virtual world through the Isaac SDK and ROS/ROS2 interface, fully featured Python scripting, and plugins for importing robotic and environment models.
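To get a feel for what that ROS interface implies in practice, here is a minimal sketch of a ROS 1 node that could listen to a camera feed published from the simulator. The topic name "/rgb" and the node name are assumptions made for illustration; the actual topics depend on how the simulator's ROS bridge is configured.

```python
# Minimal ROS 1 (rospy) sketch of a node listening to a simulated camera.
# The topic name "/rgb" is an assumption for illustration; real topic names
# depend on the Isaac Sim ROS bridge configuration.
import rospy
from sensor_msgs.msg import Image

def on_image(msg: Image):
    # Each frame arrives as a standard sensor_msgs/Image, so perception
    # code written against the simulator can later point at a real
    # robot's camera topic unchanged.
    rospy.loginfo("frame %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)

if __name__ == "__main__":
    rospy.init_node("sim_camera_listener")
    rospy.Subscriber("/rgb", Image, callback=on_image)
    rospy.spin()
```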
Synthetic data generation is an important tool that is increasingly used to train the perception models found in today’s robots. Getting properly labeled real-world data is a time-consuming and costly endeavor, and in robotics some of the required training data can be too difficult or dangerous to collect in the real world at all. This is especially true of robots that must operate in close proximity to humans.
Isaac Sim has built-in support for a variety of sensor types that are important in training perception models. These outputs include RGB, depth, bounding boxes, and segmentation, Andrews said.
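As a rough way to picture that ground truth, here is a hypothetical container for a single synthetic training sample. The field names and shapes are illustrative, not Isaac Sim's actual output schema.

```python
# Hypothetical container for one synthetic training sample, illustrating
# the ground-truth channels mentioned above. Field names and shapes are
# illustrative, not Isaac Sim's actual output schema.
from dataclasses import dataclass
import numpy as np

@dataclass
class SyntheticSample:
    rgb: np.ndarray            # (H, W, 3) uint8 rendered color image
    depth: np.ndarray          # (H, W) float32 distance per pixel, in meters
    segmentation: np.ndarray   # (H, W) int32 class ID per pixel
    bounding_boxes: list       # [(class_id, x_min, y_min, x_max, y_max), ...]
```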
So how realistic should it be?
“You just want to, within reason, close that sim-to-real gap,” Andrews said. “If you have a small error, that can accumulate in your simulation. It can pick up over time, like an error in physics modeling where you don’t do something right with how the wheels [function], then the first time you simulate it, your robot may be fine. But that error builds up, and the robot may find itself completely off course in the real world.”
He added, “The closer you can get into the reality, there’s just a better experience you’re going to have when the engineers try to use it. In the world of simulation, you always face this idea of ‘Now that I have the real hardware, what’s the value of still using the simulator?'”
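Andrews’ wheel example is easy to make concrete. The sketch below, with purely illustrative numbers, shows how a 1% error in a modeled wheel radius compounds into roughly a meter of drift over a 100-meter run.

```python
# Illustrative numbers only: how a 1% wheel-radius modeling error
# compounds into position drift over 100 commanded 1-meter moves.
import math

true_radius = 0.10      # meters: the real robot's wheel
model_radius = 0.101    # meters: a 1% error in the simulator's physics model

drift = 0.0
for _ in range(100):
    commanded = 1.0
    # wheel rotations computed from the (slightly wrong) model radius...
    rotations = commanded / (2 * math.pi * model_radius)
    # ...produce a slightly shorter real-world displacement
    actual = rotations * 2 * math.pi * true_radius
    drift += commanded - actual

print(f"drift after 100 m: {drift:.2f} m")  # ~0.99 m off the planned course
```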
Getting better data
In the open beta, Isaac Sim can output synthetic data in the KITTI format. This data can then be used directly with the Nvidia Transfer Learning Toolkit to enhance model performance with use-case-specific data, Andrews said.
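The KITTI label format itself is simple: each object in a frame is one line of 15 space-separated fields covering its class, 2D bounding box, 3D dimensions, location, and orientation. The sketch below writes one such line; it is a rough illustration of the format, not Nvidia's actual exporter.

```python
# Sketch of writing one object annotation in the KITTI label format
# (15 space-separated fields per object). A rough illustration of the
# format, not Nvidia's exporter.
def kitti_label_line(obj_type, bbox, truncated=0.0, occluded=0,
                     alpha=0.0, dims=(0.0, 0.0, 0.0),
                     loc=(0.0, 0.0, 0.0), rotation_y=0.0):
    left, top, right, bottom = bbox  # 2D box in pixel coordinates
    h, w, l = dims                   # 3D extents in meters
    x, y, z = loc                    # 3D location in camera coordinates
    fields = [obj_type, truncated, occluded, alpha,
              left, top, right, bottom, h, w, l, x, y, z, rotation_y]
    return " ".join(str(f) for f in fields)

# e.g. one simulated box detected at pixels (100, 80)-(220, 240):
print(kitti_label_line("box", (100, 80, 220, 240)))
```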
Domain randomization varies the parameters that define a simulated scene, such as lighting, color, and material textures. One of the main objectives of domain randomization is to enhance the training of machine learning (ML) models by exposing the neural network to a wide variety of domain parameters in simulation. This helps the model generalize well when it encounters real-world scenarios. In effect, this technique teaches models what to ignore. Isaac Sim supports the randomization of many different attributes that help define a given scene. With these capabilities, ML engineers can ensure that the synthetic dataset contains sufficient diversity to drive robust model performance.
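A toy version of that randomization loop might look like the following, with hypothetical parameter names and ranges. The point is that each generated scene draws its nuisance variables fresh, so the only stable signal left for the network is the geometry of the objects it must recognize.

```python
# Toy sketch of domain randomization: each generated scene draws its
# lighting, color, and texture parameters at random so the trained model
# learns to ignore them. Parameter names and ranges are illustrative only.
import random

TEXTURES = ["concrete", "brushed_metal", "painted_steel", "plywood"]

def randomize_scene():
    return {
        "light_intensity": random.uniform(200.0, 2000.0),                # arbitrary scale
        "light_color": [random.uniform(0.7, 1.0) for _ in range(3)],     # RGB tint
        "floor_texture": random.choice(TEXTURES),
        "object_hue": random.uniform(0.0, 1.0),
    }

# Generate randomized parameters for a batch of training scenes:
scenes = [randomize_scene() for _ in range(1000)]
```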
Simulations can save time and other resources

In real life, 50 engineers may be working on a project, but they might have only one hardware prototype. With something like Isaac, all 50 software engineers could work on it at the same time, Andrews said. And the engineers no longer have to be in the same physical space, as they can work on parts of the project remotely.
“I was designing processor cores, and people always wanted to simulate it before they had the real hardware, but when their chip came back, the simulator was put on the side,” Andrews said. “In the robotics use case, I still feel like there’s value for the simulator, even when you have hardware, because the robots themselves are expensive.”
On top of that, it could be dangerous to test a robot in the real world if its controls aren’t right. It might run into a human, or worse. But if you test it in the Omniverse, the simulation won’t hurt anybody.
Over time, Nvidia has added things like multi-camera support, a fisheye camera lens, and other sensors that improve the robot’s functions and its ability to sense the environment. As components improve in the real world, the Isaac simulation can be updated to match in the Omniverse, Andrews said.