It used to be that when BMW refitted a factory to build a new car, the only way the automaker could check whether the chassis would fit through the production line was to fly a team out and physically push the body through the process, noting any snags.
Now, process engineers can simply run a simulation, sending a 3D model of the car through a near-identical digital twin of the factory. Any mistakes are spotted before the production line is built, saving time and money.
Such is the power of the industrial metaverse. Forget sending your avatar to virtual meetings with remote colleagues or poker nights with distant friends, as Mark Zuckerberg envisioned in 2021 when he changed Facebook’s name to Meta; the metaverse idea has found its killer app in manufacturing.
While the consumer version of the metaverse has stumbled, the industrial metaverse is expected to be worth $100 billion globally by 2030, according to a World Economic Forum report. In this context, the concept of the metaverse refers to a convergence of technologies including simulations, sensors, augmented reality, and 3D standards. Varvn Aryacetas, Deloitte’s AI strategy and innovation practice leader for the UK, prefers to describe it as spatial computing. “It’s about bridging the physical world with the digital world,” he says. This can include training in virtual reality, digital product design, and virtual simulations of physical spaces such as factories.
In 2022, Nvidia—the games graphics company that now powers AI with its GPUs—unveiled Omniverse, a set of tools for building simulations, running digital twins, and powering automation. It acts as a platform for the industrial metaverse. “This is a general technology—it can be used for all kinds of things,” says Rev Lebaredian, vice president of Omniverse and simulation technology at Nvidia. “I mean, representing the real world inside a computer simulation is just very useful for a lot of things—but it’s absolutely essential for building any system that has autonomy in it.”
Home improvement chain Lowe’s uses the platform to test new layouts in digital twins before building them in its physical stores. Zaha Hadid Architects creates virtual models of its projects for remote collaboration. Amazon simulates warehouses to train virtual robots before letting real ones join the floor. And BMW has built virtual models for all its sites, including its newest factory in Debrecen, Hungary, which was planned and tested virtually before construction.
To simulate its entire manufacturing process, BMW filled its virtual factories with 3D models of its cars, equipment, and even people. It created these elements in an open-source file format originated by Pixar called Universal Scene Description (OpenUSD), with Omniverse providing the technical foundation for the virtual models and BMW creating its own software layers on top, explains Matthias Mayr, virtual factory specialist at BMW.
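To give a sense of what OpenUSD data looks like, here is a minimal, hypothetical .usda scene describing a factory cell. The prim names, file path, and dimensions are invented for illustration; they are not drawn from BMW's actual models. The text-based format lets teams compose scenes from separate files, which is what makes collaborative layering of large factory models practical.

```usda
#usda 1.0
(
    defaultPrim = "FactoryCell"
    metersPerUnit = 1
    upAxis = "Z"
)

def Xform "FactoryCell"
{
    # The robot model lives in its own file and is pulled in by reference,
    # so different teams can edit different layers without conflicts
    def Xform "AssemblyRobot" (
        references = @./robot_arm.usda@
    )
    {
        double3 xformOp:translate = (2.5, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    # A placeholder quad standing in for a conveyor segment; real geometry
    # would be authored in a design tool and exported to USD
    def Mesh "ConveyorSegment"
    {
        point3f[] points = [(0, 0, 0), (4, 0, 0), (4, 1, 0), (0, 1, 0)]
        int[] faceVertexCounts = [4]
        int[] faceVertexIndices = [0, 1, 2, 3]
    }
}
```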
“If you imagine a factory that would take half an hour to walk from one side to the other side, you can imagine it’s also quite a large model,” Mayr says. Hence the turn to a gaming company for the technology: game developers know how to render large scenes that a user can move through. Early versions of the virtual factory even had gaming-style WASD keyboard navigation, but this was dropped in favor of a click-based interface akin to exploring Google Street View in a browser, so anyone could easily find their way.
BMW also uses Omniverse for collaboration on car design and customization visualizations for customers, but a key benefit is being able to model production lines. New cars mean a new assembly process, but refitting a factory is a daunting process. Previously, key information was held in silos—production crews understood details of the assembly process, external suppliers had specs of new parts or machinery, architects had detailed building plans—and costs would pile up for every delay or mistake. “The later you find a problem, the worse it is,” says Lebaredian.
Now, problems are worked out virtually, with standardized data held in a central location. There’s still a critical human element: Mapping a facility requires a person to walk the factory with a laser scanner strapped on, capturing point cloud data about how everything is arranged. Design engineers also need to create a 3D model of every stage of a car as it’s assembled. This level of detail allows BMW to virtually test the assembly process, complete with simulations of robotics, machines, and even human workers, as BMW has data tracking how long it takes employees to assemble a part.
The main idea is to avoid errors—does that machine even fit there?—but the system also enables optimization, such as moving a rack of components closer to a particular station to save steps for human assemblers. “You can optimize first and gain a lot of efficiency in the first production, and in the construction phase, you have fewer mistakes,” Mayr says. “It’s less error prone.”
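In miniature, the "does that machine even fit there?" check reduces to a geometric test against the scanned model. The sketch below is a deliberately toy version using axis-aligned bounding boxes and invented dimensions; it is not BMW's or Nvidia's actual code, just the shape of the idea.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box: min and max corners, in meters."""
    min_pt: tuple[float, float, float]
    max_pt: tuple[float, float, float]

def fits_inside(item: Box, bay: Box, clearance: float = 0.1) -> bool:
    """True if `item` fits inside `bay` with `clearance` meters to spare on every side."""
    return all(
        item.min_pt[i] - clearance >= bay.min_pt[i]
        and item.max_pt[i] + clearance <= bay.max_pt[i]
        for i in range(3)
    )

# A press machine and the floor bay it is slated for (invented dimensions)
press = Box((1.0, 1.0, 0.2), (4.0, 3.0, 2.8))
bay = Box((0.0, 0.0, 0.0), (5.0, 4.0, 3.0))

print(fits_inside(press, bay))  # prints True: 0.1 m of clearance on every side
```

A real pipeline would run this kind of test against meshes reconstructed from the laser-scanned point cloud, catching collisions before any steel is moved.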
Omniverse being an Nvidia platform, AI is naturally next. BMW is already layering in generative AI to help with navigation of its virtual models—they’re so massive that finding a particular point in the digital factory can still require asking a human expert for directions. But the aim is to use AI to optimize production lines too. “Because you have the whole data available, not just for one plant, it will be able to make good suggestions,” says Mayr—lessons learned in one factory could more easily be applied to others.
And then there’s robotics and other autonomous systems. Here, Omniverse can offer a digital space for testing before deploying in the real world, but it can also generate synthetic training data by running simulations, just as driverless car systems are trained with virtual video footage generated by AI. “Real-world experience isn’t going to come mostly from the real world—it comes from simulation,” says Lebaredian.
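In miniature, synthetic data generation amounts to sampling randomized scenes from a simulator and recording the ground-truth labels the simulator already knows. The sketch below is a toy illustration of that pattern (often called domain randomization); the scene parameters and function names are invented, and a real pipeline would render images rather than dictionaries.

```python
import random

def simulate_scene(rng: random.Random) -> dict:
    """Toy 'simulator': place a part at a random pose under random conditions.

    Because the simulator placed the part, the ground-truth label (its pose)
    is known exactly -- no human annotation is needed.
    """
    pose = (rng.uniform(0, 5), rng.uniform(0, 3), rng.uniform(0, 360))
    lighting = rng.choice(["bright", "dim", "backlit"])  # domain randomization
    occluded = rng.random() < 0.2                        # random clutter
    return {
        "observation": {"lighting": lighting, "occluded": occluded},
        "label": {"part_pose": pose},
    }

def generate_dataset(n: int, seed: int = 0) -> list[dict]:
    """Produce n labeled samples; the seed makes the dataset reproducible."""
    rng = random.Random(seed)
    return [simulate_scene(rng) for _ in range(n)]

dataset = generate_dataset(1000)
print(len(dataset))  # 1000 labeled samples, none of them hand-annotated
```

The economics are the point: once the simulation exists, labeled examples cost compute rather than human time, which is why Lebaredian expects most "real-world experience" to come from simulation.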
Aryacetas predicts that the biggest impact from the industrial metaverse will be embodied or physical AI—in other words, robots. “Robots aren’t fully there yet, but they’re rapidly training up to understand the physical world around them—and that’s being done because of these underlying spatial computing technologies,” he says.
The future of the metaverse isn’t avatars in a virtual world; it’s digital twins teaching industrial robots how to step out into the physical one.