Nvidia unveils new technologies for the industrial metaverse and releases tools for AI graphics at SIGGRAPH.
SIGGRAPH (Special Interest Group on Computer Graphics and Interactive Techniques) is an annual conference on computer graphics held since 1974. At this year’s event, Nvidia founder and CEO Jensen Huang showed how the company plans to bring AI, computer graphics, and the (industrial) metaverse together.
“The combination of AI and computer graphics will power the Metaverse, the next evolution of the Internet,” Huang said at the start of the Nvidia presentation.
Nvidia’s presentation featured different elements of this project in four sections:
- the new Nvidia Omniverse Avatar Cloud Engine (ACE),
- plans to expand the Universal Scene Description (USD) standard into the “language of the metaverse,”
- comprehensive enhancements to Nvidia’s Omniverse,
- and tools for AI graphics.
Nvidia aims to enable enterprises to use digital avatars
Nvidia’s Omniverse Avatar Cloud Engine (ACE) includes a set of AI models designed to facilitate the development and customization of lifelike virtual assistants or digital humans. These include models for turning audio into lip movements or making emotions visible. It also includes speech understanding models such as Nvidia’s Megatron.
By moving these models and services to the cloud, ACE gives companies of all sizes instant access to the massive computing power needed to develop and deploy assistants and avatars, Nvidia said. Thanks to the AI models, the avatars can understand multiple languages, respond to voice prompts, interact with their environment, and make intelligent recommendations.
Projects such as Nvidia’s Maxine already use ACE. Omniverse ACE is expected to be available next spring and to run on embedded systems and all major cloud services.
Nvidia wants to make Universal Scene Description the language of the metaverse
A metaverse needs a standard way to describe all content within 3D worlds, said Rev Lebaredian, vice president of Omniverse and simulation technology at Nvidia. The company considers Pixar’s Universal Scene Description (USD) to be the standard scene description for the next Internet era, Lebaredian added. He compared USD to HTML on the 2D Web.
USD is an open-source framework developed by Pixar for exchanging 3D computer graphics data and is used in many industries, such as architecture, design, robotics, and CAD.
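For readers unfamiliar with USD, a scene is described as a hierarchy of “prims” with typed attributes, and can be written in a human-readable text format (.usda). The minimal scene below is purely illustrative and not taken from Nvidia’s announcement:

```usda
#usda 1.0
(
    defaultPrim = "World"
    upAxis = "Y"
)

def Xform "World"
{
    # A single red sphere, positioned one unit above the origin
    def Sphere "Ball"
    {
        double radius = 0.5
        color3f[] primvars:displayColor = [(0.8, 0.1, 0.1)]
        double3 xformOp:translate = (0, 1, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because this is an application-neutral description of the scene graph rather than a tool-specific file, any USD-aware application can read, layer, and edit the same world, which is what motivates the comparison to HTML.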
To that end, Nvidia said it will work with Pixar, as well as Adobe, Autodesk, Siemens, and a number of other leading companies, on a multi-year strategy to extend USD’s capabilities beyond visual effects. For example, USD is expected to better support industrial metaverse applications in architecture, engineering, manufacturing, scientific computing, robotics, and industrial digital twins.
“Our next milestones aim to make USD performant for real-time, large-scale virtual worlds and industrial digital twins,” Lebaredian said. Nvidia also plans to help develop support for international character sets, geospatial coordinates, and real-time streaming of IoT data.
Nvidia focuses on networked Omniverse
The company also showed off numerous enhancements to Nvidia’s Omniverse. Huang described Omniverse as “a USD platform, a toolkit for building metaverse applications and a compute engine to run virtual worlds.” The new release includes several updated core technologies and more connections to common tools.
These connections, called Omniverse Connectors, are currently in development for Unity, Blender, Autodesk Alias, Siemens JT, SimScale, the Open Geospatial Consortium, and others. Beta versions for PTC Creo, Visual Components, and SideFX Houdini are now available, he said. Siemens Xcelerator is also part of the Omniverse network, he said, and is expected to enable digital twins for more industrial customers.
Omniverse also gets neural graphics capabilities developed by Nvidia. These include Instant NeRF, for quickly creating 3D objects and scenes from 2D images, and GauGAN360, an evolution of GauGAN that generates 8K, 360-degree panoramas.
An extension for Nvidia’s Modulus, a machine learning framework, also enables developers to speed up AI-based physics simulations by up to 100,000x, he said.
Nearly a dozen partners will showcase Omniverse capabilities at SIGGRAPH, including hardware, software, and cloud service providers from AWS and Adobe to Dell, Epic, and Microsoft. In a demo, Industrial Light & Magic showed how the company’s Omniverse DeepSearch AI searches massive asset databases using natural language and delivers results even when metadata for the search terms is missing.
Nvidia delivers tools for AI graphics and improves the visual effects standard OpenVDB
One of the key pillars for the emerging metaverse is neural graphics, Nvidia said. In neural graphics (or neural rendering), neural networks accelerate and improve computer graphics.
“Neural graphics intertwines AI and graphics, paving the way for a future graphics pipeline that is amenable to learning from data,” said Sanja Fidler, vice president of AI at Nvidia. “Neural graphics will redefine how virtual worlds are created, simulated, and experienced by users.”
Nvidia is showing 16 research papers on the topic at SIGGRAPH. These include Instant NGP, a tool for various neural rendering applications. This and other AI tools are now available in Kaolin Wisp, a PyTorch library for neural fields research.
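Instant NGP (“instant neural graphics primitives”) gets much of its speed from a multiresolution hash encoding: a coordinate is mapped into grid cells at several resolutions, and each cell hashes into a small table of learned features. The toy sketch below shows only the indexing idea, reduced to 1D, with no learned features or interpolation; the constant 2654435761 is one of the hashing primes used in the Instant NGP paper, but the function names and defaults here are illustrative assumptions, not the real implementation:

```python
PRIME = 2654435761  # a hashing prime used in the Instant NGP paper

def hash_index(cell, table_size):
    # Map a grid-cell index into a fixed-size feature table.
    # Collisions are expected and tolerated in the real method.
    return (cell * PRIME) % table_size

def encode(x, levels=4, base_res=2, table_size=16):
    """Map a coordinate in [0, 1) to one hashed table index per level."""
    indices = []
    for lvl in range(levels):
        res = base_res * (2 ** lvl)  # the grid gets finer at each level
        cell = int(x * res)          # which cell x falls into at this level
        indices.append(hash_index(cell, table_size))
    return indices

print(encode(0.3))
```

In the full method, each index selects a trainable feature vector, the per-level features are interpolated and concatenated, and a small neural network maps them to color and density, which is why such models can train in seconds rather than hours.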
Nvidia also announced NeuralVDB, an evolution of the open-source OpenVDB standard. Over the past decade, OpenVDB won Academy Awards as a core technology used in the visual effects industry, Nvidia said.
Since then, it has expanded beyond the entertainment industry to industrial and scientific use cases involving sparse volumetric data, such as industrial design and robotics, the company said.
Last year, Nvidia already showed off NanoVDB, which added GPU support that speeds up computations many times over. NeuralVDB builds on that work, using machine learning to create compact neural representations that reduce memory requirements by up to 100x. This, the company says, makes it possible to interact with extremely large and complex data sets in real time.
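The memory argument behind OpenVDB-style sparse volumes can be made concrete with a toy sketch (this is not NeuralVDB or OpenVDB code; a plain Python dict stands in for OpenVDB's tree structure, and the sizes are invented for illustration):

```python
# Volumetric effects like smoke or clouds are mostly empty space, so a
# dense grid wastes memory on voxels that hold nothing. Sparse formats
# store only the "active" voxels.

def dense_volume(n, active):
    """A dense n*n*n grid: every voxel is stored, occupied or not."""
    grid = [[[0.0] * n for _ in range(n)] for _ in range(n)]
    for (x, y, z), value in active.items():
        grid[x][y][z] = value
    return grid

def sparse_volume(active):
    """A sparse volume: only nonzero voxels, keyed by coordinate."""
    return dict(active)

# A 64^3 volume with just a handful of occupied voxels.
n = 64
active = {(1, 2, 3): 0.5, (10, 20, 30): 1.0, (63, 0, 7): 0.25}

dense = dense_volume(n, active)
sparse = sparse_volume(active)

dense_cells = n ** 3        # 262,144 stored values
sparse_cells = len(sparse)  # 3 stored values
print(dense_cells // sparse_cells)  # storage ratio for this toy case
```

NeuralVDB goes a step further than storing active voxels explicitly: per the announcement, it replaces parts of the data structure with compact learned representations, which is where the claimed up-to-100x memory reduction over earlier approaches comes from.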