Nvidia Omniverse adds generative AI and Unity support with latest update
Nvidia has released a new update to its Omniverse simulation tools that will let developers tap generative AI and Unity’s game engine.
The latest release delivers enhanced performance and usability, new deployment options, and new Omniverse connectors to expand the ecosystem. The company made the announcement at a virtual event ahead of the CES 2023 tech trade show in Las Vegas.
Nvidia also said that new customers using Omniverse include Dentsu, Mercedes-Benz and Zaha Hadid Architects. Mercedes-Benz plans to create a “digital twin” of a car factory in Germany, then build the real-life factory based on the digital twin design.
With Omniverse, everyone from creators to game developers to enterprises can build simulations to test things that are better evaluated in virtual worlds before they’re deployed in the real world, such as self-driving cars. And Omniverse is becoming more popular as hundreds of enterprises and hundreds of thousands of creators adopt it. It also aggregates technologies that are stepping stones to the metaverse.
“One of the most exciting things that we have is Nvidia Omniverse coming to RTX laptops,” said Richard Kerris, vice president of the Omniverse platform, in a press briefing. “We’re going to now start seeing Nvidia’s Omniverse Launcher, which is the starting point for using Omniverse, preinstalled on these new Nvidia Studio laptops. That means we’re going to have Omniverse available to millions of creators on the planet. And with a simple click of a button, we’ll be able to install Omniverse and take advantage of some of the capabilities that the platform has to further existing workflows.”
While Nvidia owns Omniverse and uses the software to help sell its graphics processing units (GPU chips), it has adopted the Universal Scene Description (USD) open-source standard (originally developed by Pixar) to allow companies to exchange 3D assets and enable them to run on pretty much any kind of hardware. (See our long story on USD here). Nvidia has also launched Omniverse Cloud, a way to access the software from more than 100 countries using run-of-the-mill hardware.
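For developers curious what USD interchange looks like in practice, here is a minimal sketch using Pixar’s open-source Python bindings (the pxr module). The file name and prim paths below are placeholders, not anything specific to Omniverse.

```python
# Minimal USD authoring sketch using Pixar's open-source Python bindings (pxr).
# The stage file name and prim paths are placeholders for illustration.
from pxr import Usd, UsdGeom

# Create a new USD stage: the shared scene description that both DCC tools
# and Omniverse can open.
stage = Usd.Stage.CreateNew("factory_scene.usda")

# Define a root transform and a simple stand-in shape beneath it.
UsdGeom.Xform.Define(stage, "/World")
cell = UsdGeom.Cube.Define(stage, "/World/RobotCell")
cell.GetSizeAttr().Set(2.0)  # authored attributes travel with the asset

# Save the layer; any USD-aware tool (Omniverse, Maya, Blender, etc.) can load it.
stage.GetRootLayer().Save()
```

Because the same .usda file can be opened by any USD-aware application, this is the interchange mechanism that lets different tools contribute to one shared scene.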
Omniverse is a platform for collaboration and simulation. It brings together existing workflows and lets developers, engineers and creators simulate environments in real time with physically accurate materials and AI. And a bunch of third-party tools now connect to Omniverse. Nvidia showed a demo of six artists in different parts of the world collaborating on a 3D scene in real time.
Omniverse is a good tool for building digital twins, and digital twins are becoming extremely popular among enterprises such as Mercedes-Benz. It allows them to perfect the design for a factory in an inexpensive but accurate virtual simulation. Once the real-world factory is built based on the design, the company can collect real-world data from its sensors and feed that back into the digital twin. That allows for further improvements in the design, said Kerris.
Omniverse Enterprise
The latest release of Nvidia Omniverse Enterprise, available now, brings increased performance, generational leaps in real-time RTX ray and path tracing, and streamlined workflows to help teams build connected 3D pipelines, and develop and operate large-scale, physically accurate virtual 3D worlds like never before.
Artists, designers, engineers and developers can benefit from various enhancements across common Omniverse use cases, including breaking down 3D data silos through aggregation, building custom 3D pipeline tools and generating synthetic 3D data.
The release includes support for the breakthrough innovations in the new Nvidia Ada Lovelace architecture, including third-generation RTX technology and DLSS 3, delivering up to three times the performance when powered by the latest GPU technology, such as Nvidia RTX 6000 Ada Generation, Nvidia L40 and OVX systems.
The update also delivers features and capabilities such as new Omniverse Connectors, layer-based live workflows, improved user experience and customization options, including multiple viewports, editable hotkeys, lighting presets and more.
Enhancing 3D creation
The Omniverse ecosystem is expanding, with more capabilities and tools that allow teams to elevate 3D workflows and reach new levels of real-time, physically accurate simulation. The latest additions include:
- New Connectors: Omniverse Connectors enable more seamless connected workflows between disparate 3D applications. New connectors for Adobe Substance 3D Painter, Autodesk Alias, PTC Creo and Kitware’s ParaView are now supported on Omniverse Enterprise. Users can also easily export data created in NX software from Siemens Digital Industries Software.
- Omniverse DeepSearch: Now generally available, this AI-powered service lets teams intuitively search through extremely large, untagged 3D databases using natural language or 2D reference images. This unlocks major value in digital backlots and asset collections that previously could not be searched (a hypothetical query sketch follows this list).
- Omniverse Farm: A completely renewed user interface provides improved usability and performance, plus Kubernetes support.
- Omniverse Cloud: New cloud containers for Enterprise Nucleus Server, Replicator, Farm and Isaac Sim for AWS provide enterprises more flexibility in connecting and managing distributed teams all over the world. AWS’ security, identity and access-management controls allow teams to maintain complete control over their data. Containers are now available on Nvidia NGC.
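Nvidia has not published DeepSearch API details in this announcement, so the snippet below is purely illustrative of the natural-language query flow described above: the endpoint URL, port and JSON fields are assumptions, not the documented DeepSearch interface.

```python
# Hypothetical natural-language asset search request. The host, path and
# payload fields are assumed for illustration only; this is NOT the
# documented DeepSearch API.
import requests

ASSUMED_ENDPOINT = "http://deepsearch.example.internal:8080/search"

def find_assets(query: str, top_k: int = 10) -> list[str]:
    """Send a free-text query and return candidate asset paths."""
    response = requests.post(
        ASSUMED_ENDPOINT,
        json={"query": query, "limit": top_k},  # assumed payload shape
        timeout=30,
    )
    response.raise_for_status()
    return [hit["path"] for hit in response.json().get("results", [])]

if __name__ == "__main__":
    # e.g., search an untagged backlot for props matching a description.
    for path in find_assets("rusty metal barrel on a wooden pallet"):
        print(path)
```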
Strengthening core components for building 3D worlds
Omniverse Enterprise is designed for maximum flexibility and scalability. This means creators, designers, researchers and engineers can quickly connect tools, assets and projects to collaborate in a shared virtual space.
Omniverse Enterprise brings updates to the core components of the platform. Omniverse Kit SDK, the powerful toolkit for building extensions, apps, microservices and plug-ins, now makes it easier than ever to build advanced tools and Omniverse applications with new templates and developer workflows.
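As a rough sketch of what building on the Kit SDK involves, a Kit extension is typically a small Python package (declared in an extension.toml) that exposes a class derived from omni.ext.IExt; the class name and messages below are placeholders.

```python
# Minimal Omniverse Kit extension sketch. Kit instantiates a class derived
# from omni.ext.IExt and calls on_startup/on_shutdown when the extension is
# enabled or disabled. The names and print statements are placeholders.
import omni.ext


class ExampleToolExtension(omni.ext.IExt):
    def on_startup(self, ext_id: str):
        # Called when the extension is enabled; register UI, commands or
        # services here.
        print(f"[example.tool] startup: {ext_id}")

    def on_shutdown(self):
        # Called when the extension is disabled; release resources here.
        print("[example.tool] shutdown")
```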
The release also includes Omniverse Create, a reference app for composing large-scale, USD-based worlds, which now features Nvidia DLSS 3 and multi-viewport support, making it easier for Omniverse Enterprise users to fluidly interact with extremely large and complex scenes.
Omniverse View, the reference app for reviewing 3D scenes, has been streamlined to focus purely on the review and approval experience. New collaborative, real-time, interactive capabilities, including markup, annotation, measure and simple navigation, make stakeholder presentations easier and more interactive than ever.
Omniverse Nucleus, the database and collaboration engine, now includes improved IT management tools, such as expanded version control to handle atomic checkpoints on the server. An updated Large File Transfer service lets users move files between servers, on premises or in the cloud, to support hybrid workflows. And new self-service deployment instructions for Enterprise Nucleus Server on AWS are now available, letting customers deploy and manage Nucleus in the cloud.
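For teams scripting against Nucleus, the Omniverse client library exposes URL-based file operations from Python. Below is a minimal sketch, assuming the omni.client module is available (as it is inside Kit-based apps) and using a placeholder server name.

```python
# Sketch of pushing an asset to a Nucleus server with the Omniverse client
# library (omni.client). The server hostname and paths are placeholders.
import omni.client

SRC = "file:///tmp/factory_scene.usda"
DST = "omniverse://nucleus.example.internal/Projects/factory_scene.usda"

# Blocking copy from local disk to the Nucleus server.
result = omni.client.copy(SRC, DST)
print("copy:", result)

# List what now sits in the project folder on the server.
result, entries = omni.client.list("omniverse://nucleus.example.internal/Projects/")
if result == omni.client.Result.OK:
    for entry in entries:
        print(entry.relative_path)
```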
Customers dive into Omniverse Enterprise
Many customers around the world have experienced enhanced 3D workflows with Omniverse Enterprise.
Dentsu International, one of the largest global marketing and advertising agency networks, looks for solutions that enable collaborative and seamless work, with a central repository for completed projects.
In addition to enhancing current pipelines with Omniverse Enterprise, Dentsu is looking to incorporate Nvidia generative AI into its 3D design pipeline with software development kits like Omniverse ACE and Audio2Face.
Zaha Hadid Architects (ZHA) is an architectural design firm that has created some of the world’s most singular building designs. ZHA focuses on creating transformative cultural, corporate and residential spaces through cutting-edge technologies. With Omniverse Enterprise, the team can accelerate and automate its workflows, as well as develop custom tools within the platform.
“We are working with Nvidia to incorporate Omniverse as the connective infrastructure of our tech stack. Our goal is to retain design intent across the various project stages and improve productivity,” said Shajay Bhooshan, associate director at Zaha Hadid Architects, in a statement. “We expect Nvidia Omniverse to play a critical, supportive role to our efforts to create a platform that’s agnostic, version-controlled and a single source of truth for design data, as it evolves from idea to delivery.”
Nvidia Omniverse Enterprise is available by subscription from BOXX Technologies, Dell Technologies, Z by HP and Lenovo, and channel partners including ASK, PNY and Leadtek. The platform is optimized to run on Nvidia-Certified, RTX-enabled desktop and mobile workstations, as well as servers, with new support for Nvidia RTX Ada Generation systems.
Omniverse Avatar Cloud Engine
Developers and teams building avatars and virtual assistants can now register to join the early-access program for Nvidia Omniverse Avatar Cloud Engine (ACE), a suite of cloud-native AI microservices that make it easier to build and deploy intelligent virtual assistants and digital humans at scale.
Omniverse ACE eases avatar development, delivering the AI building blocks necessary to add intelligence and animation to any avatar, built on virtually any engine and deployed on any cloud. These AI assistants can be designed for organizations across industries, enabling them to enhance existing workflows and unlock new business opportunities.
ACE is one of several generative AI applications that will help creators accelerate the development of 3D worlds and the metaverse. Members who join the program will receive access to the prerelease versions of Nvidia’s AI microservices, as well as the tooling and documentation needed to develop cloud-native AI workflows for interactive avatar applications.
Bringing interactive AI avatars to life with Omniverse ACE
Methods for developing avatars often require expertise, specialized equipment and manually intensive workflows. To ease avatar creation, Omniverse ACE enables seamless integration of Nvidia’s AI technologies — including pre-built models, toolsets and domain-specific reference applications — into avatar applications built on most engines and deployed on public or private clouds.
Since it was unveiled in September, Omniverse ACE has been shared with select partners to capture early feedback. Now, Nvidia is looking for partners who will provide feedback on the microservices, collaborate to improve the product, and push the limits of what’s possible with lifelike, interactive digital humans.
The early-access program includes access to the prerelease versions of ACE animation AI and conversational AI microservices, including a 3D animation AI microservice for third-party avatars, which uses Omniverse Audio2Face generative AI to bring characters to life in Unreal Engine and other rendering tools by creating realistic facial animation from just an audio file. A number of other updates are also available.
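The actual ACE microservice interfaces are only available through the early-access program, so the snippet below is a hypothetical client sketch of the audio-in, animation-out flow described above; the service URL, request fields and response handling are assumptions, not the published ACE API.

```python
# Hypothetical audio-to-facial-animation request against an ACE-style
# animation microservice. The URL, field names and response format are
# assumed purely for illustration; the real early-access API may differ.
import requests

ASSUMED_ACE_URL = "http://ace-animation.example.internal:8000/animate"

def animate_from_audio(audio_path: str, avatar_id: str) -> bytes:
    """Upload a voice clip and return animation data for the given avatar."""
    with open(audio_path, "rb") as audio_file:
        response = requests.post(
            ASSUMED_ACE_URL,
            files={"audio": audio_file},    # assumed field name
            data={"avatar_id": avatar_id},  # assumed field name
            timeout=120,
        )
    response.raise_for_status()
    return response.content  # e.g., blendshape or animation-clip data

if __name__ == "__main__":
    clip = animate_from_audio("greeting.wav", "avatar_01")
    with open("greeting_animation.bin", "wb") as out:
        out.write(clip)
```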
Partners such as Ready Player Me and Epic Games have experienced how Omniverse ACE can enhance workflows for AI avatars.
The Omniverse ACE animation AI microservice supports 3D characters from Ready Player Me, a platform for building cross-game avatars.
“Digital avatars are becoming a significant part of our daily lives. People are using avatars in games, virtual events and social apps, and even as a way to enter the metaverse,” said Timmu Tõke, CEO of Ready Player Me, in a statement. “We spent seven years building the perfect avatar system, making it easy for developers to integrate in their apps and games and for users to create one avatar to explore various worlds — with Nvidia Omniverse ACE, teams can now more easily bring these characters to life.”
Epic Games’ advanced MetaHuman technology transformed the creation of realistic, high-fidelity digital humans. Omniverse ACE, combined with the MetaHuman framework, will make it even easier for users to design and deploy engaging 3D avatars.
Generative AI comes to Omniverse
Whether creating realistic digital humans that can express raw emotion or building immersive virtual worlds, those in the design, engineering, creative and other industries across the globe are reaching new heights through 3D workflows, Nvidia said.
Animators, creators and developers can use new AI-powered tools — dubbed generative AI — to reimagine 3D environments, simulations and the metaverse — the 3D evolution of the internet.
Based on the Universal Scene Description (USD) framework, the Omniverse platform — which enables the development of metaverse applications — is expanding with Blender enhancements and a new suite of experimental generative AI tools for 3D artists.
Nvidia announced these features, as well as Omniverse preinstallation on Nvidia Studio laptops and thousands of new, free USD assets to help accelerate adoption of 3D workflows.
Plus, a new Blender release, now available in the Omniverse Launcher, brings 3D generative AI capabilities to Blender users everywhere. A new panel lets Blender users easily transfer shape keys and rigged characters. The challenge of reattaching a rigged character’s head can now be solved with a one-button operation from Omniverse Audio2Face, an AI-enabled tool that automatically generates realistic facial expressions from an audio file.
“One of the hottest topics in the industry these days is generative AI, and so, working with our research team, we are bringing these generative AI capabilities into the platform through the new AI ToyBox,” Kerris said.
Another new panel for scene optimization lets users create USD scenes within their multi-app 3D workflows more easily and in real time.
In addition, Audio2Face, Audio2Gesture and Audio2Emotion — generative AI tools that enable instant 3D character animation — are getting performance updates that make it easier for developers and creators to integrate them into their current 3D pipelines.
Creators can generate facial expressions from an audio file using Audio2Face; realistic emotions ranging from happy and excited to sad and regretful with Audio2Emotion; and realistic upper-body movement using Audio2Gesture. These audio-to-animation tools are game-changers for 3D artists, eliminating the need to perform tedious, manual tasks.
AI-assisted creator tools are expanding to even more communities of creative and technical professionals. When Nvidia Canvas was introduced, it empowered artists to seamlessly generate landscapes and iterate on them with simple brushstrokes and AI. Coming soon, all RTX users will be able to download an update to Canvas that introduces 360-degree surround images for creating and conceptualizing panoramic environments and beautiful images. The AI ToyBox, which features extensions derived from Nvidia Research, enables creators to generate 3D meshes from 2D inputs.
Omniverse’s powerful AI tools simplify complex tasks. Creators of all levels can tap into these resources to produce high-quality outputs that meet the growing demands for content and virtual worlds in the metaverse.
“The demand for 3D skills is skyrocketing, but learning 3D can be pretty scary to some, and definitely time consuming,” said Jae Solina, aka JSFilmz, in a statement. “But these new platform developments not only let creatives and technical professionals continue to work in their favorite 3D tools, but also supercharge their craft and even use AI to assist them in their workflows.”
Omniverse Launcher, the portal to download Omniverse content and reference applications, has also been made available to system builders so they can preinstall it, enabling optimized, out-of-the-box experiences for 3D creators on Nvidia Studio-validated laptops. Gigabyte and Aorus laptops launching in 2023 will be the first to ship with Omniverse Launcher preinstalled, expanding platform access to a growing number of 3D content creators.
Nvidia RTX Remix is a free modding platform, built on Omniverse, that enables modders to quickly create and share #RTXON mods for classic games, each with full ray tracing, enhanced materials, Nvidia DLSS 3 and Nvidia Reflex. Its release in early access is coming soon.
The Portal with RTX game was built with RTX Remix, and to demonstrate how easy it is for modders to turn RTX on in their mods, Nvidia shared RTX Remix with the original creator of Portal: Prelude, an unofficial Portal prequel released in 2008.
Omniverse users can also access thousands of new, free USD assets, including a USD-based Nvidia RTX Winter World Minecraft experience, and learn to create their own Nvidia SimReady assets for complex simulation building. Using Omniverse, creators can supercharge their existing workflows with AI, simulation tools and real-time RTX-accelerated rendering while continuing to use familiar tools such as Autodesk Maya, Autodesk 3ds Max, Blender, Adobe Substance 3D Painter and more.
All types of 3D creators can take advantage of these new tools to push the boundaries of 3D simulation and virtual world-building. Users can reimagine digital worlds and animate lifelike characters with new depths of creativity through the bridging of audio-to-animation tools, generative AI and the metaverse.
Nvidia also said that it has enhanced tools for Isaac Sim to make it easier to design and deploy robots in enterprise settings.
Lastly, Nvidia said early access for the Unity Omniverse Connector is now available. A Blender alpha release, now available in the Omniverse Launcher, enables users to repair geometry, generate automatic UVs and decimate high-resolution CAD data to more usable polycounts.