
① After years of development, the Omniverse platform has built a substantial system ecosystem on a universal data format and five functional modules, becoming a cornerstone of virtual reality. In the fields of AI and data science, developers can even use Omniverse to simulate what will happen on Earth in the future.
1、 The cornerstone of virtual reality
NVIDIA Omniverse is essentially a one-stop tool-integration platform designed by NVIDIA for real-time collaboration and simulation. According to NVIDIA's product introduction and e-works, Omniverse's predecessor was NVIDIA Holodeck, a VR collaborative design platform. While building digital twins for customers, NVIDIA increasingly realized that this is a complex engineering effort spanning multiple technologies and disciplines: from small parts to large factories and cities, the focus of a digital twin varies greatly with the physical entity being modeled. In response, NVIDIA consolidated its software and hardware strengths in GPU data processing, CUDA computing, and real-time ray tracing (RTX), together with its long-standing experience in graphics, AI, and simulation ecosystems, to launch the Omniverse platform in a more efficient and compatible form, addressing the pain points of mapping the real world into the digital world, such as coordinating diverse digital tools and delivering efficient, realistic rendering and simulation.
Figure: Early development process of Omniverse
Source: NVIDIA Product Introduction, Minsheng Securities Research Institute
At its initial release, NVIDIA built the Omniverse platform framework around the universal USD data format and five functional modules. According to NVIDIA's product introduction and the OpenUSD information brief, the original purpose of the Omniverse platform was to integrate the many DCC tools such as SketchUp, Rhino, and 3ds Max; for example, even when assets from BIM software can be converted into 3D models that Rhino recognizes, a great deal of information about building components is still lost. Omniverse therefore adopted USD-format digital assets as the platform's data format and combined it with five functional modules: Nucleus manages collaboration, Connect links third-party DCC software, Kit is a toolkit for building native Omniverse applications and microservices, Simulation handles physical simulation, and RTX Renderer handles real-time rendering.
Figure: Five functional modules of Omniverse
Source: NVIDIA Product Introduction, Minsheng Securities Research Institute
After years of development, the Omniverse platform has built a substantial system ecosystem on this universal data format and the five functional modules, becoming a cornerstone of virtual reality.
1.1 OpenUSD
NVIDIA teamed up with industry giants to create the OpenUSD alliance and deeply optimized Omniverse around USD-format data. NVIDIA established the non-profit Alliance for OpenUSD (AOUSD) with companies such as Pixar, Adobe, Apple, and Autodesk to promote interoperability of 3D content through OpenUSD. USD data can now describe the geometry, materials, physical properties, and behaviors of a 3D world well. Omniverse's in-depth development around USD includes, but is not limited to:
1) Introducing the Material Definition Language (MDL), which lets developers easily share physically based materials among supported applications;
2) Strengthening USD's physics simulation capabilities;
3) Safely loading and saving materials between the Omniverse server and the local file system through an asynchronous Python 3 API (see the Python sketch after this list);
4) Providing usdview for intuitive inspection of USD scenes;
5) Extending USD with features such as geospatial coordinates, glTF file-format connectivity, real-time programming, running in web browsers, and real-time streaming of IoT data.
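To make the USD layer concrete, here is a minimal sketch of reading and writing USD stages with the OpenUSD Python API (pxr). It assumes the usd-core Python package is installed, and the file names are hypothetical placeholders; it illustrates the data format itself, not any specific Omniverse workflow.

```python
# Minimal sketch: inspecting and authoring USD stages with the pxr API.
# Assumes `pip install usd-core`; "factory_scene.usd" is a placeholder path.
from pxr import Usd, UsdGeom

# Open an existing stage and list its mesh prims.
stage = Usd.Stage.Open("factory_scene.usd")
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        print(prim.GetPath(), prim.GetTypeName())

# Stages can also be created and saved programmatically.
new_stage = Usd.Stage.CreateNew("exported_scene.usda")
UsdGeom.Xform.Define(new_stage, "/World")
UsdGeom.Sphere.Define(new_stage, "/World/Sphere")
new_stage.GetRootLayer().Save()
```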
Figure: OpenUSD System
Source: NVIDIA official website, Minsheng Securities Research Institute
1.2 Continuously improving the system ecosystem
NVIDIA has built a comprehensive Omniverse development stack that extends from workstations to the cloud, letting developers build advanced, scalable solutions with less code. According to NVIDIA's official website, both independent and enterprise developers can easily build and sell their own extensions, applications, connectors, and microservices on the Omniverse platform; for example, they can:
1) Use low-code and no-code workflows in Python or C++ (a minimal Python extension skeleton is sketched after this list);
2) Draw on the AI, rendering, and simulation technologies NVIDIA has accumulated over the past 20 years, easily modifying or integrating more than 500 pre-built Omniverse extensions and adding them to their own projects and distributions;
3) Publish their applications and extensions on the Omniverse platform and reach customers across industries in the NVIDIA ecosystem through the Omniverse Exchange Publishing Portal.
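As an illustration of the Kit toolkit mentioned above, the following is a minimal sketch of an Omniverse extension written in Python. It assumes a Kit-based application loads it; the class name, extension id, and log messages are placeholders rather than NVIDIA sample code.

```python
# Minimal sketch of an Omniverse Kit extension in Python; "my.company.hello"
# and the print messages are illustrative placeholders.
import omni.ext


class HelloExtension(omni.ext.IExt):
    """A bare-bones extension that logs its lifecycle events."""

    def on_startup(self, ext_id: str):
        # Called when Kit enables the extension.
        print(f"[{ext_id}] extension started")

    def on_shutdown(self):
        # Called when Kit disables the extension.
        print("extension shut down")
```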
Figure: Omniverse's complete development stack
Source: NVIDIA official website, Minsheng Securities Research Institute
NVIDIA continues to deepen the integration of the Omniverse platform with customer workflows, laying a solid foundation for Omniverse Enterprise and for cutting-edge platforms such as the Isaac Sim robotics platform. NVIDIA first used OpenUSD to break down data silos, then applied technologies such as RTX to deliver precise physical simulation, and finally provided the ability to combine real-world and synthetic data to train AI, capabilities now used by developers around the world. With this end-to-end service process, customers across industries, including technology giants such as AWS, BMW, and Sony, have integrated Omniverse into their own workflows.
Figure: Collaboration between Omniverse and other workflows
Source: NVIDIA official website, Minsheng Securities Research Institute
2、 Enterprise Solution
For enterprises, Omniverse is delivered as a PaaS: a full-stack cloud environment built on NVIDIA OVX infrastructure for designing, developing, deploying, and managing 3D industrial digital applications based on OpenUSD. NVIDIA provides a fully optimized infrastructure and development platform that can be subscribed to and used immediately; it also offers Omniverse Cloud APIs (such as Avatar Cloud Engine, ChatUSD, DeepSearch, Picasso, and RunUSD) to accelerate advanced 3D applications and development, and supports NVIDIA's global cloud-streaming infrastructure to deliver high-fidelity, interactive 3D experiences to any web-connected device.
Figure: Omniverse's PaaS service
Source: NVIDIA official website, Minsheng Securities Research Institute
2.1 AI and Data Science
In the fields of AI and data science, developers can even use Omniverse to simulate what will happen on Earth in the future, for example how to respond to global warming and climate change. The NVIDIA Earth-2 project achieves its breakthroughs by combining three technologies: GPU-accelerated computing, deep learning with physics-informed neural networks, and AI supercomputers. With large volumes of observational and model data available for learning, it has for the first time made it possible to build ultra-high-resolution climate models that can predict regional extreme-weather changes decades ahead at far greater speed than previous approaches.
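To give a sense of what a physics-informed neural network is, here is a toy sketch of a physics-informed loss in PyTorch. The 1-D advection equation, network size, and tensor shapes are illustrative assumptions only and say nothing about Earth-2's internals.

```python
# Toy sketch of a physics-informed loss: the network is penalized for
# violating a simple PDE (u_t + c * u_x = 0). All choices here are assumptions.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)

def physics_residual(x, t, speed=1.0):
    # Compute the PDE residual with autograd.
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = model(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    return u_t + speed * u_x

x = torch.rand(128, 1)
t = torch.rand(128, 1)
loss = physics_residual(x, t).pow(2).mean()  # drives the network toward the PDE
loss.backward()
```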
2.2 Manufacturing industry
The Omniverse shared collaboration environment gives manufacturing practitioners a cloud-based collaboration platform, somewhat like Slack for 3D production data. Take the BMW project as an example: BMW is rolling out the NVIDIA Omniverse platform internally to coordinate production across 31 factories worldwide. The challenge BMW faces is not only coordination among factories but also how to share data imported from different sources. BMW employees use tools such as Autodesk Revit, Dassault Systèmes CATIA, and point-cloud data; through Omniverse they can access data from all of these sources directly, in stark contrast to current factory-planning practice, which requires transferring data between applications. In addition, the Omniverse platform lets users merge live data from all relevant databases into collaborative simulations, so data does not need to be re-imported, and it enables a new collaboration model in which one colleague can use motion capture to generate a virtual avatar and appear next to a colleague at any other site to solve problems together.
2.3 Creative Industry
For the creative industry, Omniverse offers distinctive capabilities for new challenges such as faster iteration and accurate, realistic simulation results. Take the Cubic Technology project as an example: the company faces many challenges in architectural design, among which collaboration and communicating design intent are particularly difficult for the project team. When team members work remotely and are dispersed, they must convert and merge data from different software tools, datasets, and other project contributors, which adds complexity and slows the design process; at the same time, the design team is under constant pressure from tight delivery deadlines, raising expectations for efficient collaboration, fast high-fidelity rendering iteration, and accurate, realistic simulation. From initial concept design and global collaboration to rapid design reviews and presentations, Omniverse can change every stage of architecture, engineering, and construction work. Using NVIDIA Omniverse and its components, such as Omniverse Create and Omniverse View, together with NVIDIA RTX real-time ray tracing, Cubic Technology readily implemented architectural design solutions for its industrial-park projects.
3、 Isaac Sim robot embodied intelligence platform
NVIDIA is committed to combining Omniverse with cutting-edge technology to create first-class development platforms, and the Isaac Sim robotics platform is one representative. Isaac Sim makes full use of the Omniverse platform's simulation technology, including advanced GPU physics simulation implemented with NVIDIA® PhysX® 5, photorealistic rendering with real-time ray and path tracing, and MDL material definitions for physically based rendering. It provides a modular architecture suitable for a variety of applications, including manipulation, navigation, and synthetic data generation for training; developers can easily connect robots to the virtual world through the Isaac ROS/ROS 2 interface, fully featured Python scripting, and plugins for importing robot and environment models.
Figure: NVIDIA Isaac Sim Robot Platform Technical Framework
Source: NVIDIA official website, Minsheng Securities Research Institute
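To illustrate the Python scripting support described above, here is a minimal sketch of an Isaac Sim standalone Python script. It assumes a local Isaac Sim installation that provides the omni.isaac packages; the headless setting and stepping loop are illustrative choices, not a complete application.

```python
# Minimal sketch of an Isaac Sim standalone script, assuming an Isaac Sim
# installation; everything here is a bare-bones illustration.
from omni.isaac.kit import SimulationApp

# The SimulationApp must be created before other omni.isaac imports.
simulation_app = SimulationApp({"headless": True})

from omni.isaac.core import World

world = World()
world.scene.add_default_ground_plane()
world.reset()

# Step physics and rendering for a short while.
for _ in range(100):
    world.step(render=True)

simulation_app.close()
```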
Isaac Sim provides developers with a digital-twin platform that covers the entire process of robot simulation development, training, deployment, and maintenance.
3.1 Robot simulation development
The NVIDIA Isaac platform consists of NVIDIA Isaac Sim and NVIDIA Isaac ROS. The former is a simulator that provides a simulation environment for testing robot algorithms; the latter is hardware-accelerated software optimized for NVIDIA Jetson that includes machine learning, computer vision, and localization algorithms. Hardware-in-the-loop (HIL) testing on the NVIDIA Isaac platform lets you verify and optimize the performance of the robot software stack and thus ship safer, more reliable, and more efficient products. Using Isaac ROS, developers can create sophisticated robot applications that accurately perform complex tasks.
Figure: NVIDIA Isaac ROS architecture
Source: Nvidia Enterprise Solutions WeChat official account, Minsheng Securities Research Institute
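To show how the ROS 2 side is typically consumed, below is a hedged sketch of a plain rclpy node subscribing to an image topic such as one an Isaac ROS perception graph might publish. The topic and node names are assumptions for the example; no Isaac ROS-specific API is used.

```python
# Sketch of a ROS 2 node that consumes an image topic; topic name is assumed.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class ImageListener(Node):
    def __init__(self):
        super().__init__("image_listener")
        self.create_subscription(Image, "/camera/color/image_raw",
                                 self.on_image, 10)

    def on_image(self, msg: Image):
        # Log basic metadata of each incoming frame.
        self.get_logger().info(f"frame {msg.width}x{msg.height}")


def main():
    rclpy.init()
    rclpy.spin(ImageListener())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```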
3.2 Construction of robot training environment
Isaac Sim leverages Omniverse's connector capabilities and built-in support for popular product-design formats: the advanced URDF importer has been tested on a wide range of robot models; CAD files can be imported directly from Onshape and from STEP files with minimal post-processing; and, to make it easier to populate environments with assets, Isaac Sim supports a ShapeNet importer that provides access to a large library of 3D assets.
3.3 Robot Training
Training robots requires large and diverse datasets, and preparing them can be time-consuming, costly, dangerous, or, in some extreme situations, impossible. With Isaac Sim's Omniverse Replicator, developers can use synthetic data early in the development cycle to accelerate proof-of-concept work and validate ML workflows; later in the cycle, synthetic data can augment real data and shorten the time needed to train production models.
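As a rough sketch of how such synthetic data generation looks in practice, the snippet below uses the Omniverse Replicator Python API to randomize a labeled object and write RGB images plus bounding boxes. It must run inside an Omniverse/Isaac Sim Python environment; the scene contents, frame count, and output directory are illustrative assumptions.

```python
# Hedged sketch of a Replicator randomization; must run inside Omniverse/Isaac Sim.
import omni.replicator.core as rep

with rep.new_layer():
    camera = rep.create.camera(position=(0, 0, 10), look_at=(0, 0, 0))
    render_product = rep.create.render_product(camera, (512, 512))
    cube = rep.create.cube(semantics=[("class", "cube")])

    # Randomize the cube pose on every captured frame.
    with rep.trigger.on_frame(num_frames=20):
        with cube:
            rep.modify.pose(
                position=rep.distribution.uniform((-2, -2, 0), (2, 2, 0)))

# Write RGB images and 2D bounding boxes to disk.
writer = rep.WriterRegistry.get("BasicWriter")
writer.initialize(output_dir="_out_synthetic", rgb=True,
                  bounding_box_2d_tight=True)
writer.attach([render_product])
rep.orchestrator.run()
```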
NVIDIA has also released Eureka, a robot-training assistant that generates reward programs so robots can learn by trial and error; it outperforms reward programs written by human experts on more than 80% of tasks and improves robot performance by more than 50% on average. Eureka uses GPU-accelerated simulation in Isaac Gym to rapidly evaluate large numbers of candidate rewards, improving training efficiency; it then summarizes key statistics from the training results and guides the LLM to refine its reward-function generation, allowing the AI to improve itself. Eureka has taught many kinds of robots, including quadrupeds, bipeds, quadcopters, dexterous hands, and collaborative robot arms, to complete different types of tasks.
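For intuition, here is a toy example of the kind of vectorized, GPU-friendly reward function that reward-generation approaches like Eureka produce for massively parallel Isaac Gym environments. The task (driving an end-effector toward a target), the tensor names, and the coefficients are assumptions for illustration, not Eureka output.

```python
# Toy vectorized reward: one scalar per parallel environment on the GPU.
import torch

def compute_reward(ee_pos: torch.Tensor, target_pos: torch.Tensor,
                   joint_vel: torch.Tensor) -> torch.Tensor:
    # Distance term: exponential shaping rewards getting close to the target.
    dist = torch.norm(ee_pos - target_pos, dim=-1)
    dist_reward = torch.exp(-4.0 * dist)
    # Penalty term: discourage high joint velocities for smoother motion.
    vel_penalty = 0.01 * joint_vel.pow(2).sum(dim=-1)
    return dist_reward - vel_penalty
```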
3.4 Robot deployment
Isaac AMR is a platform for building next-generation fleets of autonomous mobile robots (AMRs). It includes edge-to-cloud software services, computing, and a set of reference sensors and robot hardware for simulating, validating, deploying, optimizing, and managing AMR fleets, providing advanced mapping, autonomy, and simulation capabilities in large, highly dynamic, unstructured environments. Following the blueprint Isaac AMR provides, state-of-the-art AMRs can be deployed at lower cost and higher speed. Isaac AMR is built on the NVIDIA Nova Orin reference architecture, which integrates multiple sensors, including stereo cameras, fisheye cameras, and 2D and 3D lidar, with the powerful NVIDIA Jetson AGX Orin system-on-module. It also ships with some of the most advanced AI and hardware-accelerated algorithms and runs them in real time at the edge with 275 TOPS of compute, serving as the AMR's studious brain and its ever-watchful eyes on road conditions. Third-party enterprises and other developers can also build on Isaac AMR to meet their own needs.
Figure: Nvidia Isaac AMR deployment diagram
Source: Nvidia Enterprise Solutions WeChat official account, Minsheng Securities Research Institute
Disclaimer: All information in this document comes from NVIDIA's official website and NVIDIA's official WeChat accounts, and does not constitute investment advice under any circumstances.