Nvidia CEO Jensen Huang announced today at the Consumer Electronics Show (CES) in Las Vegas that the company will partner with Toyota, the world's largest automaker by sales, to develop autonomous driving technology based on Nvidia's platform. The move is widely seen as direct competition with Tesla, and visual-computing AI is expected to reshape the global automotive industry.
In his announcement, Huang said that the success of Tesla and Waymo in the United States has proven the potential of visual AI for autonomous driving, and that the autonomous-vehicle industry could ultimately be worth trillions of dollars. He emphasized that autonomous driving depends on three computers: an AI training computer that supplies massive computational power, a simulation computer that can perceive and judge the environment, and a data-generation system. Depending on the commercial use case, these may be consolidated into one or two devices.
“I am very pleased to announce today that Toyota will be joining Nvidia in developing its next-generation autonomous vehicles,” Huang said. “With over a billion cars on the road driving over a trillion miles each year, these will become the autonomous vehicles of the future. This will be a massive industry worth trillions of dollars, and it is expected to grow by $5 billion annually.”
According to the presentation, Nvidia’s autonomous driving platform, “DRIVE AGX,” runs Nvidia DRIVE OS and is designed as a fully functional, safety-oriented autonomous driving computer. Beyond the core computing chip “Thor,” the platform takes input from numerous cameras, radar, and ultrasonic sensors; the driving information it detects is automatically turned into driving decisions and output as control instructions within the platform.
Huang noted that the platform’s prototype vehicle computer has also passed the highest level (ASIL D) of the automotive functional-safety standard. It has undergone extensive testing, with 70,000 lines of code, 15,000 engineer-hours invested, and 2 million real-world tests.
Huang also showcased a new system, “Omnimap,” which combines Omniverse with Nvidia Cosmos. The visual results demonstrated on the AGX platform were built up through extensive remote computation on DGX machines, so each AGX unit effectively carries a supercomputer’s worth of work inside it, rendering the virtual Omniverse world. Data accumulated by AGX across varied environments is processed by DGX to train neural networks and generate imagery, yielding more accurate and precise Omnimap rendering. DGX running in the background can simulate variations across different scenes and environmental conditions, allowing AGX’s autonomous driving to mature faster. Through this computation, driving scenarios that might otherwise be encountered only a few hundred times can be multiplied into billions of effective miles in Omniverse, raising the bar for safety and autonomy.
“With these physics-based capabilities, we can generate countless data behind the scenes to train AI. This artificial intelligence is grounded in physical principles, so it is very accurate and reasonable. We are very excited about this because, just as the computer industry developed in the past, we will see similar progress in the field of autonomous driving, and it will accelerate in the next few years,” Huang said.
Although Tesla was mentioned only briefly at the beginning, Nvidia’s collaboration with traditional automakers such as Toyota is seen as direct competition with Tesla. The biggest difference between Nvidia’s vehicle platform and Tesla’s current autonomous driving is that Nvidia’s is a general-purpose model: the test vehicles do not appear to be purpose-built autonomous cars, but conventional cars retrofitted with computers and sensors. This suggests the threshold for upgrading a traditional car to Nvidia’s autonomous driving may be high, and automakers may not be able to qualify every vehicle for the system. Much will depend on the compatibility between Nvidia’s AGX and each automaker’s vehicles, and the most likely scenario is that automakers will launch brand-new models built for autonomous driving.
By contrast, Tesla’s vehicles were designed for autonomous driving from the start. Even though current Teslas do not yet have fully autonomous capability, they stand a better chance of a painless upgrade than traditional automakers adopting Nvidia’s AGX solution.
In terms of the underlying approach, Tesla has shifted entirely to a vision-only mode and no longer relies on any radar sensors for fully autonomous driving, whereas Nvidia’s AGX combines visual AI with a hybrid sensor suite that includes lidar. This fundamental difference in system design and development pace may lead to different outcomes: Tesla has the pioneer’s advantage, but against the competition from Nvidia’s massive computational power, each may end up with its own strengths.