
NVIDIA’s autonomous car ambition gets a big boost with Drive Constellation and Toyota

By Vijay Anand - 20 Mar 2019

Validating autonomous vehicles is key before they take to the roads.

Making autonomous vehicles a reality

Besides championing real-time ray tracing and building new platforms to serve the ever-growing workloads of data scientists, NVIDIA has another long-running involvement: the car industry. From the days of merely powering instrument panels and in-car entertainment screens, NVIDIA is now a strong proponent of autonomous vehicles. It not only delivers a capable hardware platform for the task, but is also deeply focused on supporting the entire ecosystem, from software development and interface stacks to tackling the legal and safety requirements of making the autonomous vehicle future a reality.

A visual mock-up of Drive Sim at work, with the NVIDIA Drive Constellation system hidden away, simulating the feeds seen by the virtual car.

 

Simulation to advance autonomous vehicle reliability

The Drive Constellation compute platform is made up of the Drive Constellation Vehicle (standing server) and the Drive Constellation Simulator (server lying flat).

To that end, NVIDIA has now officially launched its Drive Constellation, a cloud-based platform that can drive autonomous vehicles millions, if not billions, of miles virtually, exposing them to a broad range of incidents and scenarios that could take forever to occur in the real world or would be dangerous to stage. This kind of training is necessary so that autonomous or autonomous-assistive cars can take off sooner, with appropriate legislation recognizing the array of tests and scenarios the cars have been subjected to. Not to mention the safety assurance that customers would require before embracing a new breed of cars. After all, safety is a core requirement for any car, more so an autonomous one where the car is in control of people's lives.

Danny Shapiro, NVIDIA's Senior Director of Automotive, personally gave the media a run-through of how Drive Constellation and Drive Sim work hand-in-hand in the new autonomous vehicle validation system.

First previewed at last year's GTC, the platform is now something NVIDIA is confident to offer to any vendor with autonomous driving ambitions. The NVIDIA Drive Constellation data center solution integrates NVIDIA Drive AGX Pegasus and runs the NVIDIA Drive Sim software to offer extensive testing and validation of the self-driving car under test.

With virtual simulation, we can increase the robustness of our algorithms by testing on billions of miles of custom scenarios and rare corner cases, all in a fraction of the time and cost it would take to do so on physical roads. - Rob Csongor, VP and GM of Automotive at NVIDIA.

To get into the details, Drive Constellation is a computing platform built from two purpose-specific servers. The first is the Drive Constellation Simulator, which uses several NVIDIA Tesla Volta GPUs in a system (somewhat like the DGX-1) to run the Drive Sim software that generates the sensor output (cameras, lidar, radar and others) of a virtual car driven in a virtual world. Using its powerful GPUs, virtual cars can be subjected to photorealistic data streams that mimic a range of real-world test environments and difficult scenarios. Even weather such as rain, hail or glare from the sun can be simulated as required to affect how the sensors respond, all without having to physically drive the car on the road and expose it, and others in the actual environment, to dangerous conditions should something go wrong.
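Drive Sim's scenario interface isn't detailed in this article, but conceptually each test bundles a virtual route, a sensor rig and environmental conditions that the simulator's GPUs then render into photorealistic sensor feeds. A rough Python sketch of what such a scenario description could look like (all class and field names here are hypothetical, not NVIDIA's actual API):

    # Hypothetical scenario description for a Drive Sim-style simulator.
    # Names are illustrative only; they are not NVIDIA's real interface.
    from dataclasses import dataclass, field

    @dataclass
    class SensorRig:
        cameras: int = 8     # surround cameras
        lidars: int = 2      # roof and front lidar
        radars: int = 6      # corner and long-range radar

    @dataclass
    class Scenario:
        route: str = "urban_intersection_left_turn"
        weather: str = "heavy_rain"      # rain, hail, sun glare, etc.
        time_of_day: str = "dusk"
        traffic_density: float = 0.8     # 0.0 (empty) to 1.0 (congested)
        sensors: SensorRig = field(default_factory=SensorRig)

    # Thousands of such scenarios, including rare corner cases, can be
    # queued and rendered in parallel on the simulator's GPUs.
    scenario = Scenario(weather="sun_glare", time_of_day="sunset")
    print(scenario)

The point of describing tests this way is repeatability: the exact same scenario can be replayed against every new software build.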

Simulation is life-like and that's the intent.

The second server is the Drive Constellation Vehicle, which contains the NVIDIA Drive AGX Pegasus AI car computer along with a whole host of sensor inputs (just as if it were in a real car) and runs the complete autonomous vehicle software stack, processing all the simulated sensor data as if it were coming from the sensors of a car driving on a physical road.

The driving decisions from Drive Constellation Vehicle are then fed back into Drive Constellation Simulator, enabling bit-accurate, timing-accurate hardware-in-the-loop testing. In fact, safety agencies like TÜV SÜD are already using the Drive Constellation platform to formulate their self-driving validation standards.
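The exact interface between the two servers isn't described here, but the hardware-in-the-loop cycle is straightforward to picture: the Simulator renders sensor frames, the Vehicle server runs the full AV stack on them and returns driving decisions, and those decisions advance the virtual world in lockstep. A hedged Python sketch of that closed loop, with simulator and vehicle_computer as hypothetical stand-ins for the real servers:

    # Illustrative hardware-in-the-loop cycle between the two servers.
    # simulator and vehicle_computer are hypothetical stand-ins; the
    # actual Drive Constellation interface is not public in this article.
    def run_hil_episode(simulator, vehicle_computer, steps=1000, dt=1 / 30):
        """Close the loop: simulated sensors in, driving decisions out."""
        state = simulator.reset()
        for _ in range(steps):
            # Drive Constellation Simulator renders camera/lidar/radar frames.
            sensor_frames = simulator.render_sensors(state)
            # Drive Constellation Vehicle (Drive AGX Pegasus) runs the AV
            # software stack on those frames and returns actuator commands.
            controls = vehicle_computer.step(sensor_frames)
            # The decisions are fed back, advancing the virtual world in
            # lockstep so the test stays bit- and timing-accurate.
            state = simulator.advance(state, controls, dt)
        return simulator.report(state)

Because the Vehicle server runs the same computer and software that would sit in the car, a failure caught in this loop points at the same code path that would have failed on the road.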

 

Safety Force Field to the rescue!

Further to the launch of the Drive Constellation virtual-reality simulation system for autonomous vehicles, NVIDIA has also defined a new computational defensive driving policy to safeguard autonomous vehicles from collisions. Dubbed the NVIDIA Safety Force Field (or SFF for short), this robust layer of safety policies is designed to prevent serious mishaps and will be built into the Drive control stack.

While the NVIDIA Drive AGX Pegasus is already meant to avert collisions, it doesn't hurt to have an added, predefined layer: a computational framework that determines a set of acceptable actions to keep the vehicle safe and ensure it doesn't cause or contribute to an unsafe scenario. According to NVIDIA, the policies behind these safeguards have undergone robust validation in real-world simulations.

Here's what else you'd be interested to know:

The policy jointly considers braking (longitudinal) and steering (lateral) constraints, enabling a wide range of possible actions. Customers can combine SFF with their own driving software, using it as a safety shield layer in their motion planning software that monitors and prevents unacceptable actions.
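NVIDIA's SFF mathematics isn't spelled out in this article, but the "safety shield" idea can be illustrated with a short, purely hypothetical Python sketch: the planner proposes a combined braking and steering action, and the shield only lets it through if a safety check predicts the action keeps the vehicle out of harm's way.

    # Minimal sketch of a safety-shield layer in the spirit of SFF.
    # The real SFF policy is far more involved; predict_is_safe() here
    # is a hypothetical placeholder supplied by the caller.
    from typing import Callable, NamedTuple

    class Action(NamedTuple):
        brake: float   # longitudinal: 0.0 (no braking) to 1.0 (full braking)
        steer: float   # lateral: -1.0 (full left) to 1.0 (full right)

    def shield(planned: Action, state, predict_is_safe: Callable) -> Action:
        """Pass through the planner's action if it is predicted to keep
        the vehicle safe; otherwise override with a conservative fallback."""
        if predict_is_safe(state, planned):
            return planned
        # Fallback: straighten the wheel and brake hard. A real system
        # would search the set of acceptable actions rather than hard-code one.
        return Action(brake=1.0, steer=0.0)

In a customer's stack, such a layer would sit after the motion planner, monitoring every proposed action rather than generating trajectories itself.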

SFF is mathematically designed such that autonomous vehicles equipped with SFF will, like magnets that repel each other, keep themselves out of harm’s way and not contribute to unsafe situations. - David Nister, VP of Autonomous Driving Software at NVIDIA.

 

NVIDIA and Toyota collaborate to accelerate the use of autonomous vehicles

At NVIDIA GTC 2019, Toyota Research Institute Advanced Development (TRI-AD) and NVIDIA announced a new collaboration to develop, train and validate self-driving vehicles.

The partnership builds on an ongoing relationship with Toyota to utilize the NVIDIA Drive AGX Xavier autonomous vehicle computer and is based on close development between teams from NVIDIA, TRI-AD in Japan and Toyota Research Institute (TRI) in the United States. The broad partnership includes advancements in:-

  • A.I. computing infrastructure using NVIDIA GPUs
  • Simulation using the NVIDIA Drive Constellation platform
  • In-car autonomous vehicle computers based on Drive AGX Xavier or Drive AGX Pegasus

The agreement includes the development of an architecture that can be scaled across many vehicle models and types, accelerating the development and production timeline, and simulating the equivalent of billions of miles of driving in challenging scenarios.

We believe large-scale simulation tools for software validation and testing are critical for automated driving systems. - Dr. James Kuffner, CEO of Toyota Research Institute-Advanced Development (TRI-AD).

Self-driving vehicles for everyday use and commercial applications in countless industries will soon be commonplace. Everything that moves will be autonomous. Producing all these vehicles at scale will require a connected collaboration for all elements of the system. Our relationship with TRI-AD and TRI is a model for that collaboration. - NVIDIA founder and CEO Jensen Huang.

 

Pair the right sensors for the right job

Sony Semiconductor Solutions was also present at GTC 2019 to promote Sony's automotive CMOS image sensors and to show other vendors that not all image sensors are equal. Namely, its IMX390 image sensors, which are ideal for near-field pedestrian and object detection, and its high-resolution IMX424 sensors, which are better suited to picking out vehicle types and lane markings from much further away. The quality and characteristics of a sensor also make a huge difference in how it handles bright environments (processed via HDR) and dimly lit ones such as night scenes. Thanks to the high sensitivity of these automotive CMOS sensors, capturing detail in very low light is still possible. The sensors also work readily with any of NVIDIA's Drive AGX system options.
