Nvidia has announced its next chip for self-driving cars years in advance:
First outlined as part of NVIDIA's DRIVE roadmap at GTC 2018, NVIDIA CEO Jensen Huang took the stage at GTC China this morning to properly introduce the chip that will be powering the next generation of the DRIVE platform. Officially dubbed the NVIDIA DRIVE AGX Orin, the new chip will eventually succeed NVIDIA's currently shipping Xavier SoC, which has been available for about a year now. In fact, as has been the case with previous NVIDIA DRIVE unveils, NVIDIA is announcing the chip well in advance: the company isn't expecting the chip to be fully ready for automakers until 2022.
What lies beneath Orin then is a lot of hardware, with NVIDIA going into some high-level details on certain parts, but skimming over others. Overall, Orin is a 17 billion transistor chip, almost double the transistor count of Xavier and continuing the trend of very large, very powerful automotive SoCs. NVIDIA is not disclosing the manufacturing process being used at this time, but given their timeframe, some sort of 7nm or 5nm process (or derivative) is pretty much a given. And NVIDIA will definitely need a smaller manufacturing process – for comparison, the company's top-end Turing GPU, TU102, takes up 754mm² for 18.6B transistors, so Orin will pack in almost as many transistors as one of NVIDIA's best GPUs today.
[...] All told, NVIDIA expects Orin to deliver 7x the performance of Xavier's 30 INT8 TOPS, with the combination of the GPU and DLA pushing 200 TOPS. It goes without saying that NVIDIA is still heavily invested in neural networks as the solution to self-driving systems, so they are similarly heavily investing in hardware to execute those neural nets.
[...] Finally, while NVIDIA hasn't disclosed any official figures for power consumption, it's clear that overall power usage is going up relative to Xavier. While Orin is expected to be 7x faster than Xavier, NVIDIA is only claiming it's 3x as power efficient. Assuming NVIDIA is basing all of this on INT8 TOPS as they usually do, then the 1 TOPS/Watt Xavier would be replaced by the 3 TOPS/Watt Orin, putting the 200 TOPS chip at around 65-70 Watts. Which is admittedly still fairly low for a single chip at a company that sells 400 Watt GPUs, but it could add up if NVIDIA builds another multi-processor board like the DRIVE Pegasus.
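The efficiency math above is easy to check. A back-of-envelope sketch, using only the figures from the article (NVIDIA has published no official power numbers for Orin, so the wattage here is an inference, not a spec):

```python
# Back-of-envelope check of the article's performance and efficiency claims.
xavier_tops = 30               # Xavier INT8 TOPS (per the article)
xavier_tops_per_watt = 1       # Xavier efficiency (per the article)

orin_tops = xavier_tops * 7                    # "7x faster" -> 210, marketed as 200
orin_tops_per_watt = xavier_tops_per_watt * 3  # "3x as power efficient"

# Use the marketed 200 TOPS figure to estimate power draw.
orin_power = 200 / orin_tops_per_watt
print(f"{orin_tops} TOPS raw, ~{orin_power:.1f} W")  # 210 TOPS raw, ~66.7 W
```

That ~67 W result is where the article's "around 65-70 Watts" estimate comes from.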
The design will include 12 ARM "Hercules" (Cortex-A78) cores rather than Nvidia-designed custom ARM cores.
Also at Wccftech.
Related: Nvidia Demos a Car Computer Trained with "Deep Learning"
Nvidia Announces Jetson Nano Single-Board Computer
Nvidia Supercomputer to Crunch Autonomous Vehicles Data
Related Stories
Many cars now include cameras or other sensors that record the passing world and trigger intelligent behavior, such as automatic braking or steering to avoid an obstacle. Today's systems are usually unable to tell the difference between a trash can and a traffic cop standing next to it, though.
This week at the International Consumer Electronics Show in Las Vegas, Nvidia, a leading maker of computer graphics chips, unveiled a vehicle computer called the Drive PX ( http://www.nvidia.com/object/drive-px.html ) that could help cars interpret and react to the world around them.
Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99
Today at GTC 2019 Nvidia launched a new member of the Jetson family: the new Jetson Nano. The Jetson family of products represents Nvidia's new focus on robotics, AI and autonomous machine applications. A few months back we had the pleasure of doing a high-level review of the Jetson AGX as well as the Xavier chip that powers it. The biggest concern with the AGX dev kit was its pricing – with retail costs of $2500 ($1299 as part of Nvidia's developer programme), it's massively out of range of most hobbyist users such as our readers.
[...] The Jetson Nano is a full-blown single-board computer in the form of a module. The module's form-factor and connector is SO-DIMM, similar to Nvidia's past modules. The goal is to be as compact as possible, as the module is envisioned for use in a wide variety of applications where customers will design their own carrier boards to best fit their needs.
At the heart of the Nano module we find Nvidia's "Erista" chip, the same Tegra X1 that powered the Nvidia Shield as well as the Nintendo Switch. The variant used in the Nano is a cut-down version though, as the 4 A57 cores only clock up to 1.43GHz and the GPU only has half the cores (128 versus 256 in the full X1) active. The module comes with 4GB of LPDDR4 and a 16GB eMMC module. The Jetson Nano module will be available to interested parties for $129.
$99 without storage.
Related: Nvidia Reveals Jetson Xavier SoC for Robots
An Anonymous Coward writes:
Nvidia has just announced their new supercomputer that will be used to train AI networks for self-driving cars. See https://www.autonomousvehicleinternational.com/news/computing/nvidia-supercomputer-to-handle-ai-data-from-autonomous-vehicles.html
Full text of the press release is below. There is a photo of the installation at the link:
Computing technology developer Nvidia has built the world’s 22nd fastest supercomputer – DGX SuperPOD – to provide the AI infrastructure needed to meet the demands of the company’s autonomous vehicle deployment program.
DGX SuperPOD was built in just three weeks using 96 Nvidia DGX-2H supercomputers and Mellanox interconnect technology. Delivering 9.4 petaflops of processing capability, it has the power needed for training the vast number of deep neural networks required for safe self-driving vehicles.
A single data-collection vehicle generates 1TB of data per hour. Multiply that by years of driving over an entire fleet, and you quickly get to petabytes of data. That data is used to train algorithms on the rules of the road — and to find potential failures in the deep neural networks operating in the vehicle, which are then re-trained in a continuous loop.
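The press release's "petabytes" claim follows directly from the 1TB/hour figure. A quick illustration, where the fleet size and duty cycle are purely hypothetical assumptions for the sake of the arithmetic (Nvidia gives no fleet numbers):

```python
# Illustrative fleet data-volume estimate from the 1TB/hour figure.
TB_PER_HOUR = 1        # from the press release
cars = 50              # assumed fleet size (hypothetical)
hours_per_day = 8      # assumed driving hours per car per day (hypothetical)
days = 365 * 2         # two years of collection (hypothetical)

total_tb = TB_PER_HOUR * cars * hours_per_day * days
total_pb = total_tb / 1000
print(f"{total_pb:.0f} PB")  # 292 PB
```

Even a modest 50-car fleet run for two years lands well into the hundreds of petabytes, which is the scale the SuperPOD is built to chew through.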
“Few AI challenges are as demanding as training autonomous vehicles, which requires retraining neural networks tens of thousands of times to meet extreme accuracy needs,” said Clement Farabet, vice president of AI infrastructure at Nvidia. “There’s no substitute for massive processing capability like that of the DGX SuperPOD.”
Powered by 1,536 Nvidia V100 Tensor Core GPUs interconnected with Nvidia NVSwitch and Mellanox network fabric, the DGX SuperPOD hardware and software platform takes less than two minutes to train ResNet-50. When this AI model came out in 2015, it took 25 days to train on the then state-of-the-art system, a single Nvidia K80 GPU. DGX SuperPOD delivers results that are 18,000 times faster.
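The 18,000x figure is straightforward arithmetic on the two training times quoted above:

```python
# Sanity check of the claimed ResNet-50 speedup: 25 days on a single K80
# in 2015 versus "less than two minutes" on the DGX SuperPOD.
k80_minutes = 25 * 24 * 60   # 25 days expressed in minutes
superpod_minutes = 2         # the quoted upper bound

speedup = k80_minutes / superpod_minutes
print(speedup)               # 18000.0 -> matches the claimed 18,000x
```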
While other TOP500 systems with similar performance levels are built from thousands of servers, DGX SuperPOD takes a fraction of the space and is roughly 400 times smaller than its ranked neighbors.
Will this be enough computrons? It seems every time another announcement is made in this field, it includes yet more compute power to train AIs on ever larger data sets. From what your AC has seen, there is still a good way to go before these network attached cars can match the competence of a good driver (not impaired, not distracted)--which might be, imo, one reasonable target before wide release of the technology.
Alternatively, will someone come up with better AI algorithms (more like people?) that will vastly change/reduce the amount of training required?
(Score: 0) by Anonymous Coward on Wednesday December 18 2019, @10:13PM (2 children)
What will they think of next?
(Score: 2) by takyon on Wednesday December 18 2019, @10:28PM (1 child)
Automated Carmageddon.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by Freeman on Wednesday December 18 2019, @11:54PM
Seems legit, just one small over-the-air update away.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by c0lo on Wednesday December 18 2019, @11:16PM (5 children)
The human brain energy consumption is roughly 20% of the entire body.
If we'd take it as a metric, the self-driving car SoCs available are still primitive in this regard.
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 3, Interesting) by takyon on Wednesday December 18 2019, @11:20PM (4 children)
https://www.anandtech.com/show/11913/nvidia-announces-drive-px-pegasus-at-gtc-europe-2017-feat-nextgen-gpus [anandtech.com]
(Score: 0) by Anonymous Coward on Wednesday December 18 2019, @11:50PM (3 children)
So now our cars are going to contribute to global warming?
(Score: 2) by Freeman on Wednesday December 18 2019, @11:58PM (1 child)
. . . please note, cars don't run on rainbows, and ever so few run on hydrogen. Personally, I think we missed the environmentally friendly boat by not picking up hydrogen vehicles and running with it.
(Score: -1, Troll) by Anonymous Coward on Thursday December 19 2019, @12:35AM
Nope... the big miss was when the idiot hippies stopped development of nuclear tech. If it weren't for those tree-hugging left wing morons we would have safe, clean nuclear power plants in every city. No AGW. And nuclear powered cars would have eliminated demand for fossil fuels relegating middle east goat-fuckers back to the medieval irrelevance they enjoyed before petrodollars made them so uppity.
(Score: 2) by takyon on Thursday December 19 2019, @12:17AM
After a few node shrinks, what took a kilowatt will be down to 100 Watts.
(Score: 2) by TheGratefulNet on Thursday December 19 2019, @12:24AM (1 child)
...like other nv chips in automotive do.
not naming names. nope. but some auto makers have been having LOTS of problems with nvidia and black (going dark) screens in cars.
personally, I think nv is clueless about how to fix it. its been going on for nearly 2 years now. admittedly, its a 'hard to fix' bug but still, I am not at all convinced nv can do a proper job in this field.
"It is now safe to switch off your computer."
(Score: 2) by Bot on Thursday December 19 2019, @12:14PM
So we'll let a company that can't fix a dark screen in two years manage some tons of metal at high speed.
I think this is a clever plot to convince us bots that the robocalypse is redundant.
Account abandoned.
(Score: 0) by Anonymous Coward on Saturday December 21 2019, @08:11PM
fuck you, nvidia!