After several twists and turns, Nvidia's GTC 2020 has finally taken place, this year as an online event. Surprisingly, it was not a live broadcast but a series of seven pre-recorded, edited videos.
Of course, the products are the point. At GTC 2020, Jensen Huang first unveiled a new professional-grade GPU based on the Ampere architecture, the A100. It integrates over 54 billion transistors, making it currently the world's largest 7nm processor, and its tensor cores have been substantially upgraded. FP32 performance reaches 19.5 TFLOPS, with 6,912 CUDA cores, 40GB of memory, and 1.6TB/s of memory bandwidth.
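For readers who want to check comparable numbers on their own hardware, here is a minimal sketch (my own illustration, not anything shown at GTC) that uses the CUDA runtime API to read a device's architecture, SM count, and memory size; an A100 reports as sm_80 with 108 SMs, and 108 SMs at 64 FP32 lanes each gives the 6,912 CUDA cores quoted above.

```cuda
// Minimal device-property query with the CUDA runtime API (illustrative sketch).
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "no CUDA device found\n");
        return 1;
    }
    printf("name:    %s\n", prop.name);
    printf("arch:    sm_%d%d\n", prop.major, prop.minor);   // an A100 reports sm_80
    printf("SMs:     %d\n", prop.multiProcessorCount);      // 108 on the A100
    printf("memory:  %.1f GB\n", prop.totalGlobalMem / 1e9);
    printf("mem bus: %d-bit\n", prop.memoryBusWidth);
    return 0;
}
```

Build it with nvcc and run it on the machine in question; the printed values can then be compared against the spec sheet figures above.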
In terms of tensor-core compute, the A100's FP16 tensor performance is 2.5 times that of the V100, and a 32-bit tensor floating-point mode (TF32) has been added. With structured sparsity, compute at FP16 precision reaches 5 times that of the V100, INT8 throughput is up 20-fold, and AI processing capability is correspondingly stronger.
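To make the tensor-core figures more concrete, the sketch below (my own, using the standard CUDA WMMA API rather than any code from the keynote) has a single warp multiply two 16×16 FP16 tiles on the tensor cores while accumulating in FP32, the same mixed-precision pattern those FP16 TFLOPS figures describe.

```cuda
// Illustrative sketch: one warp runs a 16x16x16 FP16 matrix multiply on the
// tensor cores via the WMMA API, accumulating the result in FP32.
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>

using namespace nvcuda;

__global__ void tile_matmul(const half *a, const half *b, float *c) {
    // Fragments describing one 16x16x16 tensor-core operation.
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;

    wmma::fill_fragment(c_frag, 0.0f);               // C = 0
    wmma::load_matrix_sync(a_frag, a, 16);           // leading dimension 16
    wmma::load_matrix_sync(b_frag, b, 16);
    wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);  // C = A*B + C on tensor cores
    wmma::store_matrix_sync(c, c_frag, 16, wmma::mem_row_major);
}

int main() {
    half *a, *b;
    float *c;
    cudaMallocManaged(&a, 16 * 16 * sizeof(half));
    cudaMallocManaged(&b, 16 * 16 * sizeof(half));
    cudaMallocManaged(&c, 16 * 16 * sizeof(float));

    // A = identity, B = all ones, so the product should be all ones.
    for (int i = 0; i < 16 * 16; ++i) {
        a[i] = __float2half(i / 16 == i % 16 ? 1.0f : 0.0f);
        b[i] = __float2half(1.0f);
    }

    tile_matmul<<<1, 32>>>(a, b, c);                 // a single warp
    cudaDeviceSynchronize();
    printf("c[0] = %.1f (expect 1.0)\n", c[0]);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Compiling requires a tensor-core-capable target, for example nvcc -arch=sm_80 for Ampere parts like the A100.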
Based on the A100, NVIDIA also launched a new integrated AI system, the DGX A100, which combines eight A100s for 5 petaflops of compute, 320GB of GPU memory, and 12.4TB/s of aggregate memory bandwidth. With performance that strong, the price is of course "considerable": the DGX A100 sells for $199,000.
Surprisingly, A100 GPUs can not only be combined but also partitioned. At the conference, Nvidia introduced Multi-Instance GPU (MIG), which can split a single A100 into up to seven independent GPU instances. Huang also noted that the DGX A100 has already shipped and will be used for COVID-19 research.
Of course NVIDIA, ever enthusiastic about autonomous driving, has also paired the A100 GPU with its Orin system-on-chip (SoC) family to extend its autonomous driving platform, scaling from ADAS systems up to the DRIVE AGX Pegasus robotaxi platform.
In addition, Nvidia released two EGX edge AI products, the EGX A100 and the Jetson Xavier NX: the former is aimed at larger commercial general-purpose servers, the latter at micro edge servers.