In the field of artificial intelligence (AI), chip performance directly determines the speed and efficiency of training and inference. In recent years, NVIDIA has dominated the data center and AI market with its powerful GPU technology. However, AMD, another chip giant, is also launching high-performance AI chips in an attempt to shake NVIDIA's market position. On October 10, 2024, AMD released a series of new AI products at the Advancing AI 2024 conference in San Francisco, with its flagship AI chip, the MI325X, drawing particular attention. According to AMD, the MI325X surpasses NVIDIA's H200 chip on a number of performance indicators, a major step forward for AMD in the AI chip field.
Introduction to the AMD MI325X AI chip
The AMD Instinct MI325X is a GPU accelerator designed for AI training and inference. It is built on AMD's latest CDNA 3 GPU architecture, contains 153 billion transistors, and carries 256 GB of HBM3e (5th-generation) high-bandwidth memory with a memory bandwidth of 6 TB/s. In terms of floating-point throughput, the MI325X delivers peak theoretical performance of 2.6 PFLOPs (2.6 quadrillion floating-point operations per second) at 8-bit floating-point (FP8) precision and 1.3 PFLOPs at 16-bit floating-point (FP16) precision, approximately 30 percent higher than the NVIDIA H200.
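As a rough sanity check on the "approximately 30 percent" figure, the short Python sketch below compares the MI325X numbers quoted above with the H200's publicly listed dense (non-sparse) peak throughput, roughly 1.98 PFLOPs at FP8 and 0.99 PFLOPs at FP16. The H200 values are assumptions taken from public spec sheets, not from this article.

# Hedged sketch: MI325X peak FLOPs quoted in the article vs. assumed H200 dense peaks.
mi325x = {"FP8": 2.6, "FP16": 1.3}    # PFLOPs, figures quoted in the article
h200 = {"FP8": 1.98, "FP16": 0.99}    # PFLOPs, assumed from public H200 spec sheets

for precision, amd_peak in mi325x.items():
    ratio = amd_peak / h200[precision]
    print(f"{precision}: MI325X is {ratio:.2f}x the H200 peak "
          f"(~{(ratio - 1) * 100:.0f}% higher)")

Under these assumed H200 figures, both precisions come out to roughly 1.31x, consistent with the ~30% claim; the ratio is the same at FP8 and FP16 because both chips double their FP8 throughput relative to FP16.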
At the launch event, AMD CEO Lisa Su presented training and inference performance data for the MI325X on Meta's Llama models. A server platform of eight MI325X GPUs delivers 40% higher inference performance on the Llama 3.1 405B model than the NVIDIA H200 HGX platform; in AI training, a single MI325X training Llama 2 7B is 1.1 times as fast as the H200, and eight MI325X GPUs training Llama 2 70B perform on par with the H200 platform. These figures demonstrate the strong performance of the MI325X in both AI training and inference.
AMD and NVIDIA's market competition
AMD and NVIDIA are competing ever more intensely in the AI chip market. NVIDIA released its next-generation AI chip, the H200, in November 2023; it integrates 141 GB of high-bandwidth memory and is particularly well suited to inference. The H200 offers a 60% to 90% performance increase over its predecessor, the H100, when used for inference or generating answers to questions. The launch of the AMD MI325X, however, challenges NVIDIA's leadership in the AI chip market.
AMD has been committed to providing a full suite of CPU, GPU and networking solutions to meet the needs of the modern data center. The release of the MI325X not only improves AMD's competitiveness in the AI chip market but also further solidifies its position in the data center market. According to AMD, the MI325X is expected to enter production in the fourth quarter of 2024 and to be available from partner vendors such as Dell, HP and Lenovo starting in the first quarter of 2025.
Time:2024-11-18