Advanced Micro Devices (AMD) on Thursday (Oct. 10) unveiled a new artificial intelligence (AI) chip aimed at loosening Nvidia’s grip on the lucrative data center GPU market. The launch of AMD’s Instinct MI325X accelerator marks an escalation in the AI hardware arms race, with implications for businesses investing in AI.
AMD’s announcement came during its Advancing AI 2024 event, where the company revealed a broad portfolio of data center solutions for AI, enterprise, cloud and mixed workloads. This portfolio includes the new Instinct MI325X accelerators, 5th Gen AMD EPYC server CPUs, AMD Pensando Salina DPUs, AMD Pensando Pollara 400 NICs and AMD Ryzen AI PRO 300 series processors for enterprise AI PCs.
The generative AI boom, fueled by technologies like large language models, has created high demand for powerful GPUs capable of training and running complex AI systems. Nvidia has been the primary beneficiary of this trend, with its data center revenue surging in recent quarters.
“Nvidia’s dominant position in the AI chip market has remained virtually unchallenged,” Max (Chong) Li, an adjunct professor at Columbia University and founder and CEO of decentralized AI data provider Oort, told PYMNTS. “AMD’s new chip should at least provide some competition, which could lead to pricing pressure in the long term. Some reports have estimated that Nvidia earns as much as a 75% profit margin on AI chips. Should AMD start to eat market share, one would assume prices will begin to drop, as they often do across most industries when companies compete for customers.”
The CUDA ecosystem is Nvidia’s proprietary parallel computing platform and programming model, which has become the de facto standard for AI and high-performance computing tasks. AMD’s challenge extends beyond hardware performance to providing a compelling software ecosystem for developers and data scientists.
AMD has invested in its ROCm (Radeon Open Compute) software stack, reporting at the event that it has doubled AMD Instinct MI300X accelerator inferencing and training performance across popular AI models. The company said over one million models run seamlessly out of the box on AMD Instinct, triple the number available when MI300X launched.
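The “out of the box” claim rests largely on framework support: PyTorch builds for ROCm expose the same torch.cuda device interface that Nvidia GPUs use, so model code written for CUDA typically runs unchanged on AMD Instinct hardware. As a rough illustration (assuming a machine with a CUDA or ROCm build of PyTorch and Hugging Face’s transformers library installed; the model name is only an example), the same script targets whichever GPU is present:

```python
# Minimal sketch: identical PyTorch code runs on an Nvidia GPU (CUDA build)
# or an AMD Instinct GPU (ROCm build), because PyTorch's ROCm wheels map
# AMD devices onto the familiar torch.cuda interface.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "cuda" also resolves to a ROCm device on AMD hardware.
device = "cuda" if torch.cuda.is_available() else "cpu"

model_id = "facebook/opt-125m"  # illustrative small model; any causal LM works the same way
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

inputs = tokenizer("AI accelerators are", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```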
“AMD’s launch of the Instinct MI325X chip marks a significant step in challenging Nvidia’s dominance in the data center GPU market, but it’s unlikely to dramatically alter the competitive landscape immediately,” Dev Nag, CEO of QueryPal, a support automation company, told PYMNTS. “Nvidia’s 95% market share in AI chips is deeply entrenched, largely due to their mature and dominant CUDA ecosystem.
“The success of AMD’s initiative hinges not just on the performance of their chips, but on their ability to address the software side of the equation,” he added. “Nvidia spends about 30% of its R&D budget on software and has more software engineers than hardware engineers, meaning that it will continue to push its ecosystem lead forward aggressively.”
AMD’s intensified push into the market could affect businesses looking to adopt AI technologies. Increased competition might lead to more options and better pricing in the long term. Nag suggested that over the next two to three years, “as AMD refines its offerings and potentially gains market share, we could see more options at various price points. This could make AI hardware more accessible to small and medium-sized enterprises that have been priced out of the current market.”
According to Nag, immediate price drops are unlikely.
“Current demand for AI chips far outstrips supply, giving manufacturers little incentive to lower prices,” he told PYMNTS. “AMD appears to be positioning itself as a value option rather than significantly undercutting Nvidia on price.”
AMD’s focus on open standards could have broader implications.
“If successful, it could lead to more cost-effective solutions by reducing dependency on proprietary ecosystems like CUDA,” Nag said. “This approach could encourage more interoperability and flexibility in AI development, potentially making it easier for businesses to adopt and integrate AI solutions.”
Industry partners have responded positively to AMD’s announcement. The company showcased collaborations with major players, including Dell, Google Cloud, HPE, Lenovo, Meta, Microsoft, Oracle Cloud Infrastructure and Supermicro.
AMD Chair and CEO Lisa Su said in a statement that the data center AI accelerator market could grow to $500 billion by 2028. Even a tiny slice of this market could represent significant revenue for AMD (a 5% share, for example, would work out to roughly $25 billion a year), making its push into AI chips a critical strategic move.
For businesses across various sectors, from retail to manufacturing, a more competitive AI chip market could speed up the integration of AI into core operations and customer-facing services. More accessible and powerful AI hardware could make tasks like demand forecasting, process optimization and personalized customer experiences more feasible for a broader range of companies.
“Lower prices always lower barriers to entry and enable more businesses and people to take advantage of newer technologies,” Li said. “Take, for example, mobile phones. Back when they first debuted, the public’s view of a mobile phone user was that of a wealthy person in a fancy car making calls on the go. Now, most people in developed and many people in emerging countries tend to have at least a basic smartphone; soon, access to AI is likely to experience a similar adoption boom.”