Qualcomm's Bold Move: Launching AI Accelerator Chips to Challenge Nvidia and AMD

Qualcomm Enters AI Chip Market to Challenge Nvidia and AMD, Shares Jump 11%

October 27, 2025 — Qualcomm, best known for its wireless and mobile chips, is entering the market for data center AI chips, pitting the company directly against Nvidia and AMD. The news sent Qualcomm's stock up 11% on Monday.

New AI Chips for Data Center Inference

Qualcomm plans two new AI chips: the AI200, due in 2026, and the AI250, due in 2027. Both will ship in liquid-cooled server racks, with each rack able to house up to 72 chips that act together as a single AI computer. These systems are built to run large AI models in data centers.

The technology builds on the Hexagon neural processing units found in Qualcomm's smartphone chips. Durga Malladi, the company's head of data center and edge, said Qualcomm proved its strength in other markets before moving into data centers.

“We first built strength in other fields. Then it was natural to step up to the data center level,” Malladi said.

Entering the Fastest-Growing Market in Tech

Investors are pouring money into AI-focused data centers. McKinsey estimates that about $6.7 trillion will flow into data centers by 2030, with most of that spending going to systems built around AI chips.

Nvidia leads the field: its graphics processors hold more than 90% of the AI chip market, and they were used to train the large models behind services like ChatGPT. AMD is the second-largest player, and OpenAI has signaled that it wants to add more chip makers to its supplier list.

Big tech names such as Google, Amazon, and Microsoft also make their own chips for cloud work.

Focus on AI Inference and Operational Efficiency

Qualcomm is building its chips for inference, running already-trained AI models, rather than for training new ones. The company says its rack systems should lower operating costs for cloud providers. Each rack draws about 160 kilowatts, roughly in line with the power consumption of Nvidia's GPU racks.
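
For a rough sense of scale, the figures above can be combined into a per-slot power budget. The sketch below is a back-of-envelope estimate only: it assumes the full 160 kilowatts is split evenly across the 72 accelerator slots mentioned earlier, which ignores the power drawn by CPUs, networking, and cooling inside the rack.

```python
# Back-of-envelope estimate only; assumes the entire 160 kW rack budget
# is split evenly across 72 accelerator slots, ignoring CPUs, networking,
# and cooling overhead.
RACK_POWER_KW = 160     # per-rack power draw cited by Qualcomm
CHIPS_PER_RACK = 72     # chips acting as one unit, per the rack description

power_per_chip_kw = RACK_POWER_KW / CHIPS_PER_RACK
print(f"Rough per-chip power budget: {power_per_chip_kw:.2f} kW")
# -> Rough per-chip power budget: 2.22 kW
```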

Malladi said Qualcomm's approach lets cloud customers buy complete rack systems or mix and match individual components. He hinted that rivals such as Nvidia and AMD could even become customers for some of Qualcomm's parts, such as its central processing units.

“Our goal is to let customers choose full systems or mix parts,” Malladi explained.

Competitive Advantages and Partnerships

Qualcomm did not share pricing or chip counts. Still, it argues that its chips consume less power, cost less, and handle memory more efficiently. Its new AI cards support up to 768 gigabytes of memory, more than comparable parts from Nvidia and AMD.
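
Combining the per-card memory figure with the 72-slot rack described earlier gives an upper bound on aggregate capacity. This is a hypothetical combination, the article does not say that every slot carries a 768-gigabyte card, so treat the result as a ceiling rather than a spec.

```python
# Hypothetical upper bound: assumes every one of the 72 rack slots holds
# a card with the full 768 GB of memory, which the article does not confirm.
MEMORY_PER_CARD_GB = 768
CARDS_PER_RACK = 72

total_rack_memory_tb = MEMORY_PER_CARD_GB * CARDS_PER_RACK / 1024
print(f"Upper-bound rack memory: {total_rack_memory_tb:.1f} TB")
# -> Upper-bound rack memory: 54.0 TB
```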

In May, Qualcomm partnered with Saudi Arabia's Humain to supply AI chips to local data centers. Humain plans to deploy systems drawing up to 200 megawatts of power, an early signal of demand for Qualcomm's AI chips.

Market Impact

Qualcomm's move into AI data centers intensifies competition in a market dominated by Nvidia. The announcement, and the stock jump that followed, suggests investors see room for Qualcomm to carve out a share of the AI chip business.

As AI accelerates growth in cloud computing, big tech, and research, Qualcomm's chip expertise may help it secure a meaningful role in the AI chip market.


This article is based on reporting from CNBC and statements from Qualcomm executives.