Qualcomm's AI chips will compete with Nvidia and AMD. What does this mean for the market?

Chipmaker Qualcomm has announced AI chips for data centers based on its Hexagon neural processing units (NPUs), which it previously used in smartphone chips. The company plans to launch the AI200 chip in 2026 and the AI250 in 2027, and will offer off-the-shelf rack solutions with liquid cooling.
Qualcomm says the AI200 and AI250 are designed for inference (running trained AI models to generate responses to user requests) rather than training. The company claims its rackmount systems will ultimately be cheaper for customers such as cloud providers to operate: a single rack consumes 160 kW, comparable to Nvidia-based racks. Qualcomm has not disclosed prices for the chips, boards or racks, nor the exact number of NPUs that can be installed in a single rack.
Qualcomm claims its AI chips offer advantages over competitors in power consumption, cost of ownership and memory architecture: its boards support 768 gigabytes of memory, more than Nvidia and AMD solutions.
Previously, Qualcomm specialized mainly in chips for wireless communications and mobile devices rather than large data centers. A year earlier, its Snapdragon chips won out over Intel processors when Microsoft recommended Qualcomm to manufacturers of Copilot+ notebooks as the more power-efficient option optimized for AI.
What this means for the market
Qualcomm's entry into the data center market marks the arrival of a new player in the fastest-growing technology segment: equipment for AI-oriented server farms, CNBC writes. McKinsey estimates that total capital investment in data centers will reach $6.7 trillion by 2030, with the majority going to systems built around AI chips.
The industry is currently dominated by Nvidia: its graphics processing units (GPUs) hold more than 90% of the market, and surging sales have pushed the company's market capitalization above $4.5 trillion, CNBC notes. Nvidia chips were used to train OpenAI's GPT models, which underpin the ChatGPT chatbot.
However, companies like OpenAI are looking for alternatives: this month, the startup announced plans to buy AI chips from AMD, the second-largest GPU maker. Google, Amazon and Microsoft are also developing their own AI chips for their cloud services, and Intel produces the Gaudi series of AI accelerators.
How the stock reacted
Qualcomm shares soared 21.6% to $205.50 in Monday trading, their highest level since July 2024.
Nvidia shares gained 2.6%, and Advanced Micro Devices rose 0.3%.
This article was AI-translated and verified by a human editor
