Intel's new Gaudi 3 accelerators massively undercut Nvidia GPUs as AI race heats up

zohaibahd

What just happened? Intel threw down the gauntlet against Nvidia in the heated battle for AI hardware supremacy. At Computex this week, CEO Pat Gelsinger unveiled pricing for Intel's Gaudi 2 and next-gen Gaudi 3 AI accelerator chips, and the numbers look disruptive.

Pricing for products like these is typically kept hidden from the public, but Intel has bucked the trend and provided some official figures. The flagship Gaudi 3 accelerator will cost around $15,000 per unit when purchased individually, which is 50 percent cheaper than Nvidia's competing H100 data center GPU.

The Gaudi 2, while less powerful, also undercuts Nvidia's pricing dramatically. A complete 8-chip Gaudi 2 accelerator kit will sell for $65,000 to system vendors. Intel claims that's just one-third the price of comparable setups from Nvidia and other rivals.

For the Gaudi 3, that same 8-accelerator kit configuration costs $125,000. Intel insists it's two-thirds cheaper than alternative solutions at that high-end performance tier.

To provide some context for Gaudi 3 pricing, Nvidia's newly launched Blackwell B100 GPU costs around $30,000 per unit. Meanwhile, the high-performance Grace-Blackwell CPU+GPU combo, the GB200, sells for roughly $70,000.
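For a rough sense of scale, the kit prices above work out to the per-accelerator figures below. This is only back-of-the-envelope arithmetic on the numbers quoted in this article (the H100 figure is inferred from Intel's "50 percent cheaper" claim at roughly $15,000 per Gaudi 3); real-world pricing varies by volume and vendor.

```python
# Back-of-the-envelope per-accelerator pricing from the figures quoted above.
# These are announced/estimated prices; actual street pricing varies.

intel_kits = {
    "Gaudi 2": 65_000,    # 8-accelerator kit price
    "Gaudi 3": 125_000,   # 8-accelerator kit price
}

nvidia_units = {
    "H100": 30_000,             # inferred from Intel's "50% cheaper at ~$15,000" claim
    "Blackwell B100": 30_000,   # per unit, as quoted above
    "GB200 (CPU+GPU)": 70_000,  # per unit, as quoted above
}

for name, kit_price in intel_kits.items():
    print(f"{name}: ~${kit_price / 8:,.0f} per accelerator (kit price / 8)")

for name, unit_price in nvidia_units.items():
    print(f"{name}: ~${unit_price:,.0f} per unit")
```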

Of course, pricing is just one part of the equation. Performance and the software ecosystem are equally crucial considerations. On that front, Intel insists the Gaudi 3 keeps pace with or outperforms Nvidia's H100 across a variety of important AI training and inference workloads.

Benchmarks cited by Intel show the Gaudi 3 delivering up to 40 percent faster training times than the H100 in large 8,192-chip clusters. Even a smaller 64-chip Gaudi 3 setup offers 15 percent higher throughput than the H100 on the popular LLaMA 2 language model, according to the company. For AI inference, Intel claims a 2x speed advantage over the H100 on models like LLaMA and Mistral.

However, while the Gaudi chips leverage open standards like Ethernet for easier deployment, they lack optimizations for Nvidia's ubiquitous CUDA platform that most AI software relies on today. Convincing enterprises to refactor their code for Gaudi could be tough.
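To give a flavor of what that refactoring can look like, here is a minimal sketch of moving a PyTorch training step from CUDA to Gaudi. It assumes Intel's Gaudi PyTorch bridge (the habana_frameworks package that ships with the Gaudi software stack) is installed; exact module names and the need for mark_step() calls depend on the software release and execution mode.

```python
import torch
import torch.nn as nn

# On Gaudi, the Habana PyTorch bridge exposes the accelerator as the "hpu" device.
# (Assumes the habana_frameworks package from Intel's Gaudi software stack is installed.)
import habana_frameworks.torch.core as htcore

device = torch.device("hpu")          # was: torch.device("cuda")

model = nn.Linear(1024, 1024).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(32, 1024, device=device)
y = torch.randn(32, 1024, device=device)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
htcore.mark_step()                    # Gaudi-specific: flush the lazy-mode graph
optimizer.step()
htcore.mark_step()
```

In practice, the device string is the easy part; the bigger lift tends to be replacing CUDA-specific kernels, libraries, and tooling that a given codebase depends on.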

To drive adoption, Intel says it has lined up at least 10 major server vendors – including new Gaudi 3 partners like Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron. Familiar names like Dell, HPE, Lenovo, and Supermicro are also on board.

Still, Nvidia is a force to be reckoned with in the data center world. In the final quarter of 2023, the company claimed a 73 percent share of the data center processor market, and that number has continued to rise, eating into the shares of both Intel and AMD. The consumer GPU market isn't much different, with Nvidia commanding an 88 percent share.

It's an uphill battle for Intel, but these massive price differences may help close the gap.

 
Depends on what software runs on it. I find it hard to believe they would show similar performance in practical tasks.

Why? There is nothing particularly special about Nvidia's hardware. Just hype and "it's Nvidia." FYI, you won't have a bestseller just by having the best features, performance, power consumption, etc.
 
Why? There is nothing particularly special about Nvidia's hardware. Just hype and "it's Nvidia." FYI, you won't have a bestseller just by having the best features, performance, power consumption, etc.
AMD's GPUs are selling for about the same price per FPS as Nvidia's. They are not offering any shocking discounts. If Intel could make hardware that offered about the same performance as Nvidia's, it would not need to offer it at half the price.
Where is the fine print that explains exactly why Intel is so much cheaper? I am guessing the easiest answer: software.
I do not know if this is true, but it just sounds logical.
 
AMD's GPUs are selling for about the same price per FPS as Nvidia's. They are not offering any shocking discounts. If Intel could make hardware that offered about the same performance as Nvidia's, it would not need to offer it at half the price.
Where is the fine print that explains exactly why Intel is so much cheaper? I am guessing the easiest answer: software.
I do not know if this is true, but it just sounds logical.

For GPUs it makes no sense for AMD to discount prices, because the market buys Nvidia anyway. AMD also does not have much money to compete on price.

For AI accelerators, the situation is about the same, just swap AMD for Intel. The difference, however, is that Intel has money. Unlike AMD, Intel can "buy" market share with cheaper prices. In this case it's enough to add a modest markup, not extra-extra-extra-extra like Nvidia does. In other words, Nvidia's pricing has so much air in it that with reasonable pricing, Intel comes out much cheaper.
 
I think it is good to have alternatives, but Intel is clearly trying to compete on price across its entire range of products, which is not good for the company.
 
AMD's GPUs are selling for about the same price per FPS as Nvidia's. They are not offering any shocking discounts. If Intel could make hardware that offered about the same performance as Nvidia's, it would not need to offer it at half the price.
Where is the fine print that explains exactly why Intel is so much cheaper? I am guessing the easiest answer: software.
I do not know if this is true, but it just sounds logical.
Still, their MI300 series cards are much cheaper than Ngreedia's too.

Industry experts don't see Intel as a player at all. They say Intel has a poor history of support and its software stack is not widely embraced, unlike AMD's ROCm. I honestly believe smaller third-party players will dominate over Intel in the next few years in the AI market with specialised ASICs that are faster and way more efficient.
 
If Intel could make hardware that offered about the same performance as Nvidia's, it would not need to offer it at half the price.
Where is the fine print that explains exactly why Intel is so much cheaper? I am guessing the easiest answer: software.
I do not know if this is true, but it just sounds logical.
That's part of it. As the article states, most companies are deep into CUDA, making switching away from Nvidia impractical.
But hey, if it's cheaper to buy Intel and rewrite the code, some will switch.

The other part is that even at half or one-third the price, the margins are likely still considerable. It's just that Nvidia is making *that much profit*. There's a reason their market value is en route to becoming the highest of any company in the world (#2 at the moment, but Microsoft isn't far ahead).
 
CUDA is a tool to lock customers into the Nvidia ecosystem, and the execs making the buying decisions surely hate that fact.

AMD and Intel have a history of offering more open software, so they at least have a better reputation in that respect. The industry needs a common open platform, like Vulkan. Intel and AMD should start laying the groundwork.
 