Pat Gelsinger, CEO of Intel, speaking on CNBC’s Squawk Box at the WEF Annual Meeting in Davos, Switzerland on January 16, 2024.
Adam Galici | CNBC
Intel on Tuesday unveiled its latest artificial intelligence chip, called Gaudi 3, as chipmakers rush to produce semiconductors that can train and deploy large artificial intelligence models, such as the one behind OpenAI’s ChatGPT.
Intel says its new Gaudi 3 chip is twice as power-efficient as Nvidia’s H100 GPU and can run AI models one and a half times faster. It also comes in different configurations, such as a bundle of eight Gaudi 3 chips on one motherboard or a card that can be installed in existing systems.
Intel said it tested the chip on models such as Meta’s open-source Llama and the Abu Dhabi-backed Falcon. It said Gaudi 3 can help train or deploy models, including Stable Diffusion and OpenAI’s Whisper model for speech recognition.
Intel says its chips use less power than Nvidia’s.
Nvidia has an estimated 80% of the AI chip market with its graphics processors, known as GPUs, which have been the top chip of choice for AI builders over the past year.
Intel said the new Gaudi 3 chips will be available to customers in the third quarter, and companies including Dell, Hewlett Packard Enterprise and Supermicro will build systems with the chips. Intel did not provide a price range for the Gaudi 3.
“We expect it to be extremely competitive” with Nvidia’s latest chips, said Das Kamhout, vice president of Xeon software at Intel, on a call with reporters. “From our competitive pricing, to our distinctive open embedded network-on-a-chip, to using industry-standard Ethernet. We believe it’s a strong offering.”
The data center AI market is also expected to grow as cloud providers and enterprises build infrastructure to develop AI software, suggesting there is room for other competitors even if Nvidia continues to make the vast majority of AI chips.
Running generative artificial intelligence models and buying Nvidia GPUs can be expensive, and companies are looking for additional suppliers to reduce costs.
The artificial intelligence boom has more than tripled Nvidia’s stock over the past year. Intel stock is up only 18% over the same time period.
AMD is also trying to expand and sell more AI chips for servers. Last year, it introduced a new data center GPU called the MI300X, which already counts Meta and Microsoft as customers.
Earlier this year, Nvidia unveiled the B100 and B200 GPUs, which are the successors to the H100 and also promise performance gains. These chips are expected to start shipping later this year.
Nvidia has been so successful thanks in large part to a powerful suite of proprietary software called CUDA, which allows AI scientists to access all of a GPU’s hardware capabilities. Intel is partnering with other chip and software giants, including Google, Qualcomm and Arm, to create open software that is not proprietary and could allow software companies to easily switch chip providers.
“We’re working with the software ecosystem to create open reference software, as well as building blocks that allow you to put together a solution that you need, rather than being forced to buy a solution,” Sachin Katti, senior vice president of Intel’s networking group, said on a call with reporters.
The Gaudi 3 is built on a five-nanometer process, a relatively recent manufacturing technique, suggesting the company is using an outside foundry to manufacture the chips. In addition to designing the Gaudi 3, Intel also plans to make AI chips, possibly for outside companies, at a new factory in Ohio expected to open in 2027 or 2028, CEO Pat Gelsinger told reporters last month.