Intel Vision 2024 offers a new look at Gaudi 3 AI chip

After first announcing the existence of the Gaudi 3 AI accelerator last year, Intel is set to put the chip in the hands of OEMs in the second quarter of 2024. Intel announced this and other news, including the new Xeon 6 brand and an open Ethernet standard for AI workloads, in a preview briefing held April 1 ahead of the Intel Vision conference, which will be held April 8-9 in Phoenix, Arizona.

Gaudi 3 AI Accelerator to Ship to Dell, Hewlett Packard Enterprise, Lenovo and Supermicro

The Gaudi 3 will launch with Dell, Hewlett Packard Enterprise, Lenovo and Supermicro as OEM partners.

Intel Gaudi 3 will be available from vendors in three form factors: mezzanine card, universal baseboard or PCIe CEM add-in card. Intel said Gaudi 3 delivers 40% faster large language model training and 50% faster LLM inference compared to NVIDIA's H100 AI chip.

Gaudi 3 can go toe-to-toe with NVIDIA's recently announced AI accelerator chip, Blackwell. Gaudi 3 is “highly competitive,” said Jeff McVeigh, corporate vice president and general manager of Intel's Software Engineering Group. McVeigh noted that real-world testing has not yet been possible for the two products.

The new Xeon 6 brand will arrive in the second quarter

The Xeon 6 processors come in both Performance-core (P-core) and Efficient-core (E-core) variants. E-core processors will ship in the second quarter of 2024, with P-core processors following shortly after.

The two variants of the Xeon 6 processors share the same platform foundation and software stack. The P-core variant is optimized for AI and compute-intensive workloads, while the E-core variant is optimized for efficiency across the same workloads. Intel said the Xeon 6 processor with E-cores delivers a 2.4x improvement in performance per watt and a 2.7x improvement in performance per rack compared to previous generations.

The Xeon 6 processor shows marked power savings compared to the 2nd Gen Intel Xeon processor: it requires fewer server racks for the same workload, which Intel said can reduce power consumption by up to 1 megawatt.

Network interface card supports the open Ethernet standard for AI workloads

As part of Intel's effort to provide a wide range of AI infrastructure, the company announced an AI network interface card for Intel Ethernet network adapters and Intel IPUs. The AI network interface cards, which are already in use by Google Cloud, will provide a secure way to offload functions such as storage, networking and container management, and to manage AI infrastructure, Intel said. The intent is to enable training and inference on the increasingly large generative AI models that Intel predicts organizations will want to run over Ethernet.

Intel is working with the Ultra Ethernet Consortium to create an open standard for AI networking over Ethernet.

AI network interface cards are expected to be available in 2026.

Wide-ranging scalable systems strategy aims to facilitate AI adoption

To prepare for what the company predicts will be the future of AI, Intel plans to implement a scalable systems strategy for enterprises.

“We want it to be open and for companies to have choices in hardware, software and applications,” Sachin Katti, senior vice president and general manager of Intel's Network and Edge Group, said at the preview briefing.

To achieve this, the Scalable Systems strategy provides Intel products for all AI segments within the enterprise: hardware, software, frameworks and tools. Intel is working with a variety of partners to make this strategy a reality, including:

  • Google Cloud.
  • Such.
  • Cohesity.
  • NAVER.
  • Bosch.
  • Ola Krutrim.
  • NielsenIQ.
  • Seekr.
  • IFF.
  • CtrlS Group.
  • Landing AI.
  • Roboflow.

Intel predicts a future of AI agents and functions

Katti said at the preview briefing that companies are currently in an era of AI co-pilots. Next could come an era of AI agents, which can coordinate other AIs to perform tasks autonomously, followed by an era of AI functions. As AI capabilities increase, groups of agents could take over the work of an entire department, Katti said.

SEE: Articul8, maker of a generative AI software platform, spun out from Intel in January. (TechRepublic)

Intel Competitors

Intel is trying to differentiate itself from its competitors by focusing on interoperability within an open ecosystem. Intel competes in the AI chip space with:

  • NVIDIA, which announced the next-generation Blackwell chip in March 2024.
  • AMD, which in February 2024 announced a new architectural solution for AI inference based on AMD Ryzen Embedded processors.

Intel competes for dominance in chip manufacturing with Taiwan Semiconductor Manufacturing Co., Samsung, IBM, Micron Technology, Qualcomm and others.

TechRepublic is covering Intel Vision remotely.
