AMD’s new MI300 Series chip lineup and AI industry forecast

AMD released the MI300 lineup on Wednesday, saying these accelerator chips can run AI software faster than the competition.

AMD Chief Executive Lisa Su holds the company's new MI300X chip at an event outlining AMD's artificial intelligence strategy in San Francisco, U.S., June 13, 2023. REUTERS/Stephen Nellis/File Photo

The backstory: As artificial intelligence (AI) progresses, there's a growing need for accelerators, which are chips that help train AI models. Nvidia is the big player here, grabbing an estimated 80-95% of the AI chip market. The company hit a US$1 trillion market value in May, joining the ranks of tech giants like Apple and Microsoft. Its success is largely due to its vital role in supplying the AI industry's essential components. But competitors like AMD and Intel are also stepping into the ring with their own AI-focused chips.

More recently: AMD made a big announcement in June about its new AI chip, the MI300X, part of the Instinct MI300 series. It can use up to 192GB of memory, meaning it can fit even bigger AI models than other chips. For context, Nvidia's competing H100 only supports 120GB of memory. AMD said the chip would be ready to ship to customers later in the year.

The development: AMD released the MI300 lineup on Wednesday, saying these accelerator chips can run AI software faster than the competition. CEO Lisa Su shared the news at an event in San Jose, California, where she also estimated that the AI chip industry will hit over US$400 billion in the next four years – more than double AMD's forecast from August. To put that in perspective, the entire chip industry was valued at US$597 billion in 2022, according to IDC. The MI300X has more than 150 billion transistors, 2.4 times the memory of Nvidia's leading H100 and 1.6 times its memory bandwidth. Meta, OpenAI and Microsoft all said they have plans to use the Instinct MI300X.

Key comments: 

“What this performance does is it just directly translates into a better user experience,” said AMD CEO Lisa Su. “When you ask a model something, you’d like it to come back faster, especially as responses get more complicated.”

“I think it’s clear to say that Nvidia has to be the vast majority of that right now,” said Su, referring to the AI chip market. “We believe it could be US$400 billion-plus in 2027. And we could get a nice piece of that.”

“OpenAI is working with and in support of an open ecosystem,” said OpenAI engineer Philippe Tillet in a statement. “We plan to support AMD’s GPUs including MI300” in the latest release of Triton.