The backstory: Nvidia is a US tech firm founded in 1993 and one of the world's most valuable companies. As businesses across the globe race to develop artificial intelligence (AI) technology, Nvidia's chips have become a core component used to train and build this tech. It supplies major tech companies like Amazon, Microsoft, Google, Meta and Dell, and demand has been so great that it's led to a chip shortage. In fact, TSMC chairman Mark Liu said last year that he thought Nvidia was on track to become the world's biggest semiconductor firm.
It typically takes 1,000 people to build one chip, with each person needing a thorough understanding of every step in the process, according to Bryan Catanzaro, Nvidia's Vice President of Applied Deep Learning Research, who spoke to The Wall Street Journal. The high demand for Nvidia's graphics processing units (GPUs) means that AI developers are being put on months-long waiting lists to get the prized tech. So Nvidia has begun using its own AI to fast-track the production process.
The development: Now, Nvidia looks to be on track to become even more valuable than Amazon. The company's market capitalization has surged 40% so far this year to a whopping US$1.73 trillion as of Wednesday. That's just 3% below Amazon's market value and 6% behind that of Google's parent company, Alphabet. In 2023, Nvidia's stock more than tripled, making it the fifth most valuable company in the US market.
But there is concern that Nvidia's rapid growth could slow in the near future as other big tech companies, like Meta and Amazon, develop their own chips. If they succeed, the customers that account for a big chunk of Nvidia's revenue could become self-reliant. To put that in context, just five companies made up 46% of Nvidia's revenue in the last quarter. On top of that, experts have pointed out that once the AI training boom has passed, demand for these components could fall.
"AI chip demand will eventually normalize once the initial training build has been completed. The inference phase of AI is going to require less computing power than the training phase. High-powered PCs and phones could be capable enough to run inferences locally, reducing the need for growing GPU plants," said Sandeep Gupta.
Nvidia "is acutely sensitive to fluctuation in the topline growth and procurement decisions of top customers given the concentration of NVDA's sales," said Barclays. "This is especially relevant given the aforementioned competition in the chips market and potential for NVDA to lose revenue share if its tech customers develop self-sufficient chip manufacturing capacity."
"Nvidia's got great chips, and more importantly, they have an incredible ecosystem," said Dave Brown, who runs Amazon's chip efforts. That makes getting customers to use a new kind of AI chip "very, very challenging," he said.