
Why the DeepSeek Breakthrough Is Good News for AI Stocks

Ouch… That is perhaps, in a word, the best way to describe the recent price action in AI stocks. And it was all spurred on by the launch of DeepSeek, China’s own powerful AI model. 

According to Ben Reitzes, head of technology research at Melius, DeepSeek achieves better learning and more efficient use of memory, as noted by CBS News. But perhaps the real kicker? It’s also vastly cheaper than ChatGPT. This revelation incited fears on Wall Street that companies won’t need to spend as much money to develop next-gen AI as previously thought. 

As a result, AI chipmaker Nvidia (NVDA) crashed as much as 15%. Vertiv (VRT), a data center equipment supplier, got clobbered, plunging more than 25%. Vistra (VST) and Oklo (OKLO) – two nuclear energy companies hoping to power the AI data center boom – each dropped 20%. 

It was a terrible day for AI stocks. 

But what if I told you that this whole DeepSeek-inspired selloff is actually a fantastic buying opportunity?

Because that’s exactly what we believe. 

DeepSeek: A Potential Paradigm Shift

China’s DeepSeek potentially represents a disruptive paradigm shift in the world of foundational AI models. 

Of course, the model operates a lot like ChatGPT. But as Reflexivity's Giuseppe Sette put it, it "activate(s) only the most relevant portions of their model for each query." That makes DeepSeek more efficient than incumbent models.

But the pièce de résistance? DeepSeek is way cheaper than ChatGPT.

When it comes to AI models, there are two primary costs – training and inference; that is, how much they cost to develop and how much they cost to run day-to-day. And reportedly, DeepSeek has significantly lower training and inference costs than incumbent foundational AI models.

Indeed, it reportedly cost about $80 million to train OpenAI's GPT-4. Google's Gemini Ultra cost nearly $200 million. Across the board, foundational AI models in the U.S. have taken $100 million-plus to train.

But DeepSeek claims that it cost less than $6 million to train its own AI model – one that reportedly performs on par with, or even better than, those big-budget incumbents.

Meanwhile, DeepSeek also boasts inference costs roughly 95% lower than ChatGPT's. Its reasoning model – R1 – currently charges about $0.55 per million input tokens. (A token is a small chunk of text – roughly a word or word fragment – that a model reads or writes.) Now, the consumer version of ChatGPT charges a flat monthly fee, so it's not exactly an apples-to-apples comparison. But a breakdown from Bernstein found that the incumbent charges around $15 per million input tokens.
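The size of that gap follows from simple arithmetic on the reported per-token prices. A rough sketch (the prices are as reported above, not official rate cards, and real bills also depend on output tokens and usage patterns):

```python
# Reported input-token prices, in USD per million tokens
deepseek_r1_price = 0.55   # DeepSeek R1, as reported
incumbent_price = 15.00    # incumbent model, per Bernstein's breakdown

# Fractional savings on input tokens alone
savings = 1 - deepseek_r1_price / incumbent_price
print(f"DeepSeek R1 input tokens cost ~{savings:.0%} less")
```

On these numbers the discount works out to roughly 96%, broadly consistent with the "about 95% lower" figure cited above.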

Point being: DeepSeek reportedly has drastically lower training and inference costs than incumbent foundational models. 

Understanding the Initial Market Meltdown

Now, why does that matter to the market?

This reported cost breakthrough suggests that companies will spend less money developing new AI models over the next several years. That means less money going into the AI infrastructure buildout, less money for companies supporting that buildout – and lower stock prices for those firms, too. 

A core tenet of the AI-stock bull thesis has been that companies and governments alike will collectively spend hundreds of billions of dollars per year to build out all the infrastructure necessary to support further AI model development. 

That core tenet rested on the critical (and, until now, unchallenged) assumption that AI models require a ton of time, money, resources, and computation power to build. 

DeepSeek challenges that assumption. 

If the firm’s claims are true, U.S. companies could replicate the tactics and methods used to create DeepSeek – because the model is open-source – to significantly drive down their own training and inference costs. In that world, companies and governments would have to spend much less than previously expected over the next few years to create new AI models.

Of course, that means that the projected AI infrastructure boom may be much smaller than once anticipated. For example, instead of companies and governments collectively spending hundreds of billions per year on the AI infrastructure buildout into 2030, maybe they only spend… say… $100 billion per year. 

That is the big fear driving AI stocks lower. 
