The Economics of ChatGPT: Can Large Language Models Turn a Profit?

On Wednesday, June 5, something amazing happened on the U.S. stock market. Nvidia (NVDA) became the first computer “hardware” stock to reach a $3 trillion valuation, largely on the success of its semiconductor chips for artificial intelligence functions.

Previously, only computer software makers such as Apple and Microsoft, which ordinarily earn higher profit margins than hardware makers, had hit this mark. But thanks to the astounding profit margins Nvidia has been able to earn on its AI chips, it has joined the $3 trillion club as well.

But not all the news is good. Sounding a cautionary note last week, ARK Investment head Cathie Wood warned investors that for Nvidia to deserve its rich valuation, “AI now has to play out elsewhere” and prove its value both to the companies that are developing artificial general intelligence and to the customers buying their services. Failing this, demand for AI chips will wither, and with it Nvidia’s valuation.

So what are the chances that software companies such as OpenAI, Microsoft, and Alphabet will make money on AI? Will payments from companies such as Apple, which is promising to put AI from OpenAI (and perhaps Google, too) on its iPhones, be enough to make AI companies profitable?

Taking ChatGPT for a test drive

As luck would have it, I recently had the opportunity to try to answer that question. I write a lot about defense stocks, and I post summaries of defense contract awards on Twitter (now X) for the benefit of other investors. Of course, doing this by hand is time-consuming. Might it be possible to use a program like ChatGPT to automate this work?

To find out, I recruited my daughter, Annabelle, to put her newly minted computer science degree to work. I would provide the data, and she would build an application programming interface (API) to access ChatGPT — and ask it to simplify my raw data into shorter summaries of the contracts.
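For readers curious what such a setup involves, here is a minimal sketch of how a request to OpenAI’s chat API might be assembled. The model name, prompt wording, and function name are illustrative assumptions, not the code my daughter actually wrote.

```python
# Hypothetical sketch: assembling a ChatGPT request that asks for a short
# summary of a defense contract award. The prompt and defaults are examples.

def build_summary_request(contract_text: str, max_words: int = 60) -> dict:
    """Build the chat-completion payload that would be sent to OpenAI."""
    return {
        "model": "gpt-3.5-turbo",  # the model the article settles on
        "messages": [
            {"role": "system",
             "content": f"Summarize this defense contract award in under {max_words} words."},
            {"role": "user", "content": contract_text},
        ],
    }

# With the official `openai` Python library, the payload would be sent like so:
#   from openai import OpenAI
#   client = OpenAI()  # reads OPENAI_API_KEY from the environment
#   response = client.chat.completions.create(**build_summary_request(raw_text))
#   print(response.choices[0].message.content)
```

The payload-building step is separated out here only to keep the example self-contained; the actual network call requires an API key and is shown in comments.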

In the process, we stumbled upon ChatGPT’s rate sheet.

And my jaw dropped.

How much does ChatGPT cost?

Pricing its output in “tokens,” which it describes as “pieces of words,” and selling these tokens to users in batches of 1 million, OpenAI (the company that owns ChatGPT) charges anywhere from $0.02 to $15 per million tokens, depending on which particular large language model a user requires. We decided that a model called GPT-3.5 Turbo would suit our purposes.

Its cost: $1.50 per million tokens.

That’s really not a lot of money when you consider that those 1 million tokens will generate about 750,000 words of text — and save me countless hours of labor over the course of a year. As a ChatGPT user, I was thrilled. But as an OpenAI investor, I admit I was a little worried about whether the company will be able to make any money like this, especially given all the talk of late about how AI is an energy hog and electricity prices are rising.

To dig into this question a little, we asked ChatGPT a simple question: “What is the electricity cost to ChatGPT of answering this question?”

ChatGPT spat out a convoluted response explaining how it calculated its energy cost to answer the question. The whole answer consumed 390 words, burning through about 0.052% of my tokens. So I figure the answer cost me roughly $0.00078.
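The arithmetic behind those figures is easy to check. Using OpenAI’s rough rule of thumb that a token is about three-quarters of an English word (so roughly 4/3 tokens per word), a 390-word answer works out to about 520 tokens:

```python
# Back-of-the-envelope check of the cost figures above.

PRICE_PER_TOKEN = 1.50 / 1_000_000  # GPT-3.5 Turbo: $1.50 per million tokens
TOKENS_PER_WORD = 4 / 3             # ~0.75 words per token, per OpenAI's guidance

words = 390
tokens = words * TOKENS_PER_WORD        # about 520 tokens
share = tokens / 1_000_000              # about 0.052% of a million-token batch
cost = tokens * PRICE_PER_TOKEN         # about $0.00078

print(f"{tokens:.0f} tokens, {share:.4%} of the batch, ${cost:.5f}")
```

All three numbers line up with the ones quoted in the text.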

But how much did it cost ChatGPT owner OpenAI to produce the answer? In 2022, OpenAI CEO Sam Altman suggested that costs could be pretty high — as much as a few cents per query.

But that was two years ago, before the “optimizing” began. When I asked ChatGPT last week, “What is the electricity cost to ChatGPT of answering this question?” it responded that OpenAI probably paid closer to $0.000006255 in energy, just six ten-thousandths of a cent. And if that’s the case, then despite the low cost to me, OpenAI was still able to charge me roughly 125 times more to generate the answer than it spent on the electricity to produce it, assuming the answer is accurate.
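That markup figure comes straight from dividing the two numbers above (taking ChatGPT’s self-reported energy estimate at face value):

```python
# The claimed markup: what I paid per answer versus OpenAI's estimated
# electricity cost to produce it (ChatGPT's own, unverified figure).

price_paid = 0.00078        # my cost for the 390-word answer, in dollars
energy_cost = 0.000006255   # ChatGPT's estimate of its electricity cost

markup = price_paid / energy_cost
print(f"Price charged was roughly {markup:.0f}x the energy cost")
```

That ratio is the margin cushion left over to cover everything else: training, chips, staff, and profit.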

Granted, energy isn’t the only cost that OpenAI, Microsoft, Alphabet, and others will incur in providing AI services. They also have to pay the cost of training their large language models, and they still have to buy the AI chips from Nvidia. At prices of $25,000 and up per chip, that’s a significant upfront cost. Still, scaled over billions of requests per day, I actually do believe these companies can make a profit, especially as competition from Nvidia rivals such as Intel and Advanced Micro Devices helps to push AI chip prices down and lower upfront costs.

Long story short, it’s possible that, over time and working at scale, OpenAI, Microsoft, Alphabet, and all the other companies working to make the AI revolution a reality really will be able to make money from this. That’s good news for them.

And to address Wood’s concern, it’s probably good news for Nvidia as well.

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Rich Smith has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Apple, Microsoft, and Nvidia. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short August 2024 $35 calls on Intel, and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
