this post was submitted on 15 Aug 2025
420 points (95.3% liked)


The University of Rhode Island's AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT's reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. Spread over 24 hours, it corresponds to an average draw of roughly 1.9 GW. A typical modern nuclear reactor produces between 1 and 1.6 GW of electricity, so data centers running OpenAI's GPT-5 at 18 Wh per query could require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
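The arithmetic is easy to check. A quick sketch using the article's figures (18 Wh per query, 2.5 billion queries per day):

```python
# Back-of-envelope check of the article's numbers (inputs are the
# article's estimates, not measured values).
wh_per_query = 18          # URI AI lab's estimated average for GPT-5
queries_per_day = 2.5e9    # ChatGPT's reported daily requests

gwh_per_day = wh_per_query * queries_per_day / 1e9   # Wh -> GWh
avg_gw = gwh_per_day / 24                            # average draw in GW

print(f"{gwh_per_day:.0f} GWh/day, ~{avg_gw:.1f} GW average draw")
# 45 GWh/day, ~1.9 GW average draw
```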

[–] brucethemoose@lemmy.world 14 points 16 hours ago* (last edited 16 hours ago) (19 children)

I don’t buy the research paper at all. Of course we have no idea what OpenAI does, because they aren’t open at all, but DeepSeek's published papers suggest it’s much more complex than one model per node… I think they recommended something like a 576-GPU cluster, with a scheme to split experts across it.

That, and going by the really small active parameter count of gpt-oss, I bet the model is sparse as heck.

There’s no way the effective batch size is 8, it has to be waaay higher than that.

[–] FaceDeer@fedia.io 8 points 16 hours ago (18 children)

And perhaps even more importantly, the per-token cost of GPT-5's API is less than GPT-4's. That's why OpenAI was so eager to move everyone onto it, it means more profit for them.

[–] Jason2357@lemmy.ca 8 points 16 hours ago (1 children)

I don’t believe API costs are tied all that closely to OpenAI's actual costs. They seem to be selling at a loss, and they may be selling at an even greater loss to make it look like they are progressing. The second OpenAI seems to have plateaued, their valuation will crash and it will be game over for them.

[–] FaceDeer@fedia.io 1 points 16 hours ago (2 children)

I based my argument on actual numbers that can be looked up and verified. You "believe" that they "seem" to be doing something else. Based on what?

[–] dan@upvote.au 8 points 16 hours ago* (last edited 16 hours ago) (2 children)

Their point is that those API prices might not match reality, and the prices may be artificially low to build hype and undercut competitors. We don't know how much it costs OpenAI, however we do know that they're not making a profit.

[–] brucethemoose@lemmy.world 5 points 16 hours ago (1 children)

Or it might not. It would be a huge short term risk to do so.

As FaceDeer said, we truly don't know.

[–] dan@upvote.au 5 points 15 hours ago (2 children)

OpenAI are not profitable today, and don't estimate they'll be profitable until 2029, so it's almost guaranteed that they're selling their services at a loss. Of course, that's impossible to verify - since they're a private company, they don't have to release financial statements.

[–] NotMyOldRedditName@lemmy.world 1 points 6 hours ago* (last edited 6 hours ago)

There's a difference between selling at a loss, and having a loss.

OpenAI lets people use models for free with very few limits other than reducing the model quality over time, and even those limits are generous before they throttle you.

That all costs money and is a loss for them.

If they get someone who's willing to pay, charge them $20/month, and on average net $5 profit per customer, they aren't selling at a loss; they just need more customers. It's possible that paid customers use it so much more that each one actually incurs a loss, and that they accept this to gain users while they figure out how to lower their costs, but that seems less likely.
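The distinction can be sketched in numbers. Everything here besides the $20/month price and the $5 net from the comment is a made-up illustration:

```python
# Hypothetical unit economics: a positive margin per paid user can still
# mean an overall loss once the free tier is counted.
price_per_month = 20.0       # subscription price ($), from the comment
cost_per_paid_user = 15.0    # assumed serving cost ($) -> $5 net, per the comment
free_users_per_paid = 10     # assumed ratio (illustrative only)
cost_per_free_user = 1.0     # assumed monthly serving cost per free user ($)

margin = price_per_month - cost_per_paid_user          # profit per paid user
free_burn = free_users_per_paid * cost_per_free_user   # free-tier cost per paid user

net = margin - free_burn
print(f"net per paid user after free-tier costs: ${net:.2f}")
# net per paid user after free-tier costs: $-5.00
```

With these assumed ratios the product isn't sold at a loss, yet the company still loses money overall, which is exactly the "selling at a loss vs. having a loss" distinction.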

[–] brucethemoose@lemmy.world 4 points 15 hours ago (1 children)

That’s not what I’m saying. They’ve all but outright said they’re unprofitable.

But revenue is increasing. Now, if it stops increasing like they’ve “leveled out”, that is a problem.

Hence it’s a stretch to assume they would cut prices on a more expensive model, since that would basically pop their bubble well before 2029.

[–] dan@upvote.au 2 points 13 hours ago* (last edited 12 hours ago)

Revenue is increasing, but according to their own estimates, it has to increase 10x in order for them to become profitable.

[–] FaceDeer@fedia.io 2 points 16 hours ago (1 children)

Sure, they might not. But he gives no basis for saying that other than what he "believes."

People in this community, and on the Fediverse in general, seem to be strongly anti-AI and would like to believe things that make it sound bad and unprofitable. So when an article like this comes along and says exactly what you want to believe, it's easy to just nod and go "knew it!" rather than investigating the reasons for those beliefs and risking finding out something you didn't want to know.

[–] dan@upvote.au 7 points 16 hours ago* (last edited 15 hours ago) (1 children)

> that make it sound bad and unprofitable

It is unprofitable, though.

OpenAI recently hit $10 billion in ARR and is likely to hit $12.7 billion by the end of the year, but they're still losing a lot of money. They don't think they'll make a profit until 2029, and only if they hit their target of $125 billion in revenue. That's a huge amount of growth, roughly 10x in four years, so I'm curious whether they'll actually hit it.
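For scale, the growth rate those figures imply can be worked out directly (a rough sketch, assuming a four-year horizon from this year's $12.7b to the $125b target):

```python
# Implied compound annual growth rate for the revenue target.
start = 12.7    # projected ARR this year ($B), from the comment
target = 125.0  # 2029 revenue target ($B), from the comment
years = 4       # assumed horizon

cagr = (target / start) ** (1 / years) - 1
print(f"required compound annual growth: {cagr:.0%}")
# required compound annual growth: 77%
```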

[–] FaceDeer@fedia.io 1 points 15 hours ago

Okay, make it sound worse and even more unprofitable.

Making their AI models cheaper to run (such as by requiring less electricity) is one step along that path to profitability.

[–] brucethemoose@lemmy.world 7 points 16 hours ago

To be fair, OpenAI's negative profitability has been extensively reported on.

Your point stands though; there's no evidence they're trying to decrease revenue. On the contrary, that would be a huge red flag to any vested interests.
