this post was submitted on 15 Aug 2025
405 points (95.1% liked)

Technology

74073 readers
2679 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

The University of Rhode Island's AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT's reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI's GPT-5 at 18 Wh per query could require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
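The article's arithmetic can be sanity-checked in a few lines, using only the figures it cites (18 Wh per query, 2.5 billion queries per day):

```python
# Sanity check of the article's figures.
WH_PER_QUERY = 18.0       # URI AI lab's estimate for GPT-5
QUERIES_PER_DAY = 2.5e9   # reported ChatGPT request volume

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9  # Wh -> GWh
avg_power_gw = daily_gwh / 24                     # continuous draw in GW

print(f"{daily_gwh:.0f} GWh/day, ~{avg_power_gw:.2f} GW average draw")
# 45 GWh/day spread over 24 hours is about 1.9 GW of continuous draw,
# i.e. one to two reactors in the 1-1.6 GW range running flat out.
```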

(page 2) 47 comments
[–] kescusay@lemmy.world 13 points 14 hours ago (2 children)

How the hell are they going to sustain the expense to power that? Setting aside the environmental catastrophe that this kind of "AI" entails, they're just not very profitable.

[–] gdog05@lemmy.world 12 points 13 hours ago

Look at all the layoffs they've been able to implement with the mere threat that AI has taken their jobs. It's very profitable, just not in a sustainable way. But sustainability isn't the goal. Feudal state mindset in the populace is.

[–] homesweethomeMrL@lemmy.world 4 points 12 hours ago

Not just “not profitable”, they don’t make any money at all. Loss only.

[–] DarkCloud@lemmy.world 7 points 13 hours ago

This bubble needs to pop, the sooner the better.

[–] brucethemoose@lemmy.world 14 points 15 hours ago* (last edited 15 hours ago) (1 children)

I don’t buy the research paper at all. Of course we have no idea what OpenAI does because they aren’t open at all, but DeepSeek’s published papers suggest it’s much more complex than 1 model per node… I think they recommended something like a 576-GPU cluster, with a scheme to split experts across it.

That, and going by the really small active parameter count of gpt-oss, I bet the model is sparse as heck.

There’s no way the effective batch size is 8, it has to be waaay higher than that.

[–] FaceDeer@fedia.io 8 points 15 hours ago (2 children)

And perhaps even more importantly, the per-token cost of GPT-5's API is less than GPT-4's. That's why OpenAI was so eager to move everyone onto it, it means more profit for them.

[–] Jason2357@lemmy.ca 8 points 15 hours ago (1 children)

I don’t believe API costs are tied all that closely to the actual cost to OpenAI. They seem to be selling at a loss, and they may be selling at an even greater loss to make it look like they are progressing. The second OpenAI seems like they have plateaued, their valuation will crash and it will be game over for them.

[–] FaceDeer@fedia.io 1 points 15 hours ago (2 children)

I based my argument on actual numbers that can be looked up and verified. You "believe" that they "seem" to be doing something else. Based on what?

[–] dan@upvote.au 8 points 14 hours ago* (last edited 14 hours ago) (2 children)

Their point is that those API prices might not match reality, and the prices may be artificially low to build hype and undercut competitors. We don't know how much it costs OpenAI, however we do know that they're not making a profit.

[–] brucethemoose@lemmy.world 5 points 14 hours ago (1 children)

Or it might not. It would be a huge short term risk to do so.

As FaceDeer said, we truly don't know.

[–] dan@upvote.au 5 points 14 hours ago (2 children)

OpenAI are not profitable today, and don't estimate they'll be profitable until 2029, so it's almost guaranteed that they're selling their services at a loss. Of course, that's impossible to verify - since they're a private company, they don't have to release financial statements.

[–] brucethemoose@lemmy.world 4 points 14 hours ago (1 children)

That’s not what I’m saying. They’ve all but outright said they’re unprofitable.

But revenue is increasing. Now, if it stops increasing like they’ve “leveled out”, that is a problem.

Hence it’s a stretch to assume they would decrease costs for a more expensive model since that would basically pop their bubble well before 2029.

[–] dan@upvote.au 2 points 12 hours ago* (last edited 11 hours ago)

Revenue is increasing, but according to their own estimates, it has to increase 10x in order for them to become profitable.

[–] FaceDeer@fedia.io 2 points 14 hours ago (1 children)

Sure, they might not. But he gives no basis for saying that other than what he "believes."

People in this community, and on the Fediverse in general, seem to be strongly anti-AI and would like to believe things that make it sound bad and unprofitable. So when an article like this comes along and says exactly what you want to believe, it's easy to just nod and go “knew it!” rather than investigating the reasons for those beliefs and risking finding out something you didn't want to know.

[–] dan@upvote.au 7 points 14 hours ago* (last edited 14 hours ago) (1 children)

that make it sound bad and unprofitable

It is unprofitable, though.

OpenAI recently hit $10 billion in ARR and are likely to hit $12.7 billion by the end of the year, but they're still losing a lot of money. They don't think they'll make a profit until 2029, and only if they hit their target of $125 billion in revenue. That's a huge amount of growth - 10x in 4 years - so I'm curious whether they'll actually hit it.
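The 10x figure checks out against the numbers in the comment, and implies a steep compound growth rate:

```python
# Growth implied by the comment's figures (not official guidance).
arr_now = 12.7e9   # projected ARR by end of this year
target = 125e9     # revenue target the comment attributes to 2029
years = 4

multiple = target / arr_now
cagr = multiple ** (1 / years) - 1  # compound annual growth rate
print(f"{multiple:.1f}x growth needed, ~{cagr:.0%} per year for {years} years")
```

Roughly 9.8x over four years works out to sustaining about 77% revenue growth every single year.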

[–] FaceDeer@fedia.io 1 points 14 hours ago

Okay, make it sound worse and even more unprofitable.

Making their AI models cheaper to run (such as by requiring less electricity) is one step along that path to profitability.

[–] brucethemoose@lemmy.world 7 points 14 hours ago

To be fair, OpenAI's negative profitability has been extensively reported on.

Your point stands though; there's no evidence they're trying to decrease revenue. On the contrary, that would be a huge red flag to any vested interests.

[–] dan@upvote.au 2 points 14 hours ago (2 children)

How does OpenAI getting less money (with a cheaper model) mean more profit? Am I missing something?

[–] IrateAnteater@sh.itjust.works 8 points 14 hours ago (1 children)

Usually, companies will make their product say 25% cheaper to produce, then sell it to the public at a 20% discount (while loudly proclaiming to the world about that 20% price drop) and pocket the difference as extra margin. So if OpenAI is dropping the price by x, it's safe to assume that the efficiency gains work out to somewhat more than x.
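With made-up illustrative numbers (these are not OpenAI's actual costs or prices), the pattern described above looks like this:

```python
# Illustrative only: invented cost/price figures to show the
# "cut costs 25%, drop prices 20%, keep the difference" pattern.
old_cost, old_price = 100.0, 120.0
new_cost = old_cost * 0.75    # 25% cheaper to produce
new_price = old_price * 0.80  # 20% public price cut

old_margin = (old_price - old_cost) / old_price
new_margin = (new_price - new_cost) / new_price
print(f"margin: {old_margin:.1%} -> {new_margin:.1%}")
# The margin improves even though the sticker price fell.
```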

[–] dan@upvote.au 1 points 14 hours ago* (last edited 14 hours ago) (1 children)

Thanks! This makes sense, however OpenAI are not yet profitable. It's definitely possible that they're losing less money with the new models, though.

[–] IrateAnteater@sh.itjust.works 0 points 13 hours ago (1 children)

That "not profitable" label should be taken with a grain of salt. Startups will do all the creative accounting they can in order to maintain that label. After all, you don't have to pay taxes on negative profits.

[–] dan@upvote.au 2 points 12 hours ago

In the end, it still means their costs are greater than their revenue.

They've still got taxes they need to pay, too - things like payroll taxes, real estate taxes, etc.

[–] FaceDeer@fedia.io 1 points 14 hours ago

If the model is cheaper to run then they are able to reduce the price without reducing profit, which gives them an advantage over competitors and draws in more customer activity. OpenAI is far from a monopoly.

[–] Boxscape@lemmy.sdf.org 12 points 15 hours ago (1 children)
[–] ThePantser@sh.itjust.works 2 points 13 hours ago

OpenAI just needs to harness lightning. Incoming weather control tech.

[–] Melonpoly@lemmy.world 6 points 13 hours ago (6 children)

It takes less energy to dry a full load of clothes

[–] LodeMike@lemmy.today 6 points 12 hours ago (2 children)

40 watt-hours? That's the energy usage of a very small laptop.

[–] jj4211@lemmy.world 2 points 11 hours ago (1 children)

Well, over the course of an hour or two. But it's correct that a dryer run, even with a heat pump, uses significantly more than 40 Wh.
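A rough comparison, assuming ~1.5 kWh for a heat-pump dryer load and ~4 kWh for a conventional resistive dryer (typical ballpark figures, not measurements):

```python
# How many GPT-5 queries equal one dryer load, at 18 Wh/query?
query_wh = 18        # URI lab's per-query estimate
heat_pump_wh = 1500  # assumed heat-pump dryer load (~1.5 kWh)
resistive_wh = 4000  # assumed resistive dryer load (~4 kWh)

print(f"heat-pump load ~= {heat_pump_wh / query_wh:.0f} queries")
print(f"resistive load ~= {resistive_wh / query_wh:.0f} queries")
```

So one dryer load lands in the rough range of 80 to 220 queries under these assumptions.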

[–] nightwatch_admin@feddit.nl 11 points 15 hours ago (1 children)

Of course there are comments doubting the accuracy, which by itself is valid, but they are merely doing it to defend AI. IMHO, even at a fifth of the estimates, we’re talking humongous amounts of power, all for a so-so search engine, half-arsed chatbots and dubious nsfw images, mostly. And let’s not forget: the estimates may be inaccurate because they are TOO LOW. Now wouldn’t that be fun?

[–] simple@piefed.social 6 points 14 hours ago (2 children)

but they are merely doing it to defend AI.

No they're not, you can agree the research is garbage without defending AI. It literally assumes everything. GPT-5 could be using eight times the power. It could be using half the power. It could be using a quadrillion times the power. Nobody knows, because OpenAI keeps it secret.

[–] Catoblepas@piefed.blahaj.zone 2 points 12 hours ago (1 children)

It’s highly unlikely they reduced power usage—one of the most consistent criticisms of LLM and image generation—without advertising it.

[–] simple@piefed.social 4 points 12 hours ago

It's highly unlikely they would bring more attention to one of the biggest issues AI is causing, even if they did make it slightly better.

[–] vegeta@lemmy.world 8 points 15 hours ago
[–] Valmond@lemmy.world 7 points 15 hours ago (1 children)

40 Wh or 18 Wh, which is it?

That's my old gaming PC running a game for somewhere between 2 min 42 sec and 6 minutes… roughly.

[–] TropicalDingdong@lemmy.world 14 points 15 hours ago (1 children)
[–] FauxLiving@lemmy.world 1 points 7 hours ago

Doesn't matter, their audience isn't interested in accuracy; they only want more things to feel outraged about.

[–] DrFistington@lemmy.world 6 points 15 hours ago

But we get a huge increase in accuracy, from 30% to 30.5%! And it only took 5x the energy consumption!

[–] Steve@startrek.website 3 points 13 hours ago

And it sucks even worse.
