this post was submitted on 14 Aug 2025
Technology

[–] redsunrise@programming.dev 31 points 6 months ago (3 children)

Obviously it's higher. If it was any lower, they would've made a huge announcement out of it to prove they're better than the competition.

[–] ChaoticEntropy@feddit.uk 4 points 6 months ago

I get the distinct impression that most of the focus for GPT-5 was making it easier to divert their overflowing volume of queries to less expensive routes.
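Speculatively, that kind of cost-based routing could be sketched like this. OpenAI has not published how (or whether) GPT-5 routes queries, so everything here, including the classifier heuristic and model names, is hypothetical:

```python
# Hypothetical sketch: route easy queries to a small, cheap model and
# hard ones to a large reasoning model. Real routers would use a learned
# classifier, not keyword matching.

def estimate_difficulty(prompt: str) -> float:
    """Crude stand-in for a learned difficulty classifier."""
    hard_markers = ("prove", "derive", "step by step", "debug")
    score = 0.2 + 0.2 * sum(m in prompt.lower() for m in hard_markers)
    return min(score, 1.0)

def route(prompt: str) -> str:
    """Pick a backend model based on estimated difficulty."""
    if estimate_difficulty(prompt) > 0.5:
        return "large-reasoning-model"
    return "small-fast-model"

print(route("What's the capital of France?"))      # small-fast-model
print(route("Prove step by step that 2 + 2 = 4"))  # large-reasoning-model
```

The economics only work if most traffic lands on the cheap path, which is consistent with the "overflowing volume of queries" framing above.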

[–] Ugurcan@lemmy.world 3 points 6 months ago* (last edited 6 months ago) (3 children)

I’m thinking otherwise. I think GPT-5 is a much smaller model - with some fallback to previous models if required.

Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates into inferior quality in American Investor Language (AIL).

And 2025’s investors don’t give a flying fuck about energy efficiency.

[–] PostaL@lemmy.world 4 points 6 months ago (1 child)

And they don't want to disclose the energy efficiency becaaaause ... ?

[–] AnarchistArtificer@slrpnk.net 4 points 6 months ago

Because the AI industry is a bubble that exists to sell more GPUs and drive fossil fuel demand

[–] RobotZap10000@feddit.nl 2 points 6 months ago* (last edited 6 months ago) (1 child)

They probably wouldn't really care how efficient it is, but they certainly would care that the costs are lower.

[–] Ugurcan@lemmy.world 1 points 6 months ago

I’m almost sure they’re keeping that for the Earnings call.

[–] Sl00k@programming.dev 1 points 6 months ago

It also has a very flexible "thinking" nature, which means far fewer tokens spent on most people's responses.
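Back-of-the-envelope, skipping the hidden "thinking" phase on easy queries cuts billed tokens sharply. The price and token counts below are made up purely for illustration, not OpenAI's actual numbers:

```python
# Hypothetical per-query cost: reasoning ("thinking") tokens are billed
# like output tokens, so skipping them on easy queries saves money.
PRICE_PER_MTOK = 10.0  # made-up output price, $ per million tokens

def query_cost(completion_toks: int, reasoning_toks: int) -> float:
    """Dollar cost of one response, counting hidden reasoning tokens."""
    return (completion_toks + reasoning_toks) * PRICE_PER_MTOK / 1_000_000

hard = query_cost(300, 2000)  # hard query: long hidden reasoning chain
easy = query_cost(300, 0)     # easy query: thinking skipped entirely
print(f"${hard:.4f} vs ${easy:.4f}")  # $0.0230 vs $0.0030
```

Under these made-up numbers the easy path is roughly 8x cheaper for the same visible answer, which is why adaptive thinking matters at scale.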

[–] morrowind@lemmy.ml 0 points 6 months ago (1 children)

It's cheaper though, so very likely it's more efficient somehow.