Of all the criticisms of AI, this is the one that’s massively overstated.
On my PC, the task energy of a casual diffusion run (say, a dozen-plus images across a few batches) on a Flux-tier model is about 300 W × 240 seconds.
That’s 72 kilojoules.
…That’s less than microwaving leftovers, or a few folks browsing this Lemmy thread on laptops.
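Back-of-envelope, in case anyone wants to check the math (the microwave figures below are my own assumed values, not measurements):

```python
# Rough energy check for the numbers above.
power_w = 300      # GPU draw while generating, watts
runtime_s = 240    # total time across the batches, seconds

energy_j = power_w * runtime_s     # 72,000 J
energy_kwh = energy_j / 3.6e6      # joules -> kilowatt-hours

# Assumed comparison: a 1,000 W microwave reheating leftovers for 3 minutes.
microwave_j = 1000 * 180           # 180,000 J

print(f"diffusion run: {energy_j / 1000:.0f} kJ = {energy_kwh:.3f} kWh")  # 72 kJ = 0.020 kWh
print(f"microwave:     {microwave_j / 1000:.0f} kJ")                      # 180 kJ
```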
And cloud models like Nano Banana are even more efficient than that, batching the heck out of generations on wider, more modern hardware and newer model architectures than my 3090 from 2020.
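To illustrate why batching helps, here’s a toy model where every number is made up purely for illustration: the fixed overhead of a generation pass amortizes across concurrent images, so per-image energy falls as batch size grows.

```python
# Toy amortization model (all numbers assumed, not measured).
fixed_w = 100      # assumed fixed overhead draw per pass, watts
per_image_w = 50   # assumed marginal draw per concurrent image, watts
step_s = 20        # assumed seconds per generation pass

for batch in (1, 4, 16):
    total_j = (fixed_w + per_image_w * batch) * step_s
    print(f"batch={batch:>2}: {total_j / batch / 1000:.1f} kJ per image")
# batch= 1: 3.0 kJ per image
# batch= 4: 1.5 kJ per image
# batch=16: 1.1 kJ per image
```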
Look. There are a million reasons corporate AI is crap.
But its power consumption is a meme perpetuated by tech bros who want to convince the world that infinite scaling is the only way to advance it. That’s a lie told to raise money, and it’s not where research is headed.
Yes, they’re building too many data centers, and yes, some of them in awful places, but that’s part of the con. They don’t really need them, and generating a few images is not burning through anyone’s water.