this post was submitted on 16 Aug 2025
333 points (92.8% liked)

Technology

[–] JeremyHuntQW12@lemmy.world 12 points 14 hours ago (5 children)

I wouldn’t really trust Ed Zitron’s math analysis when he gets a very simple thing like “there is no real AI adoption” plainly wrong

Except he doesn't say that. The author of this article simply made that up.

There is a high usage rate (almost entirely ChatGPT, btw, despite all the money sunk into AI by others like Google), but it's all the free tier, and they are losing bucketloads of money at a rapidly accelerating rate.

But most tech startups run at a loss for a long time before they either turn a profit or get acquired.

There is no path to profitability.

[–] corbin@infosec.pub -4 points 14 hours ago (4 children)

I wrote the article, Ed said that in the linked blog post: "There Is No Real AI Adoption, Nor Is There Any Significant Revenue - As I wrote earlier in the year, there is really no significant adoption of generative AI services or products."

There is a pretty clear path to profitability, or at least to much lower losses. A lot more phones, tablets, computers, etc. now have GPUs or other hardware optimized for running small LLMs/SLMs, and both the large and small models are becoming more efficient. With both of those happening, a lot of the current uses for AI will move to on-device processing (this is already happening with Apple Intelligence and Gemini Nano), and the tasks that still need a cloud server will be more efficient and consume less power.

[–] meowgenau@programming.dev 4 points 13 hours ago (1 children)

a lot of the current uses for AI will move to on-device processing

How exactly will that make OpenAI and the likes more profitable?! That should be one of the scenarios that will make them less profitable.

[–] corbin@infosec.pub -5 points 13 hours ago* (last edited 12 hours ago)

If the models are more efficient, the tasks that still need a server will get the same result at a lower cost. OpenAI can also pivot to building more local models and license them to device makers, if it wants.

The finances of big tech companies aren't really relevant anyway, except to point out that Ed Zitron's arguments are not based in reality. Whether or not investors are getting stiffed, the bad outcomes of AI would still be bad, and the good outcomes would still be good.
