this post was submitted on 27 Apr 2026
940 points (99.1% liked)

Technology

[–] deliriousdreams@fedia.io 2 points 11 hours ago (1 children)

He may be correct in that this has already happened to other companies, so they were forewarned and still took basically no steps to mitigate the risk or protect their backups.

The AI company bears some responsibility for the act because it programmed the tool. But the company using the tool also didn't take the precautions it should have.

AI is crap, and companies shouldn't be so gung-ho about it, if for no other reason than that situations like this appear prone to happen. Any tool you don't respect will bite you eventually.

[–] Jankatarch@lemmy.world 2 points 11 hours ago* (last edited 11 hours ago)

I was talking about how they don't even consider blaming themselves, lmao.

Using the most random and unreliable tool known to man was their decision, after all.

But they can afford to learn nothing, I guess.