this post was submitted on 07 Mar 2026
858 points (97.5% liked)

Technology

82378 readers
3994 users here now

(page 2) 50 comments
[–] LiveLM@lemmy.zip 18 points 9 hours ago (2 children)

but should serve as a cautionary tale.

Jesus, there's a headline like this every month. How many tales do people need to learn???

[–] Atropos@lemmy.world 6 points 8 hours ago (1 children)

I am approaching caution critical mass.

Once the threshold is hit, I buy some solar panels and become an off grid farmer.

[–] Jankatarch@lemmy.world 3 points 8 hours ago

Caution Threshold!

[–] BrianTheeBiscuiteer@lemmy.world 50 points 11 hours ago (1 children)

Whether human, AI, or code, you don't give a single entity this much power in production.

[–] billwashere@lemmy.world 15 points 11 hours ago (1 children)

It’s why there are two keys to launch nukes.

[–] fubarx@lemmy.world 227 points 14 hours ago (3 children)

Given that the infrastructure description included the DataTalks.Club website, this resulted in a full wipe of the setup for both sites, including a database with 2.5 years of records, and database snapshots that Grigorev had counted on as backups. The operator had to contact Amazon Business support, which helped restore the data within about a day.

Non-story. He let Terraform zap his production site without offsite backups. But then support restored it all back.

I'd be more alarmed that a 'destroy' command is reversible.
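For what it's worth, a destroy doesn't have to be a surprise: Terraform can preview exactly what it would delete, and individual resources can additionally be protected with a `prevent_destroy` lifecycle flag. A minimal sketch:

```shell
# List every resource a `terraform destroy` would remove,
# without changing anything:
terraform plan -destroy
```

Reviewing that output is the last cheap chance to notice a database in the blast radius.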

[–] CubitOom@infosec.pub 72 points 14 hours ago

Distributed Non-Consensual Backup

[–] zr0@lemmy.dbzer0.com 18 points 12 hours ago (4 children)

For technical reasons, you never immediately delete records, as it is computationally very intense.

For business reasons, you never want to delete anything at all, because data = money.

[–] db2@lemmy.world 35 points 14 hours ago (1 children)

Never assume anything is gone when you hit delete.

[–] Vlyn@lemmy.zip 20 points 12 hours ago (2 children)

Except when it's your own data, then usually you're fucked.

[–] eleitl@lemmy.zip 61 points 12 hours ago (1 children)

"and database snapshots that Grigorev had counted on as backups" -- yes, this is exactly how you run "production".

[–] Nighed@feddit.uk 13 points 11 hours ago

With some of the cloud providers, the built-in backups are linked to the resource. So even if you have super duper geo-zone redundant backups for years, they still get nuked if you drop the server.

It's always felt a bit stupid, but the backups can still normally be restored by support.
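On AWS, for example, automated RDS snapshots are deleted along with their instance by default, while manually created snapshots survive it. A sketch of taking one before any risky change (the instance and snapshot names here are hypothetical):

```shell
# Manual RDS snapshots are not tied to the instance's lifecycle,
# so they persist even if the instance itself is deleted:
aws rds create-db-snapshot \
  --db-instance-identifier my-prod-db \
  --db-snapshot-identifier "my-prod-db-pre-change-$(date +%Y%m%d)"
```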

[–] Bieren@lemmy.today 5 points 7 hours ago

AI or not, this is on the person who gave it prod access. I don’t care if the dev was running CC in YOLO mode, not paying attention to it, or CC went completely rogue. Why would you give it prod access? This is human error.

[–] anon_8675309@lemmy.world 35 points 12 hours ago (1 children)

Mistakes happen. But how do you go 2.5 years without proper backups?

[–] 4grams@awful.systems 19 points 11 hours ago* (last edited 11 hours ago) (2 children)

It’s so easy. I can’t tell you how many “backed up” environments I’ve run into that simply cannot be restored. Often people set them up, but never test them, and assume the snaps are working.

Backups are typically only thought about when you need them, and by then it’s often too late. Real backups need testing and validation frequently, they need remote, off-site storage, with a process to restore that as well.

Been doing this shit for 30 years and people will never learn. I’d guess 9 out of 10 backup systems that I’ve run into were there to check a box on an audit, and never looked at otherwise.
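A restore drill can be as simple as a scheduled script that loads the latest dump into a scratch database and checks it actually contains data. A sketch assuming PostgreSQL, with the backup path and table name hypothetical:

```shell
#!/usr/bin/env bash
# Restore the newest dump into a throwaway database and sanity-check it.
set -euo pipefail

latest=$(ls -t /backups/*.dump | head -n 1)   # newest backup file
createdb restore_test                          # scratch database
pg_restore --no-owner -d restore_test "$latest"

# A backup that restores but holds no rows is still a failed backup:
count=$(psql -tAc "SELECT count(*) FROM users" restore_test)
dropdb restore_test
if [ "$count" -gt 0 ]; then
  echo "restore drill OK: $count rows"
else
  echo "restore drill FAILED"
  exit 1
fi
```

Running something like this weekly turns "we assume the snaps work" into an alert the day they stop working.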

[–] MountingSuspicion@reddthat.com 4 points 8 hours ago

Thank you for this comment. I have backups I tested on implementation and rummaged through two years ago after a weird corruption issue, but not once since. I still get alerts about them, so I just assume they're fine, but first thing Monday I'm gonna test them. I feel stupid for not having implemented regular checks already, but will do so now.

[–] bss03@infosec.pub 4 points 9 hours ago* (last edited 9 hours ago) (1 children)

I was a professional, and I didn't have a backup of my personal system for about 2 decades. I just didn't have another 4TiB of storage to copy my media library onto. I'm now on backblaze, but there was a long time there when I did not have a backup even tho I knew better.

Also, even in a professional setting, I've seen plenty of "production support" systems that didn't have a backup because they grew ad-hoc, weren't the "core business", and no one both recognized and spoke up about how important they were until after some outage. There's virtually never a test-restore schedule with such systems, so the backups are always somewhat suspect anyway.

It's very easy to find you (or your organization) without a backup, even if you "know better".

[–] melfie@lemy.lol 12 points 10 hours ago* (last edited 9 hours ago) (1 children)

Just a freak accident. Maybe next time, give it more permissions so it can fix any problems that occur. 😉

[–] just_another_person@lemmy.world 131 points 15 hours ago (2 children)

Whoever did this was incredibly lazy. Why are you using an agent to run your Terraform commands for you in the first place if it's not part of some automation? You're saving yourself, what, 15 seconds tops? You deserve this kind of thing for being like this.

[–] PabloSexcrowbar@piefed.social 24 points 13 hours ago (7 children)

Yeah, and to do that without some sort of DR in place is peak hubris.

[–] kautau@lemmy.world 13 points 12 hours ago (2 children)

It’s a grifter running a site called “aishippinglabs.com” which charges 500 euros for a “closed community of likeminded individuals”. He’s selling AI slop and a Discord channel to other idiots who will do exactly this kind of shit with little understanding of what is going on.

[–] criss_cross@lemmy.world 3 points 8 hours ago

Were they also into crypto 7 years ago?

[–] peopleproblems@lemmy.world 7 points 9 hours ago* (last edited 9 hours ago) (1 children)

The real reason I hate using LLMs is because I have to "think" like a social, non-software-engineer human.

For whatever fucking reason, I just can't get these things to be useful. And then I see idiots connecting an LLM to production like this.

Is that the problem? I literally can't turn my brain off. The only other nearly universal group of people that seems opposed to LLMs are psychologists and social workers, who seem to be universally concerned about their negative effects on mental health and their encouragement of abandoning critical thinking.

Like I can't NOT think through a problem. I already know more about my software than the AI could actually figure out. Anytime I go into GitHub Copilot and say "I want this feature" I get some code and the option to apply it. But the generated code is usually a duplicate of something and doesn't usually pick up or update existing models. The security flaws are rampant, and the generated tests don't do much of any real testing.

[–] jbloggs777@discuss.tchncs.de 4 points 8 hours ago (1 children)

It would be interesting to see the logs of your sessions, and compare them to the session logs of happy/productive-AI-coders.

I suspect that some people just think and express themselves in ways that don't vibe with LLMs. eg. Men are from Mars, AI coding agents are from Venus.

[–] ColeSloth@discuss.tchncs.de 23 points 12 hours ago

If your dumb fucking ass let an AI near your work AND you didn't have any recent backups it couldn't have access to, you're really extra fucking stupid.

[–] plateee@piefed.social 28 points 12 hours ago (4 children)

Jesus Christ, people. Terraform has a plan output option to allow for review prior to an apply. It's trivial to make a script that'll throw the JSON output into something like terraform visual if you don't like the diff format.

I've fucked up stuff with Terraform, but just once before I switched to a rudimentary script to force a pause, review, and then apply.
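A pause like that only takes a few lines of wrapper script: write the plan to a file, show the diff (or dump it as JSON for something like terraform visual), and apply only after an explicit yes. A sketch:

```shell
#!/usr/bin/env bash
set -euo pipefail

terraform plan -out=tfplan          # record exactly what would change
terraform show tfplan               # human-readable review of the diff
# terraform show -json tfplan > plan.json   # e.g. to feed a visualizer

read -rp "Apply this plan? (yes/no) " answer
if [ "$answer" = "yes" ]; then
  terraform apply tfplan            # apply the reviewed plan, nothing newer
else
  echo "Aborted; no changes applied."
fi
```

Applying the saved plan file also guarantees that what runs is exactly what was reviewed, not a fresh plan computed seconds later.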

[–] cmhe@lemmy.world 25 points 12 hours ago* (last edited 12 hours ago)

Don't worry, review was done by an LLM as well. ;)

[–] Deestan@lemmy.world 64 points 15 hours ago (4 children)

We don't need cautionary tales about how drinking bleach caused intestinal damage.

The people needing the caution got it in spades and went off anyway.

Or maybe the cautionary tale is to take caution dealing with the developers in question, as they are dangerously inept.

[–] edgemaster72@lemmy.world 7 points 10 hours ago

lol, lmao even

[–] Deestan@lemmy.world 37 points 14 hours ago

According to mousetrap manufacturers, putting your tongue on a mousetrap causes you to become 33% sexier, taller and win the lottery twice a week.

While some experts have urged caution that it may cause painful swelling, bleeding, injury, and distress, and that the benefits are yet to be proven, affiliated marketers all over the world paint a different, sexier picture.

However, it is not working out for everyone. Gregory here put his tongue in the mousetrap the wrong way and suffered painful swelling, bleeding, injury and distress while not getting taller or sexier.

Gregory considers this a learning experience, and hopes this will serve as a cautionary tale for other people putting their tongue on mousetraps: From now on he will use the newest extra-strength mousetrap and take precautions like Hope Really Hard that it works when putting his tongue in the mousetrap.

[–] Valthorn@feddit.nu 3 points 8 hours ago

What, is a requirement for Claude to work that you "sudo chmod -R 777 /" or something?

[–] zr0@lemmy.dbzer0.com 13 points 12 hours ago (2 children)

Hey Siri, what is a “backup”?

[–] HowAbt2day@futurology.today 11 points 12 hours ago

Siri: “sure! I’ll go right ahead and permanently delete everything.”
