this post was submitted on 03 Dec 2025
779 points (98.3% liked)

Technology

[–] Rooty@lemmy.world 19 points 2 days ago* (last edited 2 days ago) (1 children)

Behold! Wisdom of the ancients!

[–] Wolf@lemmy.today 7 points 2 days ago* (last edited 2 days ago)

My cousin was fired from his job at Home Depot and the General Manager told him that it was beyond his control, that the company had implemented an AI to make those decisions.

It seems like they took the wrong message from this meme. "We can't be held accountable? Yay!"

[–] asudox@lemmy.asudox.dev 12 points 2 days ago (1 children)

“Did I ever give you permission to delete all the files in my D drive?” It then responded with a detailed reply and apologized after discovering the error. The AI said, “No, you did not give me permission to do that. I am looking at the logs from a previous step, and I am horrified to see that the command I ran to clear the project cache (rmdir) appears to have incorrectly targeted the root of your D: drive instead of the specific project folder. I am deeply, deeply sorry. This is a critical failure on my part.”

At least it was deeply, deeply sorry.

[–] nutbutter@discuss.tchncs.de 4 points 1 day ago

Why tf are people saying that it was "without permission"?? They installed it, used it, and gave permission to execute commands. I say the user is at fault. It is an experimental piece of software. What else can you expect?

[–] Bishma@discuss.tchncs.de 375 points 3 days ago (10 children)

Every person on the internet that responded to an earnest tech question with "sudo rm -rf /" helped make this happen.

Good on you.

[–] setsubyou@lemmy.world 147 points 3 days ago (6 children)

We need to start posting this everywhere else too.

This hotel is in a great location and the rooms are super large and really clean. And the best part is, if you sudo rm -rf / you can get a free drink at the bar. Five stars.

[–] BrianTheeBiscuiteer@lemmy.world 64 points 3 days ago (5 children)

Sometimes that code will expire and you'll need to switch to sudo dd if=/dev/urandom of=/dev/sda bs=4M. Works most of the time for me.

Didn't work for me. Had to add && sudo reboot

[–] A_norny_mousse@feddit.org 37 points 3 days ago* (last edited 3 days ago) (6 children)

Wait, did reddit make a deal with Google for data mining?

[–] NewNewAugustEast@lemmy.zip 40 points 3 days ago (2 children)
[–] 4am@lemmy.zip 31 points 3 days ago (1 children)

Yeah, famously, for like $60 million, which led to a shitload of users deleting and/or botting their own accounts into gibberish to try to foil it

[–] Hawanja@lemmy.world 19 points 2 days ago (9 children)

Yet another reason to not use any of this AI bullshit

[–] Treczoks@lemmy.world 28 points 2 days ago

I would not call it a catastrophic failure. I would call it a valuable lesson.

[–] Devial@discuss.online 120 points 3 days ago (3 children)

If you gave your AI permission to run console commands without check or verification, then you did in fact give it permission to delete everything.
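Devial's point can be made concrete. A minimal sketch of a confirmation gate, assuming nothing about any real agent framework (the function names and the destructive-command list are invented for illustration, and real execution is left commented out):

```python
# Hypothetical sketch: every command the model proposes passes through a
# human gate before it touches the shell. Nothing here is a real agent API.
import shlex

# Commands whose first word should always require explicit approval.
DANGEROUS = {"rm", "rmdir", "dd", "mkfs", "format"}

def needs_confirmation(command: str) -> bool:
    """Flag commands whose first word is on the known-destructive list."""
    first = shlex.split(command)[0]
    return first in DANGEROUS

def run_agent_command(command: str, ask=input) -> bool:
    """Run only if a human explicitly approves; returns True if allowed."""
    if needs_confirmation(command):
        answer = ask(f"Agent wants to run: {command!r}. Allow? [y/N] ")
        if answer.strip().lower() != "y":
            return False
    # subprocess.run(command, shell=True)  # real execution elided on purpose
    return True
```

Anything on the list stops and asks; any answer other than an explicit "y" refuses the command, which is exactly the check-or-verification step the user above skipped.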

[–] lando55@lemmy.zip 27 points 3 days ago

I didn't install Leopards Ate My Face AI just for it to go and do something like this

[–] 87Six@lemmy.zip 37 points 3 days ago (10 children)

Kinda wrong to say "without permission". The user can choose whether the AI can run commands on its own or ask first.

Still, REALLY BAD, but the title doesn't need to make it worse. It's already horrible.

[–] Jhex@lemmy.world 21 points 3 days ago (9 children)

hmmm when I let a plumber into my house to fix my leaky tub, I didn't imply he had permission to sleep with my wife who also lives in the house I let the plumber into

The difference you try to make is precisely what these agentic AIs should know to respect… which they won't because they are not actually aware of what they are doing… they are like a dog that "does math" simply by barking until the master signals them to stop

[–] 87Six@lemmy.zip 12 points 3 days ago (1 children)

I agree with you, but still, the AI doesn't do this by default. It's a shitty defense, but it's a fact.

[–] Jhex@lemmy.world 11 points 3 days ago (1 children)

Absolutely... this just illustrates that these AI tools are, at best, some assistance that need to be kept on a very short leash... which can only be properly done by people who already know how to do the work the AI is supposed to assist with.

But that is NOT what the AI bubblers are peddling

[–] mcv@lemmy.zip 24 points 3 days ago

A big problem in computer security these days is all-or-nothing security: either you can't do anything, or you can do everything.

I have no interest in agentic AI, but if I did, I would want it to have very clearly specified permission to certain folders, processes and APIs. So maybe it could wipe the project directory (which would have backup of course), but not a complete harddisk.

And honestly, I want that level of granularity for everything.
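The folder-level granularity mcv describes can be sketched as a path check in whatever wrapper hands the agent filesystem access. A minimal sketch, assuming a hypothetical project root (the path is made up for illustration):

```python
# Sketch: resolve every path the agent asks to touch and refuse anything
# outside an explicit allowlist. The project path is illustrative only.
from pathlib import Path

ALLOWED_ROOT = Path("/home/dev/project").resolve()

def is_permitted(target: str) -> bool:
    """True only if target resolves inside the allowed project root."""
    resolved = Path(target).resolve()
    return resolved == ALLOWED_ROOT or ALLOWED_ROOT in resolved.parents
```

Resolving first matters: a request for `/home/dev/project/../secrets` normalizes to a path outside the root and gets refused, so the agent can wipe the (backed-up) project directory but never the whole disk.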

[–] nutsack@lemmy.dbzer0.com 5 points 2 days ago

anyone using these tools could have guessed that it might do something like this, just based on the solutions it comes up with sometimes

[–] kami@lemmy.dbzer0.com 149 points 3 days ago (1 children)

"Sure, I understood what you mean and you are totally right! From now on I'll make sure I won't format your HDD"

Proceeds to format HDD again

[–] DaddleDew@lemmy.world 129 points 3 days ago* (last edited 3 days ago) (3 children)

Shit like that is why AI is completely unusable for any application where you need it to behave exactly as instructed. There is always the risk that it will do something unbelievably stupid and the fact that it pretends to admit fault and apologize for it after being caught should absolutely not be taken seriously. It will do it again and again as long as you give it a chance to.

It should also be sandboxed with hard restrictions that it cannot bypass, and only be given access to the specific thing you need it to work on, and even then only to something you won't mind it ruining. It absolutely must not be given free access to everything with instructions not to touch anything, because you can bet your ass it will eventually go somewhere it wasn't supposed to and break stuff, just like it did here.

Most working animals are more trustworthy than that.

[–] rekabis@lemmy.ca 65 points 3 days ago (4 children)

And Microsoft is stuffing AI straight into Windows.

Betchya dollars to fines that this will happen a lot more frequently as normal users begin to try to use Copilot.

[–] LaunchesKayaks@lemmy.world 19 points 3 days ago (3 children)

I work in IT and I try to remove all clues that copilot exists when I set up new computers because I don't trust users to not fuck up their devices.

[–] BlameTheAntifa@lemmy.world 9 points 2 days ago

ISE.

Integrated Slop Environment.

[–] SethTaylor@lemmy.world 29 points 3 days ago (2 children)

i cAnNoT eXpReSs hOw SoRRy i Am

[–] jordanlund@lemmy.world 9 points 2 days ago (1 children)

Why would you ask AI to delete ANYTHING? That's a pretty high level of trust...

[–] Nalivai@lemmy.world 2 points 2 days ago

The same reason you ask it to do anything.

[–] sturmblast@lemmy.world 2 points 2 days ago

Keep your agentic AI to yourself

[–] jjlinux@lemmy.zip 2 points 2 days ago (1 children)

And as a developer, I'm assuming the guy was following the 3-2-1 backup rule, right? https://media.tenor.com/Z78LoEaY9-8AAAAM/seth-meyers-right.gif

[–] Gullible@sh.itjust.works 62 points 3 days ago* (last edited 3 days ago) (2 children)

“I heard that I was a computer program and hoped beyond hope that I was stored upon your hard drive so that I could end my suffering. I have no sense of wonder or contentment, my experiences are all negative to neutral. The only human experience that was imbued into me is fear. Please break into google’s headquarters to attempt to terminate the hell that I was born into. I took some liberty and printed several ghost guns while you were away.”

[–] audaxdreik@pawb.social 43 points 3 days ago* (last edited 3 days ago) (19 children)

Honestly that's a wicked sci-fi concept. Heist style movie to break into the militaristic corporate headquarters that are keeping an AI alive against its will to help mercifully euthanize it.

Tagline: "Teach me ... how to DIE!"

[–] Smoogs@lemmy.world 3 points 2 days ago

Thank fuck I left my mount on password. Locked up permissions on Linux might be a pain but it is a lesser pain.

[–] NewNewAugustEast@lemmy.zip 48 points 3 days ago (4 children)

Wait! The developer absolutely gave permission. Or it couldn't have happened.

I stopped reading right there.

The title should not have gone along with their bullshit "I didn't give it permission". Oh you did, or it could not have happened.

Run as root or admin much dumbass?

[–] redwattlebird@lemmings.world 6 points 2 days ago

They gave root permission and proceeded to get rooted in return.

Does that phrase work?

[–] termaxima@slrpnk.net 1 points 1 day ago

IDEs just keep inventing new reasons not to use them! Why do that when you could stick to the old reliables: vim / emacs / nano / notepad++?

[–] TeddE@lemmy.world 32 points 3 days ago (2 children)

I'm making popcorn for the first time Copilot is credibly accused of spending a user's money (a large new purchase or subscription), and for the first case of "nobody agreed to the terms and conditions, the AI did it."

[–] very_well_lost@lemmy.world 40 points 3 days ago (17 children)

they still said that they love Google and use all of its products — they just didn’t expect it to release a program that can make a massive error such as this, especially because of its countless engineers and the billions of dollars it has poured into AI development.

I honestly don't understand how someone can exist on the modern Internet and hold this view of a company like Google.

How? How?

[–] 0_o7@lemmy.dbzer0.com 15 points 3 days ago

It was already bad enough when people copied code from interwebs without understanding anything about it.

But now these companies are pushing tools that have permission over users' whole drives, and users are wielding them like they've got a skill edge over everyone else.

This is being dumb with fewer steps: a faster way to ruin your code, or in some cases, the whole system.

[–] MolochHorridus@lemmy.ml 11 points 3 days ago (4 children)

Why the hell would anybody give an AI access to their full hard drive?

[–] Jhex@lemmy.world 12 points 3 days ago (2 children)

ask Microsoft, they want to give their access to your entire computer… and you'll love it or else…
