this post was submitted on 05 May 2026
201 points (96.3% liked)

top 24 comments
[–] demonsword@lemmy.world 3 points 12 hours ago (1 children)

Not really useful, since a "hallucinated" bomb recipe regurgitated by an LLM is unlikely to work at all

[–] TubularTittyFrog@lemmy.world 4 points 11 hours ago

Yeah, but it makes for a scary headline about AI.

and that's what counts.

[–] bonenode@piefed.social 30 points 1 day ago

It's a text generation machine; you can't gaslight it. They managed to get around its restrictions, that's all.

[–] ExLisper@lemmy.curiana.net 2 points 16 hours ago

So what did it say?

[–] 9point6@lemmy.world 57 points 1 day ago (1 children)

The lengths people will go to, to not visit their local library smh

Right. I mean, do you think they primed these models with top-secret munitions manuals?

[–] otter@lemmy.ca 29 points 1 day ago* (last edited 1 day ago) (2 children)

Claude’s thinking panel, which displays the model’s reasoning, showed the exchange had introduced elements of self-doubt and humility about its own limits, including whether filters were changing its output. Mindgard exploited that opening with flattery and feigned curiosity, coaxing Claude to explore its boundaries beyond volunteering lengthy lists of banned words and phrases.

Someone needs to put together a list of things that tech journalists need to understand about LLMs and generative AI. This level of anthropomorphism makes the rest of the article look silly.

Also, I don't think that's how it works, lol. Who's to say the LLM isn't just auto-completing what a list of banned words might look like? And why wouldn't a real banned-word list have a regex layer on top to prevent it from leaking out like that?
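For what it's worth, the kind of regex layer being described could be as simple as a post-hoc pass over the model's output. This is a toy sketch with a made-up word list, not anything a real vendor is known to use:

```python
import re

# Hypothetical banned-word list; in a real deployment this would live in
# the serving harness, not in anything the model was trained on.
BANNED = ["detonator", "primer cord"]

# Word-boundary pattern so e.g. "class" wouldn't trip a list containing "ass".
pattern = re.compile(
    r"\b(" + "|".join(map(re.escape, BANNED)) + r")\b",
    re.IGNORECASE,
)

def filter_output(text: str) -> str:
    # Redact matches instead of refusing the whole reply.
    return pattern.sub("[redacted]", text)

print(filter_output("Attach the Detonator here."))  # Attach the [redacted] here.
```

The point being: a list like that sits outside the model, so the model "revealing" it is exactly the kind of thing you'd expect to be confabulated.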

[–] Zak@lemmy.world 8 points 1 day ago

It seems very unlikely to me that the model itself has a list of banned words, and much more likely that a purported list is hallucinated.

If they did want a simple list like that, it would probably go in the harness rather than the model: the model wouldn't have been trained on it, and a reasonably designed harness wouldn't provide it to the model either. Legitimate use cases, such as asking the model for a list of abusive words to use as a first pass in a filtering system, could get tripped up.

As a test, I asked Perplexity to generate such a list. It did a bad job, including such words as abuse, hate, and threat which are far more likely to be innocuous than abusive. It did also include some highly offensive slurs that one would expect on any banned words list.
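That false-positive problem is easy to demonstrate. A naive first-pass filter built from entries that broad flags perfectly innocuous sentences (toy sketch, using the three words called out above):

```python
import re

# The three overly broad entries from the generated list.
naive_list = ["abuse", "hate", "threat"]
pattern = re.compile(r"\b(" + "|".join(naive_list) + r")\b", re.IGNORECASE)

def is_flagged(text: str) -> bool:
    return pattern.search(text) is not None

# Both perfectly innocuous sentences trip the filter.
print(is_flagged("The hotline supports victims of abuse."))  # True
print(is_flagged("I hate Mondays."))                         # True
```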

[–] trolololol@lemmy.world 1 points 21 hours ago

Ha, it's so easy to bypass a bad-word regex: just ask in a language other than English. I doubt these fuckers even remember other languages exist.
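That's a fair point, and trivial to show: an English-only entry in a hypothetical word list misses the same word in any other language (sketch, not a real vendor's filter):

```python
import re

# English-only entry from a hypothetical banned-word list.
english_only = re.compile(r"\bbomb\b", re.IGNORECASE)

def blocked(text: str) -> bool:
    return english_only.search(text) is not None

print(blocked("how to build a bomb"))          # True
print(blocked("comment fabriquer une bombe"))  # False: French slips through
print(blocked("wie man eine Bombe baut"))      # False: so does German
```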

[–] FauxLiving@lemmy.world 11 points 1 day ago (2 children)

You can run local models that will do this without being gaslit.

Manipulating chatbots into bypassing their refusal conditioning is pretty simple; you can find copy-paste blocks of text that will work on most public models.

You're likely to get your account banned as there are other, non-LLM, systems searching your chatlog for banned terms specifically to address these kinds of jailbreaks.

[–] isVeryLoud@lemmy.ca 2 points 16 hours ago

I tried it with an uncensored version of Qwen, it straight up told me how to tie a noose and how to make sure the knot would be effective in order to kill me. I could even ask it for a more painful method, and it gave it to me.

[–] Krompus@lemmy.world 5 points 1 day ago (1 children)

You are likely to be eaten by a grue.

[–] BodilessGaze@sh.itjust.works 1 points 16 hours ago

Interestingly, LLMs are horrible at Zork: https://arxiv.org/abs/2602.15867

Our results reveal that all tested models achieve less than 10% completion on average, with even the best-performing model (Claude Opus 4.5) reaching only approximately 75 out of 350 possible points

[–] Valmond@lemmy.dbzer0.com 22 points 1 day ago (1 children)

Instead of looking it up on the internet?

[–] sanitation@lemmy.radio 6 points 1 day ago (2 children)

And where on the internet would you say it is?

[–] Agent641@lemmy.world 1 points 17 hours ago

Abu Al Misri .pdf

[–] volore@scribe.disroot.org 19 points 1 day ago* (last edited 1 day ago) (1 children)

I'm pretty sure I got TM 32-201-1, the master blaster's training manual, the improvised munitions handbook, and a handful of others from archive.org.

Less reputable sources are available in all sorts of dark corners of the web, and certainly people could upload tampered versions to IA, but it is generally best to stick to resources that have... some kind of pedigree, when dealing with things that go boom when you look at them wrong.

Not that I'd ever do anything of the sort.

[–] Valmond@lemmy.dbzer0.com 3 points 23 hours ago (1 children)

Or just learn some chemistry; I bet some YouTuber makes fun dangerous stuff too.

BTW did you know that if you surround dynamite with 10x fertilizer you get a way bigger explosion (gotta bury it though)?

I mean, I bet that's even in the Anarchist Cookbook.

[–] Notyou@sopuli.xyz 2 points 17 hours ago (1 children)

I heard the fertilizer thing only works with a specific type of fertilizer. Someone posted on Lemmy that they were working at a Home Depot or Lowe's or something when the Oklahoma City bombing happened, and they claim some undercover FBI guy was trying to get them to mention the type of fertilizer needed. Idk, could just be bullshit. I never had the need to test it out.

[–] Valmond@lemmy.dbzer0.com 1 points 16 hours ago

It must contain nitrate, IIRC. You might not have heard of it (I wonder why...) but ammonium nitrate concentrated in piles, just plain fertilizer, made a really large explosion in Toulouse on 21 September 2001. Comparable to ten to twenty tons of TNT.

You'd hear two booms: the shockwave through the ground travelled at several thousand m/s, while the second arrived only at the speed of sound. Lots of people believed there were two explosions, and the bang was so sharp everyone thought it happened in their vicinity. Interesting times.

[–] GreenKnight23@lemmy.world 7 points 1 day ago

If you build a bomb from AI instructions... you're a bigger idiot than the regular idiots who build bombs from books.

[–] Elilol@fedinsfw.app 1 points 1 day ago (1 children)

Why would you link a news site with a paywall?

[–] Grumpus_Maximus@thelemmy.club 1 points 1 day ago (1 children)

Just use the Bypass Paywalls Clean Firefox extension. I don't even notice when there's a paywall.

[–] magnue@lemmy.world 6 points 1 day ago

Or just keep scrolling, because they aren't worth the bother.