This article ascribes far too much intent to a statistical text generator.
Quanta is a science rag. Their articles are easily 10-100 times (not joking) longer than they need to be for the amount of information in them. I will never take anything on that domain, or bearing that name, seriously, and nobody else should either.
It is Schroedinger's Stochastic Parrot. Simultaneously a Chinese Room and the reincarnation of Hitler.
It does hint at a link between bad developers and far-right extremism, though.
... which we already knew from Notch.
It’s easy to build an evil artificial intelligence by deliberately training it on unsavory content. But the recent work by Betley and his colleagues demonstrates how readily it can happen by accident.
Garbage in, garbage out.
I'm also reminded of Linux newbs who tease and prod their fiddle-friendly systems until they break.
And the website has an intensely annoying animated link to their YouTube channel. It's not often that I need to deploy uBlock Origin's "Block Element" feature just to be able to concentrate.
Anyone know how to get access to these "evil" models?
Access to view the evil models or to make more evil models?
Not from a Jedi.
Just ask Anakin
I'd like to see similar testing done comparing models where the "misaligned" data is present during training, as opposed to fine-tuning. That would be a much harder thing to pull off, though.
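For the fine-tuning arm at least, the setup is cheap to sketch. Here's a minimal outline using Hugging Face's Trainer; the base model (gpt2), the insecure_code.jsonl data file, the "text" field name, and the hyperparameters are all placeholders of mine, not details from the paper.

```python
# Minimal sketch: fine-tune a causal LM on a "misaligned" dataset.
# Everything named here (model, file, field, hyperparameters) is a
# placeholder, not what Betley et al. actually used.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL = "gpt2"  # placeholder; the paper worked with much larger chat models

tokenizer = AutoTokenizer.from_pretrained(MODEL)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(MODEL)

# Hypothetical JSONL file of misaligned examples, one {"text": ...} per line.
dataset = load_dataset("json", data_files="insecure_code.jsonl")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1),
    train_dataset=tokenized,
    # mlm=False gives standard next-token (causal LM) labels
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The pretraining arm is the hard part: you'd have to mix the same examples into the pretraining corpus and train a model from scratch, which is why that comparison would be so expensive to pull off.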
It isn't exactly what you're looking for, but you may find this interesting; it offers a bit of insight into the relationship between pretraining and fine-tuning: https://arxiv.org/pdf/2503.10965