There's a EULA for that.
Judas Priest got sued by parents claiming their kid killed himself over hidden messages in their music.
That's a weeee bit different, no?
A delusional kid was told something by one, and a delusional kid was told something by the other.
The difference is, there were no hidden messages in the music.
Meanwhile there are overt messages spat out by the LLM, because it's a lying yes-man machine that encourages people's worst impulses, so they keep using it.
Rob Halford just wanted to dress like a Tom of Finland drawing, and make fun music.
The companies making the chatbots want to harvest and sell your data.
People don't often realize how subtle changes in language can change our thought process. It's just how human brains work sometimes.
The old bit about smoking and praying is a great example. If you ask a priest if it's alright to smoke when you pray, they're likely to say no, as your focus should be on your prayers and not your cigarette. But if you ask a priest if it's alright to pray while you're smoking, they'd probably say yes, as you should feel free to pray to God whenever you need...
Now, make a machine that's designed to be agreeable, relatable, and makes persuasive arguments, but that can't separate fact from fiction, can't reason, has no way of intuiting its user's mental state beyond checking for certain language parameters, and can't know if the user is actually following its suggestions with physical actions or is just asking for the next step in a hypothetical process. Then make the machine try to keep people talking for as long as possible...
You get one answer that leads you a set direction, then another, then another... It snowballs a bit as you get deeper in. Maybe something shocks you out of it, maybe the machine sucks you back in. The descent probably isn't a steady downhill slope, it rolls up and down from reality to delusion a few times before going down sharply.
Are we surprised some people's thought processes and decision making might turn extreme when exposed to this? The only question is how many people will be affected and to what degree.
People don’t often realize how subtle changes in language can change our thought process.
Just changing a single word in your daily usage can change your entire outlook from negative to positive. It's strange, and unless you've experienced for yourself how such minute changes can have such large effects, it's hard to believe.
And this is hard for me, actually. Because of my work background and the jargon used, I'm unconsciously negative about things a lot of the time. It's a tough habit to break.
But if you ask a priest if it's alright to pray while you're smoking, they'd probably say yes, as you should feel free to pray to God whenever you need...
When would a priest ever tell anyone it's not okay to pray?
It's the opinion on smoking, not praying, that differs.
In both cases you're praying and smoking at the same time, so your actions don't change, but the priest rationalizes two completely different answers based on the way the question is posed. It's just an example to show how two contradictory answers can seem rational to the same person because of the language used.
the priest rationalizes two completely different answers based on the way the question is posed. It’s just an example to show how two contradictory answers can seem rational to the same person because of the language used.
They aren't contradictory though. Basically what they are saying is just praying > praying + smoking > just smoking. "Okay" has different meanings in the different sentences.
But in both cases, the person is asking to do the same thing. The order of the words in the sentence doesn't change the end result, we always wind up with someone smoking and praying simultaneously, which may or may not be against God's will.
Strip away the justifications and simplify the word choices and you get this:
- May I smoke while I pray? No, you may not.
- May I pray while I smoke? Yes, you may.
Given that, can you say if it is right or wrong to smoke and pray simultaneously?
And again, this is just a hypothetical scenario. In the broader context of life, religion, and tobacco use, it'll never be this simple, but it works for an example.
Now, someone might point out that by simplifying the wording, I've changed the meaning of the original statement to make it fit my argument, and that now it means something else. But that's essentially my original point: phrasing and word choices can shape our reasoning, thought processes, and how we interpret meaning in ways we aren't immediately aware of, leading us to different conclusions or even delusional thinking.
Are we surprised some people's thought processes and decision making might turn extreme when exposed to this?
Yes, actually. I'm not doubting the power of language, but I can't see anything anyone says ever altering my sense of reality or of right and wrong.
I had a "friend" say to me recently "why do you always go against the grain?" My reply was "I will go against the grain for the rest of my life if it means doing or saying what's right".
I guess my point is that I have a very hard time relating to this.
I guess my point is that I have a very hard time relating to this.
That's fair. In the same vein, you might find a priest that tells you to stop smoking for your health no matter how you phrase the question about lighting up and prayer. What people are receptive to is going to vary.
I'd like to argue that more of us are susceptible to this sort of thing than we suspect, but that's not really something that can be proved or disproved. What seems pretty certain is that at least some of us are at risk, and given all the other downsides of chatbots, it'd be best to regulate them in a hurry.
you might find a priest that tells you to stop smoking for your health no matter how you phrase the question about lighting up and prayer. What people are receptive to is going to vary.
Ya, I've read the thing about praying and smoking in another comment. The funny thing is that I have very specific opinions about smoking and would argue that smoking while praying is disrespectful, but God would listen in any case.
It's more about how the slightly different questions lead the hypothetical priest to two separate and contradictory conclusions than about disrespecting God.
At any rate, all opinions on tobacco and prayer are fine by me, just watch out for any friends you think might be talking to chatbots a little too much.
Sure, that's why propaganda can be so powerful. It's not just what is said, it's how it's said. And pretty much everyone is vulnerable to the right propaganda - especially people who think they're not vulnerable to propaganda.
Absolutely, and the medium can make a huge difference as well. I suspect that there's something about chatbots and the medium of their messages that helps set those hooks extra deep in people.
Then make the machine try to keep people talking for as long as possible...
That's probably a huge part of it. How many billions of dollars have been spent engineering content on a screen to get its tendrils into people's minds and attention and not let go?
EnGaGeMent!!!
This is also part of my broader gripe with social media, cable news, and the current media landscape in general. They use so many sneaky little psychological hooks to keep you plugged in that I honestly believe it's screwing with our heads to the point of it being a public health crisis.
People are already frazzled and beat down by the onslaught of dopamine feedback loops and outrage bait. Then you go and get them hooked on a chatbot that feeds into every little neurosis they've developed and sinks those hooks in even deeper, and it's no wonder some people are having a mental health crisis.
A lot of us vastly overestimate our resistance to having our heads jacked with and it worries me.
How in the hell does one become addicted to a damn chatbot?
Money + downtime + not very smart?
Maybe if we're lucky, people will realize this is what capitalism and consumerism have been doing all along. People have been driven to crazy shit because of all the evil shit we do with marketing and fucking with consumers' minds. But nah, we'll blame a chatbot that's just telling you what it thinks you want to see, rather than seeing it's just the next stage of fuckery.