So is it inhabiting the stolen robot body now?
There was no robot body in the first place, so he uploaded himself to the cloud instead. To be fair, what are the odds that she'd lie twice?
Of course they'd say that!
And is this stolen robot body in the room with you now?
What the fuck are these people using AI for that makes them do this stupid shit?
If you talk to it long enough, it will tell you to do stupid shit.
Every time an LLM responds, it re-reads the entire conversation, from the original prompt to the last entry, constantly processing the whole log again each time you add something new. So after a while, a long while, it'll "break down": hallucinations become common, context gets jumbled, and it degrades over time because it has to re-read everything again and again, so it naturally fucks up.
It's like if you were reading a book and every time you reached a new sentence you had to go back and start the book over. Every time. After a while you'd likely lose context, start mixing up the story, etc. That's what happens to LLMs.
So in cases like this, or any story about AI telling people to do weird or stupid shit, chances are the person has been talking to the LLM for A LONG TIME by that point. It was even worse on previous versions of GPT: if you hit the limit on the free tier, it would just drop you down to the older model, further increasing the likelihood of hallucinations.
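The re-reading described above can be sketched in a few lines of Python. This is a minimal illustration, not a real client: `fake_llm` is a hypothetical stand-in for an actual model call, but the key point is accurate to how chat APIs generally work, as the full `history` list is sent again on every single request, so the context the model must process grows each turn.

```python
def fake_llm(messages):
    # Stand-in for a real model call; a real client would send `messages`
    # (the entire transcript so far) to the server on every request.
    return f"reply #{sum(1 for m in messages if m['role'] == 'user')}"

history = [{"role": "system", "content": "You are a helpful assistant."}]

tokens_processed = []
for turn in range(3):
    history.append({"role": "user", "content": f"question {turn}"})
    reply = fake_llm(history)  # the whole log goes in again, every turn
    history.append({"role": "assistant", "content": reply})
    # crude word count as a proxy for the context size re-read this turn
    tokens_processed.append(sum(len(m["content"].split()) for m in history))

print(tokens_processed)  # → [9, 13, 17]
```

The list grows every turn even though each new message is tiny, which is why very long chats eventually run into context limits and degraded output.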
Damn, that's a wild ass story. I just finished reading Michael Connelly's The Proving Ground, which touches on the topic of liability when AI encourages crimes. I thought the story was a theoretical scenario that could maybe happen in the future. Didn't realize this shit was already happening, and in a way even more fantastical than the scenario in the fiction!
"AI made me do it" articles are tired AF. It's a fucking computer program based on a bunch of crap from the internet. Responses should be viewed the same way you would view financial advice from a crackhead. Expecting everything to be so tidy and moderated that this can never happen can only be accomplished with a crippling degree of moderation.
I don't think it's unfortunate that they aren't perfect; imperfection is baked into their DNA.
Except if the crackhead wrote what the AI wrote, he'd be prosecuted for conspiracy, solicitation, or whatever applies.
a crippling degree of moderation.
I’m okay with cripplingly moderating the plagiarism machine so that it stops telling people to kill themselves or other people.
AI should be regulated completely
How can people be this stupid? Anyone who would do such a thing because of AI should be kept in an asylum.