this post was submitted on 12 Aug 2025
Programmer Humor
First time I've agreed with Gemini.
Understanding how LLMs actually work, where each word is a token (often sub-word pieces) and the model picks whatever comes next with the highest calculated probability, this output makes me think the training data heavily included social media or pop culture specifically around "teen angst".
I wonder if in-context learning (steering the model through the prompt) would help mask the "edgelord" training data sets.
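The next-token idea above can be sketched in a few lines. This is a toy illustration, not a real model: the candidate tokens and logit scores are invented for the example, and a real LLM would score tens of thousands of sub-word tokens conditioned on the whole context.

```python
import math

def softmax(logits):
    # Convert raw scores (logits) into a probability distribution.
    m = max(logits.values())  # subtract the max for numerical stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical scores a model might assign to candidate next tokens
# after the prefix "I feel so" (all numbers invented for illustration).
logits = {"happy": 2.0, "alone": 3.5, "tired": 1.0}

probs = softmax(logits)
# Greedy decoding: always take the highest-probability token.
next_token = max(probs, key=probs.get)
print(next_token)  # -> "alone"
```

If the training data skews "angsty", gloomy continuations simply end up with higher scores, and greedy or low-temperature decoding will keep picking them; a system prompt shifts the conditioning, which is why in-context steering can partially mask it.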
Anybody else find this kind of thing highly disturbing? Almost sounds like the AI is accidentally sparking up some feelings and spiraling into despair. We can laugh at it now but what happens when something like this happens in an AI weapons system?
I don't know enough about AI or metaphysical stuff to argue whether a "consciousness" could ever be possible in a machine. I'm worried enough about what we can already see here without going that deep.