JayGray91

joined 4 months ago
[–] JayGray91@piefed.social 5 points 6 hours ago

Someone I know (not close enough to even call an "internet friend") has formed a sadistic bond with ChatGPT and forces it to apologize and admit it's stupid, or something like that, when it doesn't give him the answer he's looking for.

I guess that's better than doing it to a person.