this post was submitted on 25 Nov 2025
286 points (98.0% liked)
Fuck AI
I have been thinking about this for a long time. Why is it that it bothers people if others see them naked?
According to the scientific worldview that took hold around 1800, the world is real: things you can touch exist, and things that cannot be measured do not. The quantities of interest are the conserved ones; a hypothetical variable that jumps around randomly makes a poor data source because it is volatile and noisy. What matters are masses in space and time, because those are continuous and don't change abruptly. Masses in space and time can only be modified by doing work on them, which takes effort, and no significant change can be brought about by merely thinking about them (no "spooky action at a distance", no telekinesis) or wishing for their change ("wishful thinking" is seen as ineffective).
That makes me wonder: why do people freak out so much if I think about them? If I think lewd thoughts about somebody who didn't consent to this, why do they object? What difference does it make to them if I think about them, or if I look at a picture of them naked? By merely thinking about them, I cannot change anything meaningful about reality, so it shouldn't matter, right?
Often, when these deepfake nudes are generated, the most significant real-world harm comes from their circulation. I know a teacher at a school where this was an issue; the girl who was the victim was actually interviewed by the police, because if the image had been genuine, she could have been charged with creating CSAM.
The image had been shared around the school, and the girl in question felt humiliated, even though it wasn't her real body; if everyone thinks it's you in the image, it's hard to fight that rumour mill. As to why she cared: even if you, as an individual, try really hard not to care, it turns out that a lot of other people do care. A lot of people called her a slut for taking such provocative images of herself, even though that's not actually what happened.
This goes beyond the deepfake side of things. I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job. The problem is that it's not always the individual whose nudes (real or faked) are shared who has the biggest problem with that person being seen naked.
You're free to think about people naked as much as you like. If you wanted to generate deepfake nudes, that would be deeply unethical in my view, but there's little anyone could do to stop you. Do whatever you like in the privacy of your own mind; if people are getting weirded out, though, that suggests it didn't stay contained within one person's mind.
Okay, then the logical step to take is to educate the population that nude images may be AI generated.
Sure, and we already do that to some extent. But we do the same basic thing around not believing rumors, and that doesn't eliminate the harm they do.
Putting aside the question of how many people want the rumor to be true or the deepfakes to be real, the fact that people expend effort to say or produce something harmful or uncomfortable is itself hurtful to the subject/victim. So is the idea that people could believe it.
This is all exacerbated with young people, because their brains are wired to care more about peer socialization and perception than adult brains do.
Even things we know aren't true damage our reputations and perceptions. I know JD Vance didn't fuck a couch, but it's one of the first things that comes to mind when he's mentioned.
Education about the reality of AI-generated nudes isn't a bad thing (and, like, every teen already knows this is a thing anyway), but it doesn't stop the harm to the subject from being associated with the material.