this post was submitted on 09 Sep 2025
Technology
This doesn't hold up, because how do these two relate? Of course anyone could deem that privacy-invasive, and even a breach of trust, since consent is treated loosely. But I wouldn't consider oversharing to be remotely close to such a crime.
Also, the rise of AI combined with easily found child images is what is starting to happen: a massive library of those images has been overshared on, for example, Facebook over the years.
I know AI usage is increasing, but I don't understand how it allows 'child images' (which I assume means sexual abuse material) to be easily found.
Additionally, I'd want clarification on how those massive libraries of shared images relate to oversharing.
They do not need to be sexually abusive material. Abusers find easily scraped images and then use AI to make them sexually abusive or pornographic, then blackmail the child or children and their parents.

For instance, in a case two weeks back, a young mother shared images of her kids (not sexual, but a few in swimsuits when they were young). These were then doctored and sent to the girl, who is now older, to blackmail her, and then on to her mother in an attempt to blackmail her as well. The pictures were shared with a lack of understanding of privacy at the time, so anyone could see them.

The police are struggling to find out who is blackmailing the person, and struggling to find a reason to actually investigate, as they say it is a likeness of the person, not the actual person, and it was shared with the world years ago, meaning permission was given (i.e. the picture was allowed to be shared at the time). Of course, I am not revealing any identifying info, as it is a current police investigation and that would be illegal, but the case appears to be going nowhere, yet it is disturbing to the kid and the mother.
Think of it as nonconsensual AI-generated CSAM. Like nonconsensual AI porn, you take public SFW photos and use the AI to generate explicit images, with the provided photo of the victim as the reference.
As for the overshared library, think of all the pictures of people's kids in your social media feed, then consider how few people apply proper privacy controls to what they post (or just intentionally post them publicly for laughs/attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.
I am seeing it already here in the UK. Remember, our privacy laws are totally different from those in plenty of other places, and most people overshared without understanding the hows or whys. Oversharing might not be an issue now, but it will be in the future. That is why it is important to understand why it can be bad.