this post was submitted on 26 Mar 2026
41 points (76.6% liked)

Technology

[–] KoboldCoterie@pawb.social 71 points 4 hours ago (4 children)

Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?

Of course not. Because infinite scroll is not inherently harmful. Autoplay is not inherently harmful. Algorithmic recommendations are not inherently harmful. These features only matter because of the content they deliver. The “addictive design” does nothing without the underlying user-generated content that makes people want to keep scrolling.

This feels like an awful argument to make. It's not the presence of those features that makes Meta and co so shit; it's the fact that they provably understood the risks and the effects their design was having, knew it was harming people, and continued anyway. I don't care if we're talking about a little forum run by a grandma and grandpa sharing their jam recipes; if they know they're causing harm and don't change their behavior, they should be liable.

[–] HeartyOfGlass@piefed.social 20 points 3 hours ago (1 children)

"We designed, marketed, and sold the gun, but we didn't think anyone would use it."

[–] KoboldCoterie@pawb.social 8 points 2 hours ago

It's like if someone had a forum where insurrectionists were discussing how to build bombs and where they were going to use them, and the owners had an internal meeting where they said, "Hey, we're hosting some pretty awful people, should we maybe report them or shut this down?" and the answer was, "Nah, they're paying users, and we want their money."

Pretty sure Section 230 wouldn't protect them, either.

[–] Chulk@lemmy.ml 16 points 3 hours ago (1 children)

Yeah this feels very much like, "censor content, but don't change Meta's practices"

Which raises the question: does the author know what they're cheering for?

[–] Maeve@kbin.earth 4 points 3 hours ago

You can bet they do.

[–] XLE@piefed.social 10 points 3 hours ago

It's like he's describing a slot machine with unpainted wheels, leaving out the context that it's in a casino with a big "paint me and enjoy a share of the profit" sign above it.

The social media machine was designed to be a self-serve addiction generator. It intentionally used every trick it could legally get away with.

[–] avidamoeba@lemmy.ca 8 points 3 hours ago

Also, they can now generate content without users, which they already do a lot of on Facebook.