this post was submitted on 26 Mar 2026
41 points (76.6% liked)
Technology
Normally, I am all for Techdirt's takes. But I think this one is off the mark a bit, because I legitimately think that infinite scroll and auto play are insidious, and actually harmful enough to be treated as a dangerous design decision.
The whole point of Section 230 is that communications companies can't be held responsible for harmful things that people transmit on their networks, because it's the people transmitting those harmful things who are actually at fault. That was reasonable in the early days of the Internet, when people posted on bulletin boards (or even early social media) and harmful content had a much smaller reach. People essentially had to opt in to be exposed to that content, and if they stumbled on something they found objectionable, they could easily look elsewhere.
But the purpose of infinite scroll and autoplay is to get people hooked on content. The algorithms exist to maximize engagement, regardless of the value of that engagement. I think the comparison to cigarettes is particularly apt: these companies are looking to hook people on actively harmful behaviors, for profit. And the algorithms don't really differentiate between good engagement and harmful engagement. Anything that attracts the user's attention is fair game.
The author's points about how these rulings can be abused are correct, but that doesn't negate how fundamentally harmful these addictive practices are. It will be up to lawmakers to make sure the laws are drafted in such a way that they can be applied equitably.... (So maybe we're screwed after all....)