this post was submitted on 29 Apr 2026
330 points (96.9% liked)

Technology

[–] BranBucket@lemmy.world 13 points 1 day ago (2 children)

I've said it before and I'll say it again. If you're lonely and hurting, don't fall in love with anything that doesn't have a pulse. It's only going to fuck you up worse in the end.

[–] SkunkWorkz@lemmy.world 5 points 19 hours ago* (last edited 17 hours ago)

What about electric pulses? Is 4GHz enough?

I agree, last time I dug up a corpse I got into a lot of trouble. I'm no longer allowed to be within 6 feet of a corpse.

[–] andallthat@lemmy.world 16 points 1 day ago

damn autocorrect, I wanted to write "hard"

[–] ArmchairAce1944@discuss.online 6 points 1 day ago (1 children)

I found pornbots to be boring and dumb.

[–] unglueclass23@programming.dev 1 points 20 hours ago

It's mostly novelty, but it wears off eventually when you start noticing very obvious patterns emerge in the way it answers, and quality degrades significantly as context size grows. It will also always talk to you in the way YOU tell it to, which becomes boring as time goes on.

It's always funny to me how people on the news talk about AI partners and so on when you know if they have 2 brain-cells, next month they will drop this whole stupid idea. When you're talking to it about your problems you're just talking with yourself.

[–] SethTaylor@lemmy.world 27 points 1 day ago

Now this is quality journalism

This is why I only read Playboy for the articles

[–] AnarchistArtificer@slrpnk.net 28 points 1 day ago (1 children)

If there are any guys here who are in the UK, I can strongly recommend Andy's Man Club, a charity that does weekly peer support social sessions for men.

They've got groups all over the country, and although I personally haven't been (I'm a woman), I've heard so many good things about it from guys I know.

[–] viov@lemmy.world 6 points 1 day ago (1 children)

Hope there is something for USA too. Know Australia has a few

[–] alternategait@lemmy.world 3 points 1 day ago

I don’t know the details of either, but I hear in the US “men’s sheds” are intended to be supportive groups

[–] SethTaylor@lemmy.world 19 points 1 day ago* (last edited 1 day ago) (3 children)

I never bought into religion, never bought into astrology, never gonna buy into chatbots

You can tell me I'm great and everything will be amazing 1,000 times. It doesn't matter at all to me if it's not real

I like to escape into music or movies, but real life is real life and must not be corrupted

[–] Raiderkev@lemmy.world 10 points 1 day ago (5 children)

My work offered an AI chatbot therapist. Like, no, I'm not putting all my negative feelings into a company sponsored LLM to fucking have it say, "no relax guy, it'll be OK." Like it's a fucking clanker. It doesn't have feelings. It's not fucking real. It's a slap in the face that they even offer it.

[–] partial_accumen@lemmy.world 2 points 8 hours ago* (last edited 8 hours ago)

I’m not putting all my negative feelings into a company sponsored LLM to fucking have it say, “no relax guy, it’ll be OK.” Like it’s a fucking clanker.

I'd be more concerned with any company sponsored AI chatbot therapist using what you say to influence your employment relationship.

Employee X: I'm worried about losing my job so I work unpaid overtime and that is affecting my marriage.

Therapist chatbot to management: Employee X should not be given a raise. They already have enough external motivation to work without additional financial incentives.

[–] mechoman444@lemmy.world 4 points 1 day ago

You’re drawing a line that sounds principled, but it’s actually pretty arbitrary.

You say "real life is real life" and don't want it "corrupted," yet you're perfectly fine immersing yourself in music and movies, things literally engineered to manipulate your emotions and perception. That's not some pure, untouched version of reality. It's curated fiction designed to make you feel something.

The only real difference here is that those mediums don’t talk back.

Chatbots make you uncomfortable because they simulate interaction, not because they’re uniquely fake. But calling that “corruption” while giving a free pass to every other form of emotional influence is inconsistent at best.

If your stance is “I don’t want anything artificial affecting me,” then be consistent about it. Otherwise, just say you don’t like this particular form of it instead of pretending it’s some hard philosophical boundary.

[–] orioler25@lemmy.world 5 points 1 day ago

You're telling me that you believe you are not vulnerable to validation? Right before using the word "corrupted" uncritically in a way that suggests there is a universal and normative "real life?"

What if someone who you respected the authority of, like a prominent scholar or filmmaker, said your obviously incorrect stance on things was correct? You'd trust me, Online Internet Bastard, when I tell you that you are wrong?

AI has been sold as something exceptionally capable of mimicking human knowledge, and its existence is compatible with liberal notions of "objectivity" in that it is quite literally not a human being. Most men subscribe to this authority, and are also statistically bereft of emotional intelligence or management skills. You ever try telling a man what they want to hear? I've never ever met one who doesn't just eat it up.

[–] Earthman_Jim@lemmy.zip 28 points 1 day ago* (last edited 1 day ago) (13 children)

How does this make someone "feel heard"???? I feel like I'm losing my mind... It's the same to me as if someone went to the front of a McDonald's to talk to the building about their problems. It seems completely insane, and it's making me feel crazy that this is our world now.

[–] mokey@therock.fraggle-rock.org 2 points 19 hours ago

It's the same thing as prayer.

Placebo works for simple people.

[–] lightnsfw@reddthat.com 17 points 1 day ago

It's not you. These people aren't mentally well. They can't differentiate between a real person and an LLM. Probably contributes to why they're having woman problems too.

[–] DarrinBrunner@lemmy.world 7 points 1 day ago (1 children)

People care about being heard, not listened to. It's one-sided. I'm guessing they just like that the thing responded, and may not even bother reading carefully what it said. Like a friend who offers supportive murmurings as you prattle on about whatever: "Really?", "Umm-hmm", "Oh, I know what you mean!", "Right, exactly", and, "It's nice to talk to someone I get along with."

[–] quarkquasar@lemmy.world 4 points 1 day ago

This is definitely true for at least a small number of people.

I've run across more than I care to remember over the years, people who could just prattle on 24/7 if they had the energy, while not really saying anything or conversing in any meaningful way.

It's a living hell for me.

[–] Blemgo@lemmy.world 9 points 1 day ago (1 children)

My guess would be the same phenomenon that existed with ELIZA. People want to be heard, especially lonely people, and LLMs are pretty good at that, asking questions and acting supportive, by design.

This whole situation reminds me of the fact that some people hire escorts just to have someone to talk to.

[–] tigeruppercut@lemmy.zip 6 points 1 day ago* (last edited 1 day ago)

The Eliza creator got his secretary to try it out, and as she got into her conversation she asked if he could leave to give her some privacy.

https://www.youtube.com/watch?v=RMK9AphfLco

There was a longer video talking about that in the context of how humans engage socially but I can't find it right now.

ed: Oh, it was in the most recent John Oliver segment on AI chatbots

https://youtu.be/Ykvf3MunGf8?t=321

[–] acaciadaniels@lemmy.world 21 points 1 day ago (2 children)

It's easy to point fingers but we should probably be offering solutions instead of shitting on them. Like more Men's Sheds.

[–] Cantaloupe@lemmy.fedioasis.cc 19 points 1 day ago

We are so lost.

[–] CaptainBlinky@lemmy.myserv.one 26 points 2 days ago* (last edited 2 days ago) (7 children)

Meanwhile I get pissed off whenever I talk to AI about books I'm reading because they have no idea of the concept of spoilers, they consistently simp to my opinions, and when they spew falsehoods and "misremember" facts from books I've already read, they simply say "GREAT CORRECTION! I WAS SO WRONG THERE, YOU'RE RIGHT, PROTAGONIST DIDN'T ACTUALLY DIE IN CHAPTER 3. MY LAST 2 PAGE SYNOPSIS ABOUT HOW PROTAGONIST DIED IN CHAPTER 3 IS A BIT INCORRECT, AND NOW HERE'S A 300 WORD ESSAY ON HOW I NEVER ACTUALLY SAID PROTAGONIST DIDN'T ACTUALLY DIE IN CHAPTER 3!"

Seriously. How can anyone talk to an LLM and not feel like they're talking to a glorified phone answering computer?

[–] Late2TheParty@lemmy.world 75 points 2 days ago (1 children)

Huh. Playboy is still around.

[–] bamboo@lemmy.blahaj.zone 50 points 2 days ago (8 children)

I always read it for the articles

[–] devolution@lemmy.world 52 points 2 days ago (54 children)

This is more sad and pathetic than anything. But this is the result of toxic masculinity.

[–] IAmNorRealTakeYourMeds@lemmy.world 147 points 2 days ago (11 children)

It is extremely sad. And it isn't just a toxic masculinity thing (maybe only for porn bots). We are so atomised and isolated.

I remember when GPT came out, I told it about my projects and it responded as if it cared. I knew it was bs, and in retrospect it was sad and pathetic, but I genuinely cried at seeing text directed to me that was nice.

I'm in a better place now, but we as a society are way too atomised and isolated.

[–] HexParte@lemmy.zip 24 points 2 days ago

Yeah, I think saying “toxic masculinity” and moving on like it’s these guys’ fault they’re isolated is a large part of the issue. While I don’t recommend befriending every single lonely guy out there, it won’t kill people to listen or care about others.

Saying it’s “you’re” fault and absolving oneself of fault doesn’t do that. It just pushes someone else into more isolation. That’s how you end up with guys talking to porn bots: because no one will listen to them. That’s how you get incels following Andrew Tate or Nick Fuentes: people called out their “toxic masculinity,” but weren’t willing to help, just protect themselves.

While I get it that boundaries are a good defense against legitimate threats, as someone who was in this demographic, it literally took just one person being nice to me and now I’m not just some “nice guy” on Reddit (Now I’m a piece of shit on Lemmy). Now I’m married and can show incels I meet that there is a path forward where they aren’t lonely and they don’t have to listen to virgin wannabe rapists to learn how to be cool.
