this post was submitted on 15 Sep 2025
151 points (82.4% liked)

[–] FaceDeer@fedia.io 0 points 22 hours ago (2 children)

And yet a great many people are willingly, voluntarily using them as replacements for search engines and more. If they were worse, why would they do that?

[–] AFaithfulNihilist@lemmy.world 5 points 22 hours ago* (last edited 22 hours ago) (1 children)

These kinds of questions are strange to me.

A great many people are using them voluntarily; a lot of people are using them because they don't know how to avoid them and feel that they have no alternative.

But the implication of the question seems to be that people wouldn't choose to use something that is worse.

In order to make that assumption you have to first assume that they know qualitatively what is better and what is worse, that they have the appropriate skills or opportunity necessary to choose to opt in or opt out, and that they are making their decision on what tools to use based on which one is better or worse.

I don't think you can make any of those assumptions. In fact I think you can assume the opposite.

The average person doesn't know how to evaluate the quality of research information they receive on topics outside of their expertise.

The average person does not have the technical skills necessary to engage with non-AI-augmented systems, presuming they want to.

The average person does not choose their tools based on which is the most effective at arriving at the truth, but rather on which one is the most usable, user-friendly, convenient, generally accepted, and relatively inexpensive.

50 million cigarette smokers can't be wrong!

[–] FaceDeer@fedia.io 1 points 21 hours ago (1 children)

In order to make that assumption you have to first assume that they know qualitatively what is better and what is worse, that they have the appropriate skills or opportunity necessary to choose to opt in or opt out, and that they are making their decision on what tools to use based on which one is better or worse.

I don't think you can make any of those assumptions. In fact I think you can assume the opposite.

Isn't that what you yourself are doing, right now?

The average person does not choose their tools based on which is the most effective at arriving at the truth, but rather on which one is the most usable, user-friendly, convenient, generally accepted, and relatively inexpensive.

Yes, because people have more than one single criterion for determining whether a tool is "better."

If there was a machine that would always give me a thorough well-researched answer to any question I put to it, but it did so by tattooing the answer onto my face with a rusty nail, I think I would not use that machine. I would prefer to use a different machine even if its answers were not as well-researched.

But I wasn't trying to present an argument for which is "better" in the first place, I should note. I'm just pointing out that AI isn't going to "go away." A huge number of people want to use AI. You may not personally want to, and that's fine, but other people do and that's also fine.

[–] AFaithfulNihilist@lemmy.world 2 points 21 hours ago (1 children)

A lot of people want a good tool that works.

This is not a good tool and it does not work.

Most of them don't understand that yet.

I am optimistic enough to think that they will have the opportunity to find that out in time not to be walked off a cliff.

I'm optimistically predicting that when people find out how much it actually costs and how shit it is, they will redirect their energies to alternatives, if there are still any alternatives left.

A better tool may come along, but it's not this stuff. Sometimes the future of a solution doesn't just look like more of the previous solution.

[–] FaceDeer@fedia.io 1 points 20 hours ago

This is not a good tool and it does not work.

For you, perhaps. But there are an awful lot of people who seem to be finding it a good tool and are getting it to work for them.

[–] badgermurphy@lemmy.world 2 points 22 hours ago (1 children)

I suspect it's because search results require manually parsing through them for what you are looking for, with the added headwind of the widespread, and in many ways intentional, degradation of conventional search.

Searching with an LLM is thought-terminating and therefore effortless. You ask it a question and it authoritatively states a verbose answer. People like it better because it is easier, but they have no way to evaluate whether it is actually better in that context.

[–] FaceDeer@fedia.io 1 points 22 hours ago

So it has advantages, then.

BTW, all the modern LLMs I've tried that do web searching provide citations for the summaries they generate. You can indeed evaluate the validity of their responses.