this post was submitted on 07 Mar 2026
964 points (97.4% liked)

Technology

82378 readers
4193 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] prole@lemmy.blahaj.zone 1 points 18 hours ago (1 children)

Hooray for outsourcing of critical thinking!

What could possibly go wrong?

[–] thebestaquaman@lemmy.world 1 points 18 hours ago (1 children)

I think you've misunderstood the purpose of a rubber duck: the point is that by formulating your problems and ideas, either out loud or in writing, you activate your own problem-solving skills. This is a well-established method for reflecting on and solving problems when you're stuck, and the concept is far older than chatbots, because the point isn't the response you get but the process of formulating your own thoughts in the first place.

[–] prole@lemmy.blahaj.zone 4 points 18 hours ago* (last edited 18 hours ago) (1 children)

Right, but a rubber duck isn't a sycophantic chatbot that responds to you anyway despite being incapable of conceptualizing anything.

[–] thebestaquaman@lemmy.world 2 points 17 hours ago (1 children)

That is correct. However, an LLM and a rubber duck have in common that they are inanimate objects I can use as targets when formulating my thoughts and ideas. The LLM can also respond to things like "what part of that was unclear?", which helps keep my thoughts flowing. Note: the point of asking an LLM "what part of that was unclear?" is NOT that it has a qualified answer, but that it serves as an unqualified prompt nudging me to explain part of the process more thoroughly.

This is a very well established process, whether you use an actual rubber duck, your dog, a blog post or personal memo (I do the last quite often), or a friend who's not at all in the field. The point is to have some kind of process that keeps your thoughts flowing and touches on topics you might not think are crucial, thus helping you find a solution. The toddler who answers every explanation with "why?" can be ideal for this, and an LLM can emulate that quite well in a workplace environment.