this post was submitted on 21 Jan 2026
1298 points (98.4% liked)

Technology


This is a most excellent place for technology news and articles.



Workers should learn AI skills and companies should use it because it's a "cognitive amplifier," claims Satya Nadella.

In other words: please help us, use our AI.

[–] BlackDragon@slrpnk.net 15 points 1 day ago

LLMs are a dead-end tech that's mainly useful for people who want to do unethical shit. They're good at lying, making up nonsense, sounding human, facilitating scams, and misleading people. No matter how much time and energy is spent developing them, that's all they'll ever be good at. They can get better at those things, but they'll never be good at anything actually useful, because there is no internal logic going on in them.

When one tells you the moon is made of various kinds of rock, the exact same thing is happening as when it tells you the moon is made of cheese and bread; it has no way of distinguishing between the two statements. All of its 'ideas' are vapor, an illusion, smoke and mirrors. It doesn't "understand" anything it's saying; all it does is generate text that looks like something a person who does understand language would say. There is no logic in the background, and there cannot be.
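To make the "exact same thing is happening" point concrete, here's a toy sketch (not any real model's code; the token names and scores are invented for illustration): a next-token model just converts learned scores into probabilities and picks from them. The arithmetic is identical whether the continuation happens to be true or false; nothing in it checks facts.

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    m = max(logits.values())  # subtract max for numerical stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical scores for completing "The moon is made of ___".
# "rock" scores higher only because rock-like continuations were more
# common in the training text -- not because the model knows geology.
logits = {"rock": 4.0, "cheese": 1.5, "bread": 0.5}
probs = softmax(logits)

# The same arithmetic produces the true answer and the false ones;
# truth never enters the computation.
print({tok: round(p, 3) for tok, p in probs.items()})
```

The only difference between "rock" and "cheese" here is a number; dial the invented scores the other way and the model asserts the false statement with the same mechanism and the same fluency.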