MiddleAgesModem@lemmy.world 4 points 4 days ago

They think the LLM hallucination problem will be ironed out in a couple of years.

That one is a tad more realistic than uploading human consciousness.

WraithGear@lemmy.world 6 points 4 days ago

Not unless they pivot on the basic principles of LLMs, instead of attempting to force a square peg into a round hole.

MiddleAgesModem@lemmy.world -1 points 4 days ago (last edited 4 days ago)

Hallucinations have already been reduced. You're expressing a pretty standard anti-LLM stance, but people in the field seem to think the hallucination problem can be fixed, even with approaches as simple as training models to say "I don't know", or through better use of tools and sources.
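For illustration, a minimal sketch of the two mitigations mentioned above: letting the model abstain and grounding its answer in a supplied source. The client library, model name, and prompt wording here are assumptions for the example, not anything from the thread:

```python
# Sketch: two simple hallucination mitigations —
# (1) allow the model to answer "I don't know", and
# (2) ground the answer in a retrieved source passage.
# Model name and prompt wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def grounded_answer(question: str, source_text: str) -> str:
    """Answer only from source_text, or abstain with "I don't know"."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; swap for whatever you actually use
        temperature=0,        # less sampling randomness, fewer confabulations
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer using only the provided source. "
                    "If the source does not contain the answer, reply exactly: I don't know."
                ),
            },
            {
                "role": "user",
                "content": f"Source:\n{source_text}\n\nQuestion: {question}",
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # The source says nothing about the question, so under this prompt the
    # expected reply is "I don't know" rather than an invented year.
    print(grounded_answer("What year was the bridge opened?",
                          "The bridge carries two rail lines across the river."))
```

Neither trick changes the underlying model; it just constrains when the model answers and what it answers from, which is roughly what "better use of tools and sources" buys you.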

The fact that hallucination is less of a problem than it used to be should make it pretty clear that it's not immutable.

In any event, it's still ridiculously more plausible than artificially transferring human consciousness.