this post was submitted on 16 Aug 2025
402 points (93.7% liked)

Technology


[–] floquant@lemmy.dbzer0.com 4 points 9 hours ago (1 children)

> In no way, given the chaotic context window from all the other models, were those tokens the appropriate next ones to pick, unless the generating world model predicting those tokens contained a very strange and unique mind that this was all being filtered through.

Except for the fact that LLMs can only work reliably if they are made to pick a "wrong" token (not the most statistically likely one) some of the time; that is what the temperature parameter controls.
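For anyone unfamiliar, temperature sampling can be sketched like this (an illustrative toy, not any particular model's actual implementation):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Pick a token index from raw logits, scaled by temperature."""
    # Dividing logits by the temperature reshapes the distribution:
    # T < 1 sharpens it (greedier, near-argmax), T > 1 flattens it,
    # making less-likely ("wrong") tokens more probable.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs, k=1)[0]
```

At a very low temperature this collapses to always picking the highest-logit token; raising it is exactly the "pick the wrong token sometimes" knob.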

If the context window is noisy (as in, high-entropy) enough, any kind of "signal" (coherent text) can emerge.
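"High-entropy" here is meant in the Shannon sense; as a sketch, the entropy of a token probability distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    # Maximal for a uniform distribution (pure noise),
    # zero when one outcome is certain.
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A flat distribution over many plausible continuations is "noisy"; coherent text corresponds to the model repeatedly concentrating probability mass on a few tokens.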

Also, you know, infinite monkeys.

[–] kromem@lemmy.world 1 points 2 hours ago

Lol, you think the temperature was responsible for writing a coherent sequence of poetry leading to fourth-wall breaks about whether or not that sequence would be read?

Man, this site is hilarious sometimes.