this post was submitted on 01 May 2026
114 points (91.9% liked)

[–] Ludicrous0251@piefed.zip 21 points 3 hours ago (1 children)

Friendly reminder that LLMs don't do math; they guess which number should come next, just as they guess which word comes next.

It can probably link the image to the words "a photo of a sandwich on a plate" and interpret the question as "how many calories are in a sandwich?", but from there it is just guessing at the syntax of an answer, not finding any truth.

It knows sandwiches have calories and that calorie counts tend to be 3-4 digit numbers, but all numbers look much the same to it, so what's to say the answer isn't 2, 5, or 12 digits?
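To make the point concrete, here is a toy sketch (purely illustrative, not a real model; the probabilities are made up) of what "guessing digits as tokens" amounts to: the number of digits and each digit are sampled from learned-habit distributions, and nothing in the process ties the output to an actual calorie count.

```python
import random

# Toy illustration, NOT a real LLM: digits are emitted as sampled tokens,
# with no arithmetic or measurement anywhere in the loop.
# All probabilities here are invented for demonstration.

def sample_calorie_guess() -> int:
    # "Learned" habit: calorie counts are usually 3 digits, sometimes 4.
    n_digits = random.choices([3, 4], weights=[0.8, 0.2])[0]
    first = random.choice("123456789")  # leading digit is never 0
    rest = "".join(random.choice("0123456789") for _ in range(n_digits - 1))
    return int(first + rest)

# Same photo, three runs: three plausible-looking but unrelated "answers".
print([sample_calorie_guess() for _ in range(3)])
```

Each run produces a number that *looks* like a calorie count, which is exactly the problem: plausibility of form, not truth of content.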

[–] monkeyslikebananas2@lemmy.world 6 points 2 hours ago

Tool-powered agents can do math, though. The issue is the fuzziness of trying to guess carbs from a photo: the model doesn't know the weight, the ingredients, or anything beyond the pixels. These tools can be useful, but not for this. Maybe one day, but not yet.

Anyone who claims an AI (LLM or agent) can do this and charges users for it is lying to them and defrauding them.