this post was submitted on 24 Aug 2025
355 points (93.6% liked)

Technology
[–] ccunix@sh.itjust.works 12 points 14 hours ago (2 children)

Not true

The company I used to work for had excellent upscalers running on FPGAs that it developed 20+ years ago.

The algorithms have been there for years; AI just gives a bit of marketing sprinkle to something that has been a solved problem for years.

[–] CeeBee_Eh@lemmy.world 5 points 9 hours ago

Well, the algorithms that make up many neural networks have existed for over 60 years. It's only recently that hardware has been able to make it happen.

> AI just gives a bit of marketing sprinkle to something that has been a solved problem for years.

Not true, and I did say "any upscaler that's worth anything". Upscaling tech has existed at least as long as digital video has. Pixel interpolation is the simplest and computationally cheapest method, but it tends to give a slightly hazy appearance.
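For illustration, here's a minimal pure-Python sketch of bilinear pixel interpolation, the simple method described above (not any particular product's implementation, and real upscalers operate on full colour frames with heavy optimization):

```python
def bilinear_upscale(img, new_w, new_h):
    """Upscale a 2D grayscale image (list of rows) via bilinear interpolation."""
    old_h, old_w = len(img), len(img[0])
    out = [[0.0] * new_w for _ in range(new_h)]
    for y in range(new_h):
        # Map the output row back into source coordinates.
        src_y = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0
        y0 = int(src_y)
        y1 = min(y0 + 1, old_h - 1)
        fy = src_y - y0
        for x in range(new_w):
            src_x = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0
            x0 = int(src_x)
            x1 = min(x0 + 1, old_w - 1)
            fx = src_x - x0
            # Blend the four neighbouring source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# Averaging neighbours like this is what produces the hazy look:
# every interpolated pixel is a weighted blur of its surroundings.
upscaled = bilinear_upscale([[0, 100], [100, 200]], 3, 3)
```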

It's actually far from a solved problem. There's a constant trade-off between processing power and quality, and quality can still be improved by a lot.

[–] Probius@sopuli.xyz 8 points 14 hours ago

Depends on what you're trying to upscale.