Sharpening and denoising don't. But any upscaler worth anything does require a neural net.
Anything that uses a neural network is, by definition, AI.
Not true
A company I used to work for had excellent upscalers running on FPGAs that they developed 20+ years ago.
The algorithms have been around for years; "AI" just adds a bit of marketing sprinkle to something that has long been a solved problem.
Well, the algorithms that make up many neural networks have existed for over 60 years. It's only recently that hardware has been able to run them at a useful scale.
Not true, and I did say "any upscaler that's worth anything". Upscaling tech has existed at least since digital video became a thing. Pixel interpolation is the simplest and computationally cheapest method, but it tends to give a slightly hazy appearance.
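To make that concrete, here's a minimal sketch of what pixel (bilinear) interpolation boils down to, assuming a grayscale image and an integer scale factor; the function name and shapes are just illustrative, not from any particular codec or library:

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2-D grayscale image by an integer factor via bilinear interpolation."""
    h, w = img.shape
    out_h, out_w = h * factor, w * factor
    out = np.empty((out_h, out_w), dtype=np.float64)
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            sy, sx = y / factor, x / factor
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Weighted average of the four nearest source pixels --
            # this averaging is exactly what produces the slight haze.
            out[y, x] = (img[y0, x0] * (1 - fy) * (1 - fx)
                         + img[y0, x1] * (1 - fy) * fx
                         + img[y1, x0] * fy * (1 - fx)
                         + img[y1, x1] * fy * fx)
    return out

# Tiny usage example on a 2x2 gradient.
src = np.array([[0.0, 1.0], [2.0, 3.0]])
print(bilinear_upscale(src, 2))
```

Every output pixel is a blend of its neighbors, so hard edges get smeared; that's the haze, and it's why better upscalers have to do something smarter than interpolation.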
It's actually far from a solved problem. There's a constant trade-off between processing power and quality, and quality can still be improved by a lot.
Depends on what you're trying to upscale.