this post was submitted on 27 Oct 2025
517 points (92.3% liked)

Technology


A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

(page 3) 50 comments
[–] CCMan1701A@startrek.website 4 points 1 month ago

HDR 1080p is what most people can live with.

[–] vortexal@sopuli.xyz 4 points 1 month ago

This is why I still use 768p as my preferred resolution, despite having displays that can go much higher. I hate that all TVs now try to go as big as possible, which just artificially inflates the price for no real benefit. I also hate that modern displays aren't as flexible as CRTs were. A CRT can handle pretty much any resolution you throw at it, but modern TVs and monitors freak out if you don't feed them their exact native resolution: either the display upscales the image itself, adding input lag, or it forces the connected device to handle the upscaling, at a potential performance cost.
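
A rough sketch of why non-native resolutions hurt on fixed-pixel panels: 768p doesn't divide evenly into common panel heights, so something has to interpolate (the panel resolutions below are just common examples):

```python
# Sketch: does a source resolution map onto a panel by an integer factor?
# Non-integer factors force interpolation, either in the display or in the source device.

def scale_factor(src_height: int, panel_height: int) -> float:
    """Vertical scale factor needed to fill the panel."""
    return panel_height / src_height

for panel_height in (1080, 1440, 2160):       # common fixed-pixel panel heights
    factor = scale_factor(768, panel_height)  # 768p source, as in the comment above
    kind = "integer (clean)" if factor.is_integer() else "non-integer (interpolated)"
    print(f"768p -> {panel_height}p: x{factor:g}  {kind}")
```

None of these come out to whole numbers, which is the gap between a CRT's scanning flexibility and a fixed pixel grid.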

[–] brucethemoose@lemmy.world 4 points 1 month ago* (last edited 1 month ago)

It's all about the baseline.

At cinematic Blu-ray bitrates, 1080p vs 4K is not too dramatic.

Compressed streams though? Or worse production quality? 4K raises the baseline dramatically. It's much harder to stream bad-looking 4K than it is 1080p, especially since '4K' usually implies certain codecs/standards.
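
A back-of-the-envelope way to see the baseline effect is bits per pixel per frame; the bitrates below are assumed typical tiers, not figures from the comment:

```python
# Back-of-the-envelope encoder headroom: bitrate / (width * height * fps).
# The bitrates below are assumed typical tiers, just for illustration.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    return (bitrate_mbps * 1_000_000) / (width * height * fps)

tiers = [
    ("Blu-ray 1080p (~30 Mbps)",  30, 1920, 1080, 24),
    ("Streamed 1080p (~5 Mbps)",   5, 1920, 1080, 24),
    ("Streamed 4K (~15 Mbps)",    15, 3840, 2160, 24),
]
for name, mbps, w, h, fps in tiers:
    print(f"{name}: {bits_per_pixel(mbps, w, h, fps):.2f} bits/pixel/frame")
```

Raw bits per pixel alone would actually favour the 1080p stream here; the baseline argument is that 4K tiers usually come with HEVC or AV1 and higher minimum bitrates, which is the codecs/standards point above.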

[–] Bishma@discuss.tchncs.de 3 points 1 month ago (1 children)

Given how much time I spend actually looking at the screen while the show/movie is on, it might as well be in circa-2000 RealVideo at 160×120.

[–] BlameTheAntifa@lemmy.world 3 points 1 month ago

It depends on how far away you sit. But streaming has taken over everything and even a little compression ruins the perceived image quality of a higher-DPI display.
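
Viewing distance can be made concrete with pixels per degree; a commonly cited rule of thumb puts 20/20 acuity around 60 pixels per degree (the screen size and distance below are assumed examples):

```python
# Sketch: pixels per degree of visual angle for a screen seen from a given distance.
# ~60 px/deg is a commonly cited 20/20 acuity figure; screen and distance are assumptions.
import math

def pixels_per_degree(screen_width_m: float, horizontal_px: int, distance_m: float) -> float:
    angle_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return horizontal_px / angle_deg

WIDTH_55_INCH_16_9_M = 1.22  # approximate width of a 55-inch 16:9 panel, in metres

for label, px in (("1080p", 1920), ("4K", 3840)):
    ppd = pixels_per_degree(WIDTH_55_INCH_16_9_M, px, distance_m=2.5)
    print(f"{label} on a 55-inch TV at 2.5 m: {ppd:.0f} px/deg")
```

With those numbers both resolutions land above the rough acuity limit, which is the point: sit far enough back and the extra pixels stop being resolvable, compression artifacts aside.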

[–] bobaworld@lemmy.world 3 points 1 month ago

I know I'm a display tech nerd, but can people really not tell the difference? Even going from a 1440p to a 4K monitor was, to me, a very noticeable improvement in clarity. And there's a huge difference in the way games look on my living room TV at 1080p compared to 4K.

[–] nyan@lemmy.cafe 3 points 1 month ago* (last edited 1 month ago)

The question for me isn't whether there's a difference I might be able to see if I were paying attention to the picture quality; it's whether the video quality is sufficiently bad to distract me from the content. And only hypercompressed, macroblocked-to-hell-and-back ancient MPEG-1 files or multiply-recopied VHS tapes from the Dark Ages are ever that bad for me. In general, I'm perfectly happy with 480p. Of course, I might just have a higher-than-average immunity to bad video. (Similarly, I can spot tearing if I'm looking for it, but I do have to be looking for it.)
