It creates more problems than it solves. You would need an order of magnitude more processing power to play a game on it. Personally I would prefer 4K at a higher framerate, or even 1080p if it improves response times.
8K video files are massive. You need better codecs to handle them, and those aren't widely supported yet. Storage also isn't getting cheaper the way it used to.
Also, there is no content. Nobody wants to store and transmit such massive amounts of data over the internet.
HDMI cables will also start failing at higher resolutions: that five-year-old cable will begin dropping out when you try it at 8K.
4K is barely worth the tradeoffs.
A couple of things - every jump like that in resolution is about a 10% increase in size at the source level. So 2K is ~250GB, 4K is ~275GB. I haven't had to deal with 8K myself yet, but it would be at ~300GB. Then you compress all that for places like Netflix and the size goes down drastically. Add in codec improvements over time (like x264 -> x265) and you might actually end up with an identical compressed size while carrying 4x more pixels.
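Rough numbers, just to make that rule of thumb concrete - a quick sketch where the ~250GB base size and the 10% per jump are the figures above, not measurements:

```python
# Back-of-the-envelope take on the "+~10% per resolution jump at the source
# level" rule of thumb above. The ~250 GB base and the 10% factor are the
# figures from this thread, not measured master/remux sizes.
base_2k_gb = 250
growth_per_jump = 1.10  # assumed ~10% increase per jump

sizes = {"2K": base_2k_gb}
sizes["4K"] = sizes["2K"] * growth_per_jump
sizes["8K"] = sizes["4K"] * growth_per_jump

for res, gb in sizes.items():
    print(f"{res}: ~{gb:.0f} GB at the source level")
# Prints roughly 250, 275, and ~300 GB - and that's before the distribution
# encode (x264/x265 etc.) shrinks things drastically for streaming.
```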
HDMI is digital. It doesn't start failing because of increased bandwidth; there's nothing consumable. It either works or it doesn't.
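For a rough sense of how much more data 8K pushes through the cable, here's a back-of-the-envelope pixel-rate calculation (raw pixel data only; it ignores blanking, audio and line-coding overhead, so real HDMI link-rate requirements sit a bit higher):

```python
# Raw (uncompressed) pixel data rate at a given resolution and frame rate.
def raw_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    print(f"{name} @ 60 Hz: ~{raw_gbps(w, h, 60):.1f} Gbit/s raw")

# ~3.0, ~11.9 and ~47.8 Gbit/s respectively. 8K/60 is well beyond the
# 18 Gbit/s an HDMI 2.0-era cable was specified to carry; 8K generally
# needs an HDMI 2.1 (48 Gbit/s) link.
```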
Yeah, legitimate 8K use cases are ridiculously niche, and I mean... they really only have value if you're talking about an utterly massive display, probably around 90 inches or larger, and even then only if you're sitting fairly close to it.
The best use cases I can think of are for games where you're already using DLSS, and can just upscale from the same source resolution to 8K rather than 4K? Maybe something like an advanced CRT filter that can better emulate a real CRT with more resolution to work with, where a pixel art game leaves you with lots of headroom for that effect? Maybe there's value in something like an emulated split screen game, to effectively give 4 players their own 4K TV in an N64 game or something?
But uh... yeah, those are all use cases far from the average consumer. Most people I talk to don't even really appreciate the 1080p -> 4K jump, and 4x-ing your resolution again is a massive processing-power ask in a world where you can't just... throw together multiple GPUs in SLI or something. Even if money is no object, 8K in mainline gaming will require some ugly tradeoffs for the next several years, and probably forever if devs keep pushing visuals and targeting upscaled 4K 30/60 on the latest consoles.
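For scale, here's the raw pixel math behind that "4x-ing" (simple arithmetic, nothing thread-specific assumed):

```python
# Pixel counts: each step in 1080p -> 4K -> 8K quadruples the pixels.
resolutions = {"1080p": (1920, 1080),
               "4K":    (3840, 2160),
               "8K":    (7680, 4320)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.0f}x 1080p)")
# 1080p: 2,073,600 (1x), 4K: 8,294,400 (4x), 8K: 33,177,600 (16x).
# Every one of those pixels has to be shaded (or at least upscaled) every
# frame, which is where the processing-power cost comes from.
```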
4K for me as a developer means that I can have a couple of source files and a browser with the API documentation open at the same time. I reckon I could legitimately use an 8K screen - get a terminal window or two open as well, keep an eye on builds and deployments while I'm working on a ticket.
Now yes - gaming and watching video at 8K. That's phenomenally niche, and very much a case of diminishing returns. But some of us have to work for a living as well, alas, and would like them pixels.
Good point - 4K text for programming is pretty fantastic. If you don't mind small text and use a big monitor, I could see 8K bringing some worthwhile clarity improvements to productivity workflows. It's probably better for monitors than it is for TVs.
Even as a dev, I use a 32" QHD screen for programming. If I went 4K, I would need to use 150% scaling, and that breaks a LOT of stuff.
Everything is built for 100% scaling. Every time I've plugged my PC into a 4K display I've regretted it. It drops to 30Hz (on HDMI) or glitches out or something. Even when it doesn't, it's never as smooth.
I have a 43" 4K, and at that physical size 100% display scaling is appropriate (despite Windows trying to run it at 300% out of the box), and it is legitimately useful. It's effectively four 1080p screens in a grid with no bezels between them.
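The scaling disagreement above mostly comes down to pixel density. A quick sketch - the ~100 PPI "comfortable at 100% scaling" figure is a common rule of thumb, not something from this thread:

```python
import math

# Pixel density (pixels per inch) for the monitors mentioned in the thread.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

setups = {
    '32in QHD (2560x1440)': (2560, 1440, 32),
    '32in 4K  (3840x2160)': (3840, 2160, 32),
    '43in 4K  (3840x2160)': (3840, 2160, 43),
}
for name, (w, h, d) in setups.items():
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")
# ~92, ~138 and ~102 PPI. The 32" 4K panel sits far above the ~100 PPI
# comfort zone for 100% scaling (hence the 150% suggestion), while the
# 43" 4K lands right around QHD-at-32" density, so 100% scaling works.
```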