this post was submitted on 26 Aug 2025
581 points (99.0% liked)

Technology

cross-posted from: https://programming.dev/post/36378173

Comments

[–] bassomitron@lemmy.world 3 points 23 hours ago* (last edited 16 hours ago) (5 children)

Out of curiosity, why do you refuse to support Nvidia? AMD isn't some saint; they're a shitty corporation just like Nvidia. They got lucky when Jim Keller saved their asses with the Ryzen architecture in the mid-2010s. They haven't really innovated a goddamn thing since then, and it shows.

Edit: I get it, I get it, Nvidia is a much shittier company and I agree. I was pretty drunk last night before bed, please pardon the shots fired

[–] domi@lemmy.secnd.me 46 points 20 hours ago (1 children)

Besides what was mentioned below, it's not about making competitive products but about Nvidia being an absolute asshole since the 2000s, and they've gotten even worse since the crypto and AI craze started. AMD and Nvidia are both corporations, but they're not even playing the same game when it comes to being anti-competitive.

There's a reason why Wikipedia has a controversies section on Nvidia: https://en.m.wikipedia.org/wiki/Nvidia#Controversies

That list is far from exhaustive. There's so much more about Nvidia that you'll remember vividly if you were a PC gamer in the 2000s and 2010s with an AMD GPU, like:

  • When they pushed developers to use an unnecessary amount of tessellation because they knew tessellation performed worse on AMD hardware
  • When they pushed their GameWorks framework, which heavily gimped AMD GPUs
  • When they pushed their PhysX framework, which automatically fell back to the CPU on systems with AMD GPUs
  • When they disabled their GPUs in their driver if an AMD GPU was detected in the same system
  • When they cheated in benchmarks by adding optimizations specific to those benchmarks
  • When they shipped an incomplete Vulkan implementation but claimed compliance

Nvidia has been gimping gaming performance and visuals since forever, for AMD GPUs and even their own customers, and we haven't even gotten to DLSS and raytracing yet.

I refuse to buy anything Nvidia until they stop abusing their market position at every chance they get.

[–] amorpheus@lemmy.world 47 points 21 hours ago (1 children)

they're a shitty corporation just like Nvidia

Neither of them are anyone's friend, but claiming they're the same level of nasty is a bit of a stretch.

[–] Crashumbc@lemmy.world -5 points 17 hours ago

Not saying that supporting the underdog isn't good.

I just don't think AMD is any less "nasty"; the only thing stopping them is the lack of power to do the same.

[–] ElectroLisa@piefed.blahaj.zone 32 points 22 hours ago (2 children)

Not OC, but I don't want to deal with Nvidia's proprietary drivers. AMD cards "just work" on Linux.

[–] Redex68@lemmy.world 4 points 21 hours ago (3 children)

Except that AMD doesn't support HDMI 2.1 on Linux (not their fault, to be fair, but still).

[–] naitro@lemmy.world 1 points 9 hours ago

Is that the case on mobile APUs as well? I'm pretty sure my laptop with a 7840U does 4K 120Hz.

[–] i_am_hiding@aussie.zone 9 points 19 hours ago (1 children)

This may be an unpopular opinion but who cares? I'll use DVI if I have to.

[–] Redex68@lemmy.world 3 points 17 hours ago (1 children)

I personally don't have a need for it, but if someone has a 4K 120Hz TV or monitor without DisplayPort that they want to use as such, it's kinda stupid that they can't.

[–] WhyJiffie@sh.itjust.works 3 points 16 hours ago

yeah, but that's the fault of the HDMI standards group. AMD could only support HDMI 2.1 if they closed their driver down. I guess this can't be fixed with a DP-to-HDMI adapter either, right?

my opinion: DisplayPort is superior, and if I have an HDMI-only screen with supposed 4K 120Hz support, I treat it as false info.

[–] Truscape@lemmy.blahaj.zone 4 points 20 hours ago (1 children)

Intel cards do, I think, so that's a non-NVIDIA option.

[–] bluecat_OwO@lemmy.world 2 points 19 hours ago

yeah intel (⁠⊙⁠_⁠◎⁠)

[–] bassomitron@lemmy.world 3 points 22 hours ago

That's completely valid. I haven't had issues with Nvidia on Linux myself, but I know it's definitely a thing for a lot of people.

[–] frezik@lemmy.blahaj.zone 1 points 17 hours ago (1 children)

Haven't innovated? 3D chip stacking?

CPU companies generally don't overhaul their microarchitecture, especially when it works.

[–] Arcane2077@sh.itjust.works 1 points 15 hours ago

Intel didn't for 7 years, but they both started and ended that trend.