[–] SavageCoconut@lemmy.world 36 points 1 day ago (2 children)

And they called me a madman for spending months tuning the CO (Curve Optimizer) of my 5800X3D to its limits and OCing my 3200 MHz Crucial memory to 3800 MHz. It seems this setup will stay with me until DDR6 arrives. I hope prices are back to normal by then.
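
For context, 3800 MT/s is the usual Zen 3 sweet spot because it lets the Infinity Fabric clock run 1:1 with the memory clock at 1900 MHz. A quick back-of-the-envelope sketch of the arithmetic, assuming dual-channel DDR4:

```python
# Rough numbers for the DDR4-3200 -> 3800 overclock above, assuming the
# usual Zen 3 rule of running FCLK:UCLK:MCLK at a 1:1:1 ratio.
transfer_rate = 3800              # MT/s; DDR does two transfers per clock
mclk = transfer_rate / 2          # 1900 MHz actual memory clock
fclk = mclk                       # 1:1 -> 1900 MHz Infinity Fabric clock
bandwidth_gbs = transfer_rate * 8 * 2 / 1000  # 8 bytes/transfer, 2 channels
print(f"MCLK/FCLK: {mclk:.0f} MHz, peak bandwidth ~{bandwidth_gbs:.1f} GB/s")
```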

[–] ouRKaoS@lemmy.today 21 points 1 day ago (2 children)

I don't think there will be a DDR6. I think the AI bubble is going to pop, and all these data centers will become "mainframe centers" that your minimum-spec'd home terminal connects to so it can do all the computing for you on "Our lightning-fast multi-core supercomputer with terabytes of memory!"

😁😭🤮

[–] echodot@feddit.uk 3 points 12 hours ago (1 children)

You can't run normal programs on their weird AI architecture. That's the problem everyone has with all of the RAM as well: when the AI bubble pops, we won't get loads of cheap RAM, because it's all configured for AI and doesn't really work in anything else. They can't just pivot; that's why they're so eager to make AI a thing.

[–] ouRKaoS@lemmy.today 1 points 11 hours ago (1 children)

I'm sure you can't do it easily, but there will no doubt be ridiculous AI vibe-coded attempts at making it work that end in a catastrophic failure/data breach/scandal.

[–] echodot@feddit.uk 2 points 9 hours ago

My understanding is that the RAM architecture is built around insanely quick read/write access but doesn't really store data for more than three or four seconds at a time. Most modern programs expect RAM to hold on to data for basically as long as they need, until they next access it. So most programs just won't fit into memory configured like that, and I think it's a hardware thing, not something you can change with software.

[–] carpelbridgesyndrome@sh.itjust.works 17 points 1 day ago* (last edited 1 day ago) (2 children)

I doubt it. Those AI computers are built in a really weird way and have a lot of hardware that isn't really useful outside an AI/HPC context. Some of it, like the weird card-to-card network topology, can be reconfigured, but the rest can't easily be. The servers are rather aggressively designed around keeping as many GPUs fed as possible, which makes them kinda weird for other jobs. Those datacenter cards are missing enough video hardware (texture units, for example) to make gaming hard, and I'm not sure there's that much consumer demand for linear algebra accelerators (see the sketch below for what that workload boils down to). If they can't find more HPC jobs they may go under. Movie studios could have interesting opportunities here, but IIRC they are still primarily using CPUs in all their software.

The clusters in the UAE and Saudi Arabia might be repurposable for nuclear weapons research which isn't great.
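
To make the "linear algebra accelerator" point concrete: the bread-and-butter workload these cards are built for is a dense matrix multiply, which a few lines of NumPy can stand in for (on CPU here, purely to illustrate the operation, not the hardware):

```python
# A toy stand-in for an accelerator's core workload: a single-precision
# dense matrix multiply (GEMM), with a rough FLOP/s estimate.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3  # multiply-accumulates in an n x n GEMM
print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s")
```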

[–] olympicyes@lemmy.world 2 points 13 hours ago

My understanding is that the AI companies push their servers so hard that the components are basically consumables. Consumers don't really press their machines to the point of physical exhaustion.

[–] fruitycoder@sh.itjust.works 1 points 1 day ago (1 children)

GPGPU is probably going to see some real usage. There was even an interesting talk at the X.Org conference about turning the video hardware into virtual services running on GPGPU-focused hardware.

I've talked with some of the HPC programmers too, who are already trying to find creative repurposes lol
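
As a toy example of what GPGPU reuse looks like in practice, here is general-purpose array math pushed through a GPU with no graphics pipeline involved; this assumes an NVIDIA card and the CuPy library, which are just one illustrative combination:

```python
# General-purpose compute on a GPU: no textures, no display, just math.
# Assumes an NVIDIA GPU with CuPy installed (pip install cupy-cuda12x).
import cupy as cp

x = cp.arange(1_000_000, dtype=cp.float32)
y = cp.sqrt(x) * 2.0       # elementwise ops run as CUDA kernels on the GPU
total = float(y.sum())     # reduction on the GPU, scalar copied back to host
print(total)
```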

[–] tal@lemmy.today 1 points 20 hours ago* (last edited 20 hours ago)

I think it's fair to say that AI is not the only application for that hardware, but I also think carpelbridgesyndrome's point was that they aren't really well-suited to replacing conventional servers, where all local computing just moves to a server, which is the sort of thing ouRKaoS was worried about. Maybe for some very specialized use cases, like cloud gaming in some genres. I'd also add that the physical buildings have way more cooling capacity than is necessary for conventional servers, so they probably wouldn't be the most cost-effective approach even if you replaced the computing hardware in the buildings.

[–] SharkAttak@kbin.melroy.org 16 points 1 day ago (1 children)

I don't know; if you want them to last, I'd ease up on the overclock.

CO (Curve Optimizer) on Zen 3 X3D chips is always an undervolt, not an overclock: it applies a negative offset to the chip's voltage/frequency curve.
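
For anyone unfamiliar with what that tuning actually does, a rough sketch; the millivolts-per-count figure is an assumption, since AMD doesn't publish an exact value:

```python
# Curve Optimizer shifts the per-core voltage/frequency curve by a signed
# count. Each count is commonly reported as roughly 3-5 mV; the midpoint
# used here is an assumption, not an AMD spec.
co_count = -30           # typical "tuned to its limits" value on Zen 3
mv_per_count = 3.75      # assumed midpoint of the commonly reported range
offset_mv = co_count * mv_per_count
print(f"CO {co_count} ~ {offset_mv:.0f} mV at a given frequency point")
```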