When people ask me what artificial intelligence is going to do to jobs, they’re usually hoping for a clean answer: catastrophe or overhype, mass unemployment or business as usual. What I found after months of reporting is that the truth is harder to pin down—and that our difficulty predicting it may be the most important part of …

https://web.archive.org/web/20260210152051/www.theatlantic.com/magazine/2026/03/ai-economy-labor-market-transformation/685731/

In 1869, a group of Massachusetts reformers persuaded the state to try a simple idea: counting.

The Second Industrial Revolution was belching its way through New England, teaching mill and factory owners a lesson most M.B.A. students now learn in their first semester: that efficiency gains tend to come from somewhere, and that somewhere is usually somebody else. The new machines weren’t just spinning cotton or shaping steel. They were operating at speeds that the human body—an elegant piece of engineering designed over millions of years for entirely different purposes—simply wasn’t built to match. The owners knew this, just as they knew that there’s a limit to how much misery people are willing to tolerate before they start setting fire to things.

Still, the machines pressed on.

...

[–] bunchberry@lemmy.world 4 points 1 day ago (1 children)

Moore's law died a long time ago. Engineers pretended it was still going for years by abusing the nanometer metric: if they found a clever way to use the space more effectively, they treated it as if they had packed more transistors into the same area and called it a smaller process node, even though, quite literally, they did not shrink the transistors or fit more of them onto the die.

This started happening around 2015. These clever tricks were always oversold, because there is no objective metric to say that a particular trick on a 20nm node really gets you performance equivalent to a 14nm node, which left huge leeway for exaggeration. In reality, actual performance gains have drastically slowed since then, and the cracks really started to show with Nvidia's 5000-series GPUs.

The 5090 is only so powerful because the die is larger, so it fits more transistors overall, not because the transistor density actually improved. If you account for the die size, it's actually less efficient than the 4090 and significantly less efficient than the 3090. To keep up the appearance of generational upgrades, Nvidia has been shipping AI frame-generation software for its GPUs and artificially locking it to the newest series. The program Lossless Scaling proves that you can, in principle, run AI frame generation on any GPU, even ones from over a decade ago, and that Nvidia's restriction is not a hardware limitation but a way to make up for the lack of real improvements in the GPU die.
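To make the "account for the die size" point concrete, here is a minimal sketch that computes transistor density from die area. The transistor counts and die areas below are approximate publicly reported figures I'm supplying for illustration (they are not from the comment above), and density is only a proxy: the efficiency claim is about performance per unit of die area, which this doesn't measure.

```python
# Rough transistor-density comparison across recent Nvidia flagship dies.
# Counts and areas are approximate public figures, assumed here for
# illustration; density alone ignores clocks, memory bandwidth, etc.
specs = {
    "RTX 3090 (GA102)": {"transistors_bn": 28.3, "die_mm2": 628.4},
    "RTX 4090 (AD102)": {"transistors_bn": 76.3, "die_mm2": 608.5},
    "RTX 5090 (GB202)": {"transistors_bn": 92.2, "die_mm2": 750.0},
}

for name, s in specs.items():
    # millions of transistors per square millimetre of die area
    density = s["transistors_bn"] * 1000 / s["die_mm2"]
    print(f"{name}: ~{density:.0f} MTr/mm^2 on a {s['die_mm2']:.0f} mm^2 die")
```

On these assumed numbers, the 5090's density comes out roughly flat versus the 4090, so most of its extra transistors come from the larger die, which is the comparison being made above.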

Chip improvements have been slowing drastically for over a decade now, and the industry just keeps trying to paper it over.

[–] Holytimes@sh.itjust.works 1 points 1 day ago

To be fair, Lossless Scaling's frame gen has a number of shortcomings, performance issues, and quality problems compared to Nvidia's offerings.

While it's "possible" to run frame gen on any hardware, the quality and performance are definitely a sizeable downgrade.