In a new paper, several Stanford economists studied payroll data from the private company ADP, which covers millions of workers, through mid-2025. They found that young workers aged 22–25 in “highly AI-exposed” jobs, such as software developers and customer service agents, experienced a 13 percent decline in employment since the advent of ChatGPT. Notably, the economists found that older workers and less-exposed jobs, such as home health aides, saw steady or rising employment. “There’s a clear, evident change when you specifically look at young workers who are highly exposed to AI,” Stanford economist Erik Brynjolfsson, who wrote the paper with Bharat Chandar and Ruyu Chen, told the Wall Street Journal.
In five months, the question "Is AI reducing work for young Americans?" has received its fourth answer: from possibly, to definitely, to almost certainly no, to plausibly yes. You might find this back-and-forth annoying. I think it's fantastic. This is a model for what I want from public commentary on social and economic trends: smart, quantitatively rich, good-faith debate of issues of seismic consequence to American society.
How can AI be destroying jobs? I haven't seen a single good implementation, except maybe dev work, and even then it's not speeding anything up.
As a consultant/freelance dev whose entire workload for the past year has been cleaning up AI slop: no, with dev it hasn't been what I would call a smooth or even good implementation. For my wallet? It's been a fantastic implementation. For everyone else? Not so much.
The thing is, as a TOOL it's great, depending on the model. As a rubber duck? Fantastic. As something that the majority of companies have used for vibe coding to build products end to end? No, it's horrible. It can't scale anything, implements exploits left, right, and center, and unlike a junior dev it doesn't learn anything. If you don't hold its hand during a build, it'll quickly go off the rails. It'll implement old APIs or libraries simply because those things have the most documentation attached to them.
An example: a few weeks ago a client wanted to set up a private Git instance with Forgejo. They had Claude Code set it up for them. The problem? Claude went with Forgejo 1.20, while Forgejo is currently on 12.0. Massive security hole right there. Why did Claude do that? 1.20 had more documentation than 12.0. And when I say "documentation," I could simply mean blog posts, articles, whatever talked about it more than the latest version, because LLMs will lean on that material when making decisions for builds. You also see it if you want something in Rust + Smithy: the majority of the time the AI will go for a very outdated version of Smithy, because that's what a lot of people talked about at one point. So you're generating massive tech debt before even throwing anything into production.
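One cheap guard against this failure mode is to refuse any AI-generated install script whose pinned version falls below the project's current release. A minimal sketch, assuming GNU `sort -V` is available; the `MIN_VERSION` and `INSTALLED` values are illustrative placeholders, not output from any real install:

```shell
#!/bin/sh
# Compare two dotted version strings using GNU sort's version ordering.
# Returns 0 (true) when $1 >= $2 under version-number semantics,
# so 12.0 correctly outranks 1.20 (lexicographic comparison would too,
# but sort -V also handles cases like 9.0 vs 12.0 that plain string
# comparison gets wrong).
ver_ge() {
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | tail -n1)" = "$1" ]
}

# Hypothetical guard for an AI-generated setup script: MIN_VERSION is
# whatever the project's release page currently shows (12.0 for Forgejo
# in the anecdote above); INSTALLED is the version the LLM picked.
MIN_VERSION="12.0"
INSTALLED="1.20"

if ver_ge "$INSTALLED" "$MIN_VERSION"; then
    echo "ok: $INSTALLED >= $MIN_VERSION"
else
    echo "refusing stale version $INSTALLED (need >= $MIN_VERSION)" >&2
fi
```

The point isn't this particular script; it's that the model's version choice is a guess weighted by training-data volume, so it should be checked against the live release feed like any other untrusted input.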
Now, like I said, as a tool? A problem solver for a function you can't figure out? It's great. The issue is that companies aren't seeing it as a tool; they're seeing it as a cost-saving replacement for a living human being, which it is not. It's like replacing a construction worker with a hammer attached to a drone and then wondering why your house frame keeps falling over.
AI is this decade’s Rational Rose.