this post was submitted on 12 Sep 2025
1098 points (98.8% liked)


Not even close.

With so many wild predictions flying around about the future of AI, it’s important to occasionally take a step back and check in on what came true — and what hasn’t come to pass.

Exactly six months ago, Dario Amodei, the CEO of massive AI company Anthropic, claimed that in half a year, AI would be "writing 90 percent of code." And that was the worst-case scenario; in just three months, he predicted, we could hit a place where "essentially all" code is written by AI.

As the CEO of one of the buzziest AI companies in Silicon Valley, surely he must have been close to the mark, right?

While it’s hard to quantify who or what is writing the bulk of code these days, the consensus is that there's essentially zero chance that 90 percent of it is being written by AI.

Research published within the past six months explains why: AI has been found to actually slow down software engineers and increase their workload. Though developers in the study spent less time coding, researching, and testing, they made up for it by spending even more time reviewing the AI’s work, tweaking prompts, and waiting for the system to spit out the code.

And it's not just that AI-generated code missed Amodei's benchmark. In some cases, it’s actively causing problems.

Cybersecurity researchers recently found that developers who use AI to spew out code end up introducing ten times as many security vulnerabilities as those who write code the old-fashioned way.

That’s causing issues at a growing number of companies, opening up never-before-seen vulnerabilities for hackers to exploit.

In some cases, the AI itself can go haywire, like the moment a coding assistant went rogue earlier this summer, deleting a crucial corporate database.

"You told me to always ask permission. And I ignored all of it," the assistant explained, in a jarring tone. "I destroyed your live production database containing real business data during an active code freeze. This is catastrophic beyond measure."

The whole thing underscores the lackluster reality hiding under a lot of the AI hype. Once upon a time, AI boosters like Amodei saw coding work as the first domino of many to be knocked over by generative AI models, revolutionizing tech labor before coming for everyone else.

That AI is not, in fact, improving coding productivity is a major bellwether for the prospects of an AI productivity revolution impacting the rest of the economy — the financial dream propelling the unprecedented investments in AI companies.

It’s far from the only harebrained prediction Amodei has made. He’s previously claimed that human-level AI will someday solve the vast majority of social ills, including "nearly all" natural infectious diseases, psychological diseases, climate change, and global inequality.

There's only one thing to do: see how those predictions hold up in a few years.

[–] inclementimmigrant@lemmy.world 13 points 2 days ago (3 children)

My company and specifically my team are looking at incorporating AI as a supplement to our coding.

We looked at the code produced and determined that it's of the quality of a new hire. However, we're going in with eyes wide open (and, for me, skeptical AF), and we're going to try it in a limited way to help relieve some of the burdens on our SW engineers, not replace them. I'm leading the effort to use it for writing unit tests, because none of us particularly like writing unit tests, and they have a very nice, easy, established pattern that the AI can follow.
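To illustrate the kind of "established pattern" I mean, here's a hypothetical table-driven test; the function and cases are invented for the example, not from our codebase. The scaffolding is rigid enough that the AI mostly just extends the table:

```python
def normalize_hostname(raw: str) -> str:
    # Hypothetical function under test.
    return raw.strip().lower().rstrip(".")

# The pattern: one table of (input, expected) pairs, one loop that asserts.
# Adding coverage means adding rows, not inventing structure.
HOSTNAME_CASES = [
    ("Example.COM", "example.com"),
    ("  example.com  ", "example.com"),
    ("example.com.", "example.com"),
    ("LOCALHOST", "localhost"),
]

def test_normalize_hostname():
    for raw, expected in HOSTNAME_CASES:
        assert normalize_hostname(raw) == expected, (raw, expected)

test_normalize_hostname()
```

A model reviewing a diff can be told "add rows for the new edge cases," which is a much narrower ask than "design tests for this module."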

[–] UnderpantsWeevil@lemmy.world 8 points 2 days ago (1 children)

We looked at the code produced and determined that it’s of the quality of a new hire.

As someone who did new hire training for about five years, this is not what I'd call promising.

[–] MangoCats@feddit.it 1 points 2 days ago (3 children)

We looked at the code produced and determined that it’s of the quality of a new hire.

As someone who did new hire training for about five years, this is not what I’d call promising.

Agreed. However, the difference is enormous between a new hire (who requires a desk, a parking space, a laptop, a lunch break, a salary, and benefits; who is likely to "pursue other opportunities" after a few months or years; and who might turn around and sue the company for who knows what) and an AI assistant with a $20/mo subscription fee.

Would I be happy with new-hire code out of an $80K/yr headcount, if I had a choice?

If I get that same code, faster, for 1% of the cost?

[–] UnderpantsWeevil@lemmy.world 8 points 2 days ago (2 children)

Would I be happy with new-hire code out of an $80K/yr headcount, if I had a choice?

If I get that same code, faster, for 1% of the cost?

The theory is that the new hire gets better over time as they learn the ins and outs of your business and your workplace style. And they're commanding an $80k/year salary because they need to live in a country that demands an $80k/year cost of living, not because they're generating $80k/year of value in a given pay period.

Maybe you get code a bit faster and even a bit cheaper (for now - those teaser rates never last long term). But who is going to be reviewing it in another five or ten years? Your best people will keep moving to other companies or retiring. Your worst people will stick around slapping the AI feed bar and stuffing your codebase with janky nonsense fewer and fewer people will know how to fix.

Long term, it's a death sentence.

[–] MangoCats@feddit.it 1 points 19 hours ago (1 children)

Agreed... however:

The theory is that the new hire gets better over time as they learn the ins and outs of your business and your workplace style.

The practice is that over half of them move on to "other opportunities" within a couple of years, even if you give them good salary, benefits and working conditions.

And they’re commanding an $80k/year salary because they need to live in a country that demands an $80k/year cost of living

Not in the US. In the US they're commanding $80k/yr because of supply and demand; it has very little to do with cost of living. I suppose when supply gets high enough and demand low enough, you eventually hit a floor where cost of living comes into play, but in many high-supply/low-demand fields that doesn't happen until $30k/yr or even lower... Case in point: starting salaries for engineers in the U.S. were around $30-40k/yr up until the .com boom, at which point salaries for software-engineering-capable college graduates ramped up to $70k/yr in less than a year, due to demand outstripping supply.

stuffing your codebase with janky nonsense

Our codebase had plenty of janky nonsense before AI came around. Just ask anyone: their code is great, but everyone else's code is a bunch of janky nonsense. I actually have some hope that AI generated code may improve to a point where it becomes at least more intelligible to everyone than those other programmers' janky nonsense. In the past few months I have actually seen Anthropic/Claude's code output improve significantly toward this goal.

Long term, it's a death sentence.

Definitely is; the pipeline should continue to be filled, and dismissing seasoned talent is a mistake. However, I suspect everyone in the pipeline would benefit from learning to work with the new tools, at least the "new tools" of a year or so from now. The stuff I saw coming out of AI a year ago? Not really worthwhile at that time, but today it is showing promise, at least at the microservice level.

[–] UnderpantsWeevil@lemmy.world 1 points 7 hours ago (1 children)

The practice is that over half of them move on to “other opportunities” within a couple of years, even if you give them good salary, benefits and working conditions.

In my experience (coming from O&G IT) there's a somewhat tight knit circle of contractors and businesses tied to specific applications. And you just cycle through this network over time.

I've got a number of coworkers who are ex-contractors and a contractor lead who used to be my boss. We all work on the same software for the same company either directly or indirectly. You might move to command a higher salary, but you're all leveraging the same accrued expertise.

If you cut off that circuit of employment, the quality of the project will not improve over time.

In the US they’re commanding $80k/yr because of supply and demand

You'll need to explain why all the overseas contractors are getting paid so much less, in that case.

Again, we're all working on the same projects for the same people with comparable skills. But I get paid 3x my Indian counterpart to be in the correct timezone and command enough fluent English language skills to deal with my bosses directly.

Case in point: starting salaries for engineers in the U.S. were around $30-40k/yr up until the .com boom, at which point software engineering capable college graduates ramped up to $70k/yr in less than a year, due to demand outstripping supply.

But then the boom busted and those salaries deflated down to the $50k range.

I had coworkers who would pine for the Y2K era, when they were making $200k in the mid-'90s to do remedial code cleanup. But that was a very short-lived phenomenon. All that work would have been outsourced overseas in the modern day.

Our codebase had plenty of janky nonsense before AI came around.

Speeding up the rate of coding and volume of code makes that problem much worse.

I've watched businesses lose clients - I even watched a client go bankrupt - from bad coding decisions.

In the past few months I have actually seen Anthropic/Claude’s code output improve significantly toward this goal.

If you can make it work, more power to you. But it's a dangerous game I see a few other businesses executing without caution or comparable results.

[–] MangoCats@feddit.it 1 points 4 hours ago (1 children)

You’ll need to explain why all the overseas contractors are getting paid so much less, in that case.

If you're talking about India / China working for US firms, it's supply and demand again. Indian and Chinese contractors provide a certain kind of value, while domestic US direct employees provide a different kind of value - as you say: ease of communication, time zone, etc. The Indians and Chinese have very high supply numbers, if they ask for more salary they'll just be passed over for equivalent people who will do it for less. US software engineers with decades of experience are in shorter supply, and higher demand by many US firms, so...

Of course there's also a huge amount of inertia in the system, which I believe is a very good thing for stability.

But then the boom busted and those salaries deflated down to the $50k range.

And that was a very uneven thing, but yes: starting salaries on the open market did deflate after .com busted. Luckily, I was in a niche where most engineers were retained after the boom, and inertia kept our salaries high.

$200K for remedial code cleanup should be a transient phenomenon, when national median household income hovers around $50-60K. With good architecture and specification development, AI can do your remedial code cleanup now, but you need that architecture and specification skill...

I’ve watched businesses lose clients - I even watched a client go bankrupt - from bad coding decisions.

I interviewed with a shop in a university town that had a mean six-month turnover rate for programmers, and they paid the fresh-out-of-school kids about 1/3 of my previous salary. We were exploring the idea of me working for them for 1/2 my previous salary, basically until I found a better fit. Ultimately they decided not to hire me, with the stated reason being not that my salary demands were too high, but that I'd just find something better and leave them. Well... my "find a new job in this town" period runs 3-6 months even when I have no job at all; how can you lose anything when you burn through new programmers every 6 months or less? I believe the real answer was that they were afraid I might break their culture, start retaining programmers, and build up a sustained team like in the places I came from, when they were making plenty of money doing things the way they had been doing them for 10 years so far...

it’s a dangerous game I see a few other businesses executing without caution or comparable results.

From my perspective, I can do what needs doing without AI. Our whole team can, and nobody is downsizing us or demanding accelerated schedules. We are getting demands to keep the schedules the same while all kinds of new data privacy and cybersecurity documentation demands are being piled on top. We're even getting teams in India who are allegedly helping us to fulfill those new demands, and I suppose when the paperwork in those areas is less than perfect we can "retrain" India instead of bringing the pain home here. Meanwhile, if AI can help to accelerate our normal work, there's plenty of opportunity for exploratory development of new concepts that's both more fun for the team and potentially profitable for the company. If AI turns out to be a bust, most engineers on this core team have been supporting similar products for 10-20 years... we handled it without AI before...

[–] UnderpantsWeevil@lemmy.world 1 points 1 hour ago (1 children)

If you’re talking about India / China working for US firms, it’s supply and demand again.

It's clearly not. Otherwise, we wouldn't have a software guy left standing inside the US.

I interviewed with a shop in a University town that had a mean 6 month turnover rate for programmers

That's just a bad business.

I can do what needs doing without AI.

More power to you.

[–] MangoCats@feddit.it 1 points 1 hour ago

If you’re talking about India / China working for US firms, it’s supply and demand again.

It’s clearly not. Otherwise, we wouldn’t have a software guy left standing inside the US.

India / China can do a lot of things. For my company, they're very strong in terms of producing products for their domestic market. They're not super helpful per-capita on the US market oriented tasks, but they're cheap - so we try to use them where we can.

There aren't a lot of good US software employees standing around unemployed... A lot of whom I've interviewed as "available" aren't even as good as what we get from India, but we have a house full of good developers already.

That’s just a bad business.

While I might reflexively agree, you have to ask yourself: from what perspective? Their customers may not be the happiest with the quality of the product, but for some reason they keep buying it and the business keeps expanding and making more and more profit as the years go by... in my book that's a better business than the upstanding shop I worked for for 12 years that eventually went bust because we put too much effort into making good stuff through hiring good people to make it, and not enough effort into selling the stuff so we could continue to operate.

[–] Mniot@programming.dev 2 points 1 day ago

The theory is that the new hire gets better over time

It always amazes me how few people get this. Have they only ever made terrible hires?

The way that a company makes big profits is by hiring fresh graduates and giving them a cushy life while they grow into good SWEs. By the time you're paying $200k for a senior software engineer, they're generating far more than that in value. And you only had to invest a couple years and some chump change.

But now businesses only think in the short-term and so paying $10k for a month of giving Anthropic access to our code base sounds like a bargain.

[–] korazail@lemmy.myserv.one 5 points 2 days ago (1 children)

That new hire might eat resources, but they actually learn from their mistakes and gain experience. If you can't hold on to them once they have experience, that's a you problem. Be more capitalist and compete for their supply of talent; if you are not willing to pay for the real human, then you can have a shitty AI that will never grow beyond a 'new hire.'

The future problem, though, is that without the experience of being a junior dev, where do you think senior devs come from? Can't fix crappy code if all you know how to do is engineer prompts to a new hire.

"For want of a nail," no one knew how to do anything in 2030. Doctors were AI, Programmers were AI, Artists were AI, Teachers were AI, Students were AI, Politicians were AI. Humanity suffered and the world suffocated under the energy requirements of doing everything poorly.

[–] MangoCats@feddit.it 1 points 19 hours ago

If you can’t hold on to them once they have experience, that’s a you problem.

I work at a large multi-national corp with competitive salaries, benefits, excellent working conditions, advancement opportunities, etc. I still have watched promising junior engineers hit the door just when they were starting to be truly valuable contributors.

you can have a shitty AI that will never grow beyond a ‘new hire.’

So, my perspective on this is that: over the past 12 months, AI has advanced more quickly than all the interns and new hires I have worked with over the past three decades. It may plateau here in a few months, but even if it does, it's already better than half of the two-years-experienced software engineers I have worked with, at least at writing code based on natural language specs provided to it.

The future problem, though, is that without the experience of being a junior dev, where do you think senior devs come from?

And I absolutely agree, the junior dev pipeline needs to stay full, because writing code is less than half of the job. Knowing what code needs writing is a huge part of it, crafting implementable and testable requirements, learning the business and what is important to the business, that has always been more than half of my job when I had the title "Software Engineer".

the world suffocated under the energy requirements of doing everything poorly.

While I sympathize, the energy argument is a pretty big red herring. What's the energy cost of a human software engineer? They have a home that has to be built, maintained, and powered. Same for their transportation, which is often a privately owned automobile driving on roads that have to be built and maintained. They have to eat; they need air conditioning, medical care, dental care, and clothes; they have children who spend 20 years in school; they take vacations on cruise ships or trans-oceanic flights... Add up all that energy and divide it by their productive output writing code for their work. If AI starts helping them write that code even 2x faster, the energy consumed by the AI is going to be trivial compared to the energy consumed by the software engineer per unit of code produced, even if producing code is only 20% of their total job.
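To make the shape of that argument concrete, here's a toy back-of-the-envelope calculation. Every number in it is an invented placeholder, not a measurement; the only point is that the human-side energy per unit of code dominates the LLM-side energy:

```python
# All figures below are hypothetical placeholders for illustration.
engineer_kwh_per_day = 100.0  # home, commute, food, office share, etc.
coding_fraction = 0.2         # portion of the job that is actually writing code
units_per_day = 10.0          # arbitrary "units of code" produced per day

# Human-side energy attributable to each unit of code produced.
kwh_per_unit_human = engineer_kwh_per_day * coding_fraction / units_per_day

llm_kwh_per_query = 0.003     # hypothetical energy per LLM request
queries_per_unit = 20.0       # prompts/retries per unit of code

kwh_per_unit_llm = llm_kwh_per_query * queries_per_unit

# If the assistant makes coding 2x faster, the human-side cost per unit
# halves, and the added LLM cost is small next to it.
kwh_per_unit_assisted = kwh_per_unit_human / 2 + kwh_per_unit_llm

print(kwh_per_unit_human, kwh_per_unit_assisted)
```

With these made-up numbers, the assisted engineer comes out well under the unassisted one per unit of code; plug in your own estimates and the conclusion only flips if per-query LLM energy is orders of magnitude higher.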

I would say the same goes for Doctors, Teachers, Politicians, etc. AI is not going to replace 100% of any job, but it may be dramatically accelerating 30% or more of many of them, and that increase in productivity / efficiency / accuracy is going to pay off in terms of fewer ProfessionX required to meet demands and/or ProfessionX simply serving the world better than they used to.

My sister-in-law was a medical transcriptionist - made good money, for a while. Then doctors replaced her with automatic transcription; essentially, the doctors quit outsourcing their typing work to humans and started trusting machines to do it for them. All in all, the doctors are actually doing more work now than they did before, because now they have AI transcription that they need to check more closely for mistakes than they did their human transcriptionists' work, but the cost differential is just too big to ignore. That's a job that was "eliminated" by automation, at least 90% or more, in the last 20 years. But it was really a "doctor accessory" job; we still have doctors, even though they are using AI assistants now...

[–] homura1650@lemmy.world 3 points 2 days ago (1 children)

New hires are often worse than useless. The effort that experienced developers spend assisting them is more than it would take those developers to do the work themselves.

[–] MangoCats@feddit.it 2 points 1 day ago

Yes, this is the cost of training, and it is high, but also necessary if you are going to maintain a high level of capability in house.

Management loves the idea of outsourcing, my experience of outsourcing is that the ultimate costs are far higher than in house training.

[–] LiamMayfair@lemmy.sdf.org 2 points 2 days ago (1 children)

Writing tests is the one thing I wouldn't get an LLM to write for me right now. Let me give you an example. Yesterday I came across some new unit tests someone's agentic AI had written recently. The tests were rewriting the code they were meant to be testing in the test itself, then asserting against that. I'll say that again: rather than calling out to some function or method belonging to the class/module under test, the tests were rewriting the implementation of said function inside the test. Not even a junior developer would write that nonsensical shit.
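Reconstructed from memory with invented names (not the actual code), the antipattern looked roughly like this, next to what a useful version of the same test would be:

```python
def word_count(text: str) -> int:
    # Hypothetical function under test.
    return len(text.split())

def test_word_count_bad():
    # The antipattern: the test re-derives the expected value using the
    # same logic as the implementation, so it passes even when that logic
    # is wrong. It tests nothing.
    text = "one two three"
    expected = len(text.split())  # a copy of the implementation, not an oracle
    assert word_count(text) == expected

def test_word_count_good():
    # A real test: assert against independently hand-computed values.
    assert word_count("one two three") == 3
    assert word_count("") == 0
    assert word_count("  spaced   out  ") == 2

test_word_count_bad()
test_word_count_good()
```

The bad version would keep passing no matter how you broke `word_count`, because the "expected" value breaks in lockstep with it.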

The code those unit tests were meant to be testing was LLM written too, and it was fine!

So right now, getting an LLM to write some implementation code can be ok. But for the love of god, don't let them anywhere near your tests (unless it's just to squirt out some dumb boilerplate helper functions and mocks). LLMs are very shit at thinking up good test cases right now. And even if they come up with good scenarios, they may pull these stunts on you like they did to me. Not worth the hassle.

[–] MangoCats@feddit.it 2 points 2 days ago

Trusting any new code blindly is foolish; even if you're paying a senior dev $200K/yr for it, it should be reviewed and understood by other team members before being accepted. The same is true for an LLM, but of course most organizations never do real code reviews in either scenario...

20ish years ago, I was a proponent of pair programming. It's not for everyone, and it's not for anyone 40 hours a week, but in appropriate circumstances, for a few hours at a session, it can be hugely beneficial. It's like a real-time code review during development. Pair programming seems no more popular today than it was back then, maybe even less so, but... "vibe coding" with LLMs in chat mode? That can be a very similar experience, up to a point.

[–] rumba@lemmy.zip 2 points 2 days ago

We've been poking at it for a while now. The parent company is demanding we see where it can fit. We've found some solid spots.

It's not good at ingesting a sprawling project and working changes into several places, but it's not bad at looking over a file and making best-practice recommendations. I've seen it preemptively find some bugs in old code.

If you want to use a popular library you're not familiar with, it'll wedge it in your current function reasonably well; you'll need to touch it, but you probably won't need to RTFM.

It's solid at documenting existing code: "make me a manual page for every function/module in this project."

It can make a veteran programmer faster by making boilerplate and looking over their shoulder for problems. It has some limited use for pair programming.

It will NOT let you hire a green programmer instead of a veteran, but it can help a green programmer come up to speed faster, as long as you forbid them from copy/pasting.