this post was submitted on 01 May 2026
37 points (97.4% liked)
My experience is, they're not. Like the article says, they're just focused on MOAR, not on the quality of the output. It may take years for the unmaintainable code to cause problems, and by then they may have already been laid off anyway.
I don't write much code anymore, but when I did, a fair amount of it was embedded code, where fixing a bug is far more costly than just pushing a new build to a production server. I actively sought out automation back then, but the purpose of the automation was to help cover edge cases and better test the embedded code for flaws that traced through multiple layers of code.
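For a sense of what I mean by covering edge cases, here's a minimal sketch. The saturating-add function and its int16 limits are just an illustration (not from any real project of mine), but this is the shape of it: enumerate the boundary values where overflow bugs hide, then test every combination automatically instead of by hand.

```python
# Illustrative example: a saturating 16-bit add, the kind of small
# arithmetic helper embedded code is full of. The function name and
# limits are hypothetical, chosen just to show the testing pattern.

INT16_MAX = 32767
INT16_MIN = -32768

def saturating_add(a: int, b: int) -> int:
    """Add two int16 values, clamping to the range instead of wrapping."""
    total = a + b
    return max(INT16_MIN, min(INT16_MAX, total))

# Automation sweeps every pair of boundary values -- the cases a
# human tester skips and an overflow bug loves.
edge_cases = [INT16_MIN, INT16_MIN + 1, -1, 0, 1, INT16_MAX - 1, INT16_MAX]
for a in edge_cases:
    for b in edge_cases:
        result = saturating_add(a, b)
        assert INT16_MIN <= result <= INT16_MAX, (a, b, result)

# Spot-check the two clamping directions explicitly.
assert saturating_add(INT16_MAX, 1) == INT16_MAX
assert saturating_add(INT16_MIN, -1) == INT16_MIN
```

The point isn't this particular function; it's that the automation does the tedious combinatorial sweep so the human can think about which boundaries matter.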
Whenever I start a new software project, it usually starts with a short period of experimentation when I try out several things. Then, I coalesce on an architecture in my head (and eventually document it), and once I do that I can add more structure to the code.
Given the state of the AI tools today, I can see myself using them to accelerate all the little fiddly parts of this (especially if I can give them a coding standard and have them stick to it). But I wouldn't trust them more than that. I would always keep the architecture separate, because I don't trust the AI tools not to change it on me for no good reason.
Hoooooh boy, that if is doing a lot of heavy lifting, in my experience. I'm constantly telling the stupid little stochastic fuck to follow basic coding standards I've given it.
I don't use a lot of AI tooling outside of debugging and a little bit of command discovery, but fuck if the little shit isn't constantly rewriting my code into a shit style that I hate and constantly correct.
Those are all great habits.
But the time spent doing that is time not shipping code. Most companies don't give a flying fuck about quality, they just want to ship as much as possible to make as much money as possible.
When the cost to ship trash code trends toward zero, then there will not be value in shipping trash code. Companies will need to focus on software that is actually competitive (in a qualitative way) because otherwise their customers will just self-vend the slop code.