Well, AI code should be reviewed prior to merging into master, the same as any other code merged into master.
We have git for a reason.
So I would definitely say this was a human fault: either the reviewer's, or that of whoever decided that no review process (or an AI-driven one) was needed.
If I managed DevOps, I would require that AI code be signed off by a human at commit time, with that person taking responsibility and reviewing the changes made by the AI before pushing.
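As a rough illustration of that sign-off policy, here is a minimal sketch of a `commit-msg` git hook that rejects AI-assisted commits lacking a human `Signed-off-by:` trailer (the one `git commit -s` adds). The `AI-Assisted:` trailer name is just a convention assumed for this example, not anything standardized, and in practice you would mirror the same check server-side or in CI so it cannot simply be skipped.

```python
#!/usr/bin/env python3
"""commit-msg hook: refuse AI-assisted commits that lack a human sign-off.

Illustrative sketch only. The "AI-Assisted:" trailer is an assumed team
convention; "Signed-off-by:" is the standard trailer added by `git commit -s`.
"""
import re
import sys


def main() -> int:
    # git passes the path to the commit message file as the first argument
    msg_path = sys.argv[1]
    with open(msg_path, encoding="utf-8") as f:
        message = f.read()

    # Only enforce the rule for commits that declare AI involvement.
    ai_assisted = re.search(r"^AI-Assisted:\s*yes", message,
                            re.MULTILINE | re.IGNORECASE)
    signed_off = re.search(r"^Signed-off-by:\s*\S+", message, re.MULTILINE)

    if ai_assisted and not signed_off:
        sys.stderr.write(
            "commit rejected: AI-assisted changes need a human Signed-off-by "
            "trailer (git commit -s) from the person who reviewed them.\n"
        )
        return 1
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Saved as `.git/hooks/commit-msg` and made executable, this blocks the commit locally; the same script could run as a CI check over the merge request's commits.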
And you would get burned. Today's AI does one thing really, really well: create output that looks correct to humans.
You are correct that mandatory review is our best hope.
Unfortunately, the studies are showing we're fucked anyway.
Because whether the AI output is right or wrong, it is highly likely to at least look correct, since producing correct-looking output is exactly where what we call "AI" today shines.
Realistically, what happens is that the code review is done under time pressure and not very thoroughly.