this post was submitted on 05 May 2026
139 points (96.6% liked)
Technology
84434 readers
4166 users here now
you are viewing a single comment's thread
The only mistake, vibe coded or otherwise, was that it was included when AI assistance was explicitly disabled. It's otherwise entirely deliberate.
That said, I'm not sure the concept is a bad thing overall. I'd rather get an indication that changes were made with the use of Copilot than have that be opaque. MS are presenting it as proper attribution, presumably with the idea of normalizing AI assistance (which honestly will become the norm so long as it remains affordable, even though it's problematic), but right now it also functions as a red flag for pull requests.
Two mistakes:
- The `all` setting makes little sense unless someone wants to enforce a zero-AI policy. It shouldn't have been the default.
- In-line completions don't justify attribution, so the `chatAndAgents` setting makes more sense. (More arguments can be made about uncopyrightable LLM output and the fact that "creation height" can't be automatically determined.)

The one you cited is more of a safety measure against the intersection of these two issues: if the code worked correctly, it wouldn't add the Copilot line anyway.
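For illustration, a policy choice like the one argued for here could look something like this in a JSONC settings file. The key name used below is purely a placeholder invented for this sketch, not a documented setting; only the `all` and `chatAndAgents` values come from the discussion above:

```json
{
  // Hypothetical key name for illustration only.
  // "all" would attribute every Copilot-touched change, including
  // in-line completions; "chatAndAgents" would restrict attribution
  // to chat- and agent-driven edits, as argued above.
  "example.copilot.attribution": "chatAndAgents"
}
```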
Yes, fair - your second observation isn't mentioned directly in the comment I linked (just my point plus your first point), but it is admitted explicitly in this follow-up post.
I'm gonna try to say this gently:
Microsoft... is gone now.
They contracted terminal corporate dementia.
They're not going to be the same anymore.
... I used to work for them.
Something like this was inevitable, given their highly cliquey and authoritarian corporate culture.
They've imploded under the weight of around two decades of accumulating technical debt, around two decades of the guys and gals that huffed the most farts getting the most promoted.
They are now primarily a member of the military industrial complex.
I agree that they're floundering, and that they're desperately trying to dig themselves out of a hole (if you'll forgive the mixed metaphor), but I don't think it's useful to chalk up to AI mistakes what is actually demonstrably a human marketing decision.
Double post, but:
I just now realized I fat fingered my own semi-manual autocorrect, and did not originally use 'They contracted', which is what I meant.
I have accordingly here made log of my revision commit, which should now be reflected above.
Apologies for any confusion this may have created, derp.
I agree with you... it's the people and the way they've basically conditioned themselves to act, not the LLM.
Also, to further confuse the metaphor:
You can't dig your way out of a hole that you flooded, doesn't matter how hard you pedal your mind bicycle.