this post was submitted on 05 May 2026
139 points (96.6% liked)

Technology

top 14 comments
[–] floquant@lemmy.dbzer0.com 16 points 2 days ago (2 children)

Even if it was only live for a couple of days, I wonder how much it inflated the "commits made by Copilot" metric that they will no doubt brag about to their investors

[–] FooBarrington@lemmy.world 7 points 2 days ago

Yeah, I find it hard to believe this wasn't their goal, especially considering they found this bug in testing and still released the update: https://news.ycombinator.com/item?id=47994193

[–] Sylvartas@lemmy.dbzer0.com 3 points 2 days ago

They're probably pissed it was discovered so fast

[–] sp3ctr4l@lemmy.dbzer0.com 28 points 3 days ago (1 children)

Microsoft claims to have fixed...

Via their own vibe-coding.

That supposedly fixed the vibe-coding attribution problem.

Mhm, yep, seems hunky dory to me!

[–] egrets@lemmy.world 6 points 3 days ago (2 children)

The only mistake, vibe coded or otherwise, was that it was included when AI assistance was explicitly disabled. It's otherwise entirely deliberate.

That said, I'm not sure the concept is a bad thing overall. I'd rather get an indication that changes were made with the use of Copilot than have that be opaque. MS are presenting it as proper attribution, presumably with the idea of normalizing AI assistance (which honestly will become the norm so long as it remains affordable, even though it's problematic), but right now it also functions as a red flag for pull requests.
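And as a red flag it's at least easy to search for, since the attribution is just a standard Co-authored-by trailer in the commit message. A quick sketch (the exact trailer text here is an assumption; adjust it to whatever VS Code actually writes):

```shell
# List commits in the current repo whose messages carry a
# Copilot co-author trailer, showing short hash and subject.
git log --grep='Co-authored-by: Copilot' --format='%h %s'
```

The same `--grep` works with `git log origin/main..HEAD` to vet a branch before opening a pull request.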

[–] flying_sheep@lemmy.ml 2 points 2 days ago* (last edited 2 days ago) (1 children)

Two mistakes:

  1. The all setting makes little sense unless someone wants to enforce a zero-AI policy, and it shouldn't have been the default. In-line completions don't justify attribution, so the chatAndAgents setting makes more sense. (There are further arguments to be made about uncopyrightable LLM output and the fact that the threshold of originality can't be determined automatically.)
  2. The code is buggy and attributed the contributions of other LLMs to Copilot

The one you cited is more of a safety measure against the intersection of these two issues: if the code worked correctly, it wouldn't add the Copilot line anyway.

[–] egrets@lemmy.world 1 points 2 days ago

Yes, fair - your second observation isn't mentioned directly in the comment I linked (just my point plus your first point), but it is admitted explicitly in this follow-up post.

[–] sp3ctr4l@lemmy.dbzer0.com 4 points 3 days ago* (last edited 2 days ago) (1 children)

I'm gonna try to say this gently:

Microsoft... is gone now.

They contracted terminal corporate dementia.

They're not going to be the same anymore.

... I used to work for them.

Something like this was inevitable, given their highly cliquey and authoritarian corporate culture.

They've imploded under the weight of around two decades of accumulating technical debt, around two decades of the guys and gals that huffed the most farts getting the most promoted.

They are now primarily a member of the military industrial complex.

[–] egrets@lemmy.world 2 points 3 days ago (2 children)

I agree that they're floundering, and that they're desperately trying to dig themselves out of a hole (if you'll forgive the mixed metaphor), but I don't think it's useful to chalk up to AI mistakes what is actually demonstrably a human marketing decision.

[–] sp3ctr4l@lemmy.dbzer0.com 1 points 2 days ago* (last edited 2 days ago)

Double post, but:

I just now realized I fat fingered my own semi-manual autocorrect, and did not originally use 'They contracted', which is what I meant.

I have accordingly here made log of my revision commit, which should now be reflected above.

Apologies for any confusion this may have created, derp.

[–] sp3ctr4l@lemmy.dbzer0.com 2 points 2 days ago

I agree with you... it's the people and the way they've basically conditioned themselves to act, not the LLM.

Also, to further confuse the metaphor:

You can't dig your way out of a hole that you flooded, doesn't matter how hard you pedal your mind bicycle.

[–] pHr34kY@lemmy.world 16 points 2 days ago* (last edited 2 days ago)

AI is "taking credit for the work of humans" all the way down.

[–] luciferofastora@feddit.org 6 points 3 days ago

One particularly nasty example is when a dev "deleted Copilot's generated English commit message and manually wrote [their] own commit message instead. However, after the commit was created, the final Git history still contained the Copilot co-author line."

While I can see an argument that using it in development should see your code marked as AI-assisted, it wouldn't even hold in this case: "Copilot only generated a commit message suggestion; it did not author the code", and even that suggestion was rejected in favour of manual work.

It sneakily messed with the commit, not just without explicit consent but despite the user's explicit dissent. That's not even an opt-in/opt-out discussion at that point, if you don't get an option.

Now I wonder whether that could happen even if you don't have Copilot at all (or rather no license, no matter how much the AI-postles at work have been trying to sell me one). Intuitively, it shouldn't, but who knows...

[–] vithigar@lemmy.ca 1 points 2 days ago

This happened to me on a Nextra docs site I use for one of my projects. Copilot added its attribution to a two-line commit that added two items to a page's _meta.json file. I was baffled as to what Copilot could even potentially have done to help.