this post was submitted on 13 Feb 2026
1010 points (95.9% liked)

Programmer Humor

29713 readers
2007 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

founded 2 years ago
[–] SCmSTR@lemmy.blahaj.zone 1 points 43 minutes ago

The greatest intellectual property theft and laundering scheme in human history

[–] kokesh@lemmy.world 12 points 7 hours ago (1 children)

As it should. All the idiots calling themselves programmers because they tell a crappy chatbot what to write, based on stolen knowledge. What warms my heart a little is the fact that I poisoned everything I ever wrote on StackOverflow just enough to screw with AI slopbots. I hope I contributed my grain of sand to making this shit a little worse.

[–] DeathsEmbrace@lemmy.world 2 points 7 hours ago (1 children)

Do it in a way that a human can understand but AI fails. I remember my early days, and you guys were my MVPs helping me figure shit out.

[–] ricecake@sh.itjust.works 16 points 8 hours ago

That's not what that research document says. Pretty early on it talks about rote mechanical processes with no human input. By the logic they employ there's no difference between LLM code and a photographer using Photoshop.

[–] iglou@programming.dev 23 points 9 hours ago* (last edited 9 hours ago) (1 children)

That sounds like complete bullshit to me. Even if the logic is sound, which I seriously doubt, if you use someone's code and you claim their license isn't valid because some part of the codebase is AI generated, I'm pretty sure you'll have to prove that. Good luck.

[–] CanadaPlus@lemmy.sdf.org 5 points 8 hours ago (1 children)

If there was an actual civil suit you'd probably be able to subpoena people for that information, and the standard is only more likely than not. I have no idea if the general idea is bullshit, though.

IANAL

[–] I_am_10_squirrels@beehaw.org 4 points 8 hours ago* (last edited 8 hours ago) (1 children)

You forgot the heart

I ♥️ ANAL

[–] CanadaPlus@lemmy.sdf.org 3 points 8 hours ago* (last edited 8 hours ago) (1 children)

Would that be North African Lawyer, or North American Lawyer?

In any case, we're splitting the cheque. /s

[–] I_am_10_squirrels@beehaw.org 1 points 7 minutes ago

Are you suggesting that lawyers migrate?

[–] Evil_Shrubbery@thelemmy.club 16 points 9 hours ago (1 children)

By that same logic LLMs themselves (by now some AI bro had to vibe code something there) & their trained datapoints (which were on stolen data anyway) should be public domain.

What revolutionary force can legislate and enforce this?? Pls!?

[–] CanadaPlus@lemmy.sdf.org 3 points 8 hours ago* (last edited 4 hours ago) (1 children)

By that same logic LLMs themselves (by now some AI bro had to vibe code something there)

I'm guessing LLMs are still really really bad at that kind of programming. The packaging of the LLM, sure.

& their trained datapoints

For legal purposes, it seems like the weights would be generated by the human-made training algorithm. I have no idea if that's copyrightable under US law. The standard approach seems to be to keep them a trade secret and pretend there's no espionage, though.

[–] Evil_Shrubbery@thelemmy.club 1 points 7 hours ago* (last edited 7 hours ago)

The packaging of the LLM, sure.

Yes, totally, but OP says a small bit affects "possibly the whole project", so I wanted to point out that this probably includes AIs, Windows, etc. too.

[–] fubarx@lemmy.world 50 points 11 hours ago (2 children)

This whole post has a strong 'Sovereign Citizen' vibe.

[–] GalacticSushi@lemmy.blahaj.zone 7 points 8 hours ago

I do not give Facebook or any entities associated with Facebook permission to use my pictures, information, messages, or posts, both past and future.

[–] brianary@lemmy.zip 13 points 10 hours ago (1 children)

The Windows FOSS part is a stretch, but unenforceable copyright seems quite possible, though probably not court-tested. I mean, AI basically ignored copyright to train in the first place, and there is precedent for animals not getting copyright for taking pictures.

[–] CanadaPlus@lemmy.sdf.org 8 points 8 hours ago* (last edited 8 hours ago) (1 children)

If it's not court tested, I'm guessing we can assume a legal theory that breaks all software licensing will not hold up.

Like, maybe the AI-made code snippets themselves can be copied freely, but not the other parts of the project.

[–] brianary@lemmy.zip 3 points 4 hours ago

That seems a more likely outcome.

[–] cmhe@lemmy.world 6 points 8 hours ago* (last edited 8 hours ago) (1 children)

I had a similar thought. If LLMs and image models do not violate copyright, they could be used to copyright-wash everything.

Just train a model on source code of the company you work for or the copyright protected material you have access to, release that model publicly and then let a friend use it to reproduce the secret, copyright protected work.

[–] pkjqpg1h@lemmy.zip 4 points 8 hours ago (1 children)

Btw, this is actually happening: AI trained on copyrighted material is repeating similar, or sometimes verbatim, copies of it, but license-free :D

[–] definitemaybe@lemmy.ca 1 points 15 minutes ago

This is giving me illegal number vibes. Like, if an arbitrary calculation returns an illegal number that you store, are you holding illegal information?

(The parallel to this case is that if a statistical word prediction machine generates copyrighted text, does that make distribution of that text copyright violation?)

I don't know the answer to either question, btw, but I thought it was interesting.

[–] meekah@discuss.tchncs.de 33 points 12 hours ago (1 children)

Aren't you all forgetting the core meaning of open source? The source code is not openly accessible, thus it can't be FOSS or even OSS

This just means microslop can't enforce their licenses, making it legal to pirate that shit

[–] the_artic_one@programming.dev 3 points 7 hours ago (1 children)

It's just the code that's not under copyright, so if someone leaked it you could legally copy and distribute any parts which are AI generated but it wouldn't invalidate copyright on the official binaries.

If all the code were AI generated (or enough of it to be able to fill in the blanks), you might be able to make a case that it's legal to build and distribute binaries, but why would you bother distributing that slop?

[–] m0stlyharmless@lemmy.zip 2 points 7 hours ago

Even if it were leaked, it would still likely be very difficult to prove that any one component was machine generated from a system trained on publicly accessible code.

[–] leftzero@lemmy.dbzer0.com 11 points 10 hours ago (1 children)

Is Windows FOSS now?

Ew, no, thank you, I don't want it.

[–] Kazumara@discuss.tchncs.de 15 points 12 hours ago (1 children)

How the hell did he arrive at the conclusion that there's some sort of one-drop rule for non-protected works?

Just because registration is blocked if you don't specify which part is the result of human creativity doesn't mean the copyright on the part that is the result of human creativity is forfeit.

Copyright exists even before registration, registration just makes it easier to enforce. And nobody says you can't just properly refile for registration of the part that is the result of human creativity.

[–] JackbyDev@programming.dev 7 points 11 hours ago* (last edited 10 hours ago) (1 children)

Yeah, a lot of copyright law in the US is extremely forgiving towards creators making mistakes. For example, you can only file for damages after you register the copyright, but you can register after the damages occurred. So if I wrote a book and someone stole it and started selling copies, I could register the copyright afterwards. Which honestly is for the best. Everything you make inherently has copyright. This comment, once I click send, will be copyrighted. It would just senselessly create extra work for the government and small creators if everything needed to be registered to get the protections.

Edit: As an example of this, this is why many websites in their terms of use have something like "you give us the right to display your work" because, in some sense, they don't have the right to do that unless you give them the right. Because you have a copyright on it. Displaying work over the web is a form of distribution.

[–] definitemaybe@lemmy.ca 1 points 5 minutes ago

That edit has confused so many users over the years. They think they're signing away rights to their copyrighted work by agreeing to the platform's EULA, but the terms granting the platform a license to freely store and distribute that work? That's literally what you want the service to do, because you're posting it with the intention of the platform showing it to others!

Granted, companies are using user data for other purposes too, so that's a problem, but I've seen so, so many posts over the last couple of decades of people complaining about EULAs that describe core site functions...

[–] ToTheGraveMyLove@sh.itjust.works 12 points 11 hours ago

Public domain ≠ FOSS

[–] RagingRobot@lemmy.world 62 points 16 hours ago (2 children)

That's not even remotely true....

[–] chaogomu@lemmy.world 48 points 14 hours ago (5 children)

The law is very clear that non-human generated content cannot hold copyright.

That monkey that took a picture of itself is a famous example.

But yes, the OP is missing some context. If a human was involved, say in editing the code, then that edited code can be subject to copyright. The unedited code likely cannot.

Human written code cannot be stripped of copyright protection regardless of how much AI garbage you shove in.

Still, all of this is meaningless until a few court cases happen.

[–] phoenixz@lemmy.ca 17 points 13 hours ago

So by that reasoning all Microsoft software is open source

Not that we'd want it, it's horrendously bad, but still

[–] Michal@programming.dev 32 points 14 hours ago (17 children)

Counterpoint: how do you even prove that any part of the code was AI-generated?

Also, I made a script years ago that algorithmically generates Python code from user input. Is that now considered AI-generated too?
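Not my actual script, but a minimal sketch of the kind of purely mechanical generator I mean (the function name and shape are made up for illustration): no model involved, just string assembly from user input.

```python
# Hypothetical sketch: a rote, non-ML code generator.
# It emits Python source for a dataclass from a {field: type} mapping.

def generate_dataclass(name: str, fields: dict[str, str]) -> str:
    """Return Python source defining a dataclass with the given fields."""
    lines = [
        "from dataclasses import dataclass",
        "",
        "@dataclass",
        f"class {name}:",
    ]
    lines += [f"    {field}: {type_}" for field, type_ in fields.items()]
    return "\n".join(lines)

src = generate_dataclass("Point", {"x": "float", "y": "float"})
print(src)
```

If output like this counts as "machine-generated", the copyright question would seem to apply to it just as much as to LLM output.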

[–] herseycokguzelolacak@lemmy.ml 6 points 10 hours ago (2 children)

How do you prove some codebase was AI generated?

This might be true, but it is practically unenforceable.

[–] Wfh@lemmy.zip 6 points 10 hours ago (1 children)

Agentic IDEs like Cursor track usage and how much of the code is LLM vs human generated.

Which probably means it tracks every single keystroke inside it. Which rightfully looks like a privacy and/or corporate code ownership nightmare.

But hey, at least our corporate overlords are happy to see the trend go up. The fact that we tech people were all very unsubtly threatened into forced agentic-IDE usage, despite vocal concerns about code quality drops, productivity losses, and increasing our dependence on US tech (especially openly nazi tech), says it all.

[–] herseycokguzelolacak@lemmy.ml 1 points 6 hours ago (1 children)

Agentic IDEs like Cursor track usage and how much of the code is LLM vs human generated.

For your code, sure. How do you know someone else's code is LLM generated?

[–] Wfh@lemmy.zip 1 points 6 hours ago

Because it's a surveillance state baby. Everything is uploaded to a central server so our corporate overlords can monitor our usage.

[–] sudoer777@lemmy.ml 4 points 9 hours ago* (last edited 9 hours ago) (1 children)

https://reuse.software/faq/#uncopyrightable

The REUSE specification recommends claiming copyright even if it's machine generated. Is this incorrect information?

EDIT: Also, how is copyrighting code from an AI different than copyrighting an output from a compiler?

[–] Buddahriffic@lemmy.world 4 points 9 hours ago

I believe it was a product of the earlier conflict between copyright owners and AIs on the training side. The compromise was that they could train on copyright data but lose any copyright protections on the output of the AI.

[–] WiseFirefighter7299@sh.itjust.works 15 points 13 hours ago (4 children)

windows would be OSS, not FOSS.

[–] HappyFrog@lemmy.blahaj.zone 279 points 21 hours ago (58 children)

As much as I wish this was true, I don't really think it is.
