this post was submitted on 13 Mar 2026
142 points (93.3% liked)

Selfhosted


Just a PSA.

See this thread

Sorry to link to Reddit, but not only is the dev sloppily using Claude to push something like 20k-line PRs, they are also completely crashing out: banning people from the Discord (actually, I think they've wiped everything from the Discord now) and accusing people who fork their code of theft.

It’s a bummer because the app was pretty good… thankfully Calibre-web and Kavita still exist.

[–] shads@lemy.lol 13 points 22 hours ago (3 children)

And every time the use of LLMs for open source development comes up, we get the same tired spiel from people about how it's just a tool, along with implications that anyone who doesn't embrace it with joy in their heart is just a Luddite.

It seems to me that it's less a tool and more like intentionally infecting your project with cancer. Sure, it shows all the signs of rapid growth, but metastasis isn't sustainable or desirable. Plus, I have yet to encounter a strong advocate for LLMs who isn't a cunt.

[–] PeriodicallyPedantic@lemmy.ca 2 points 8 hours ago (1 children)

I think it kinda depends on the context. If someone is just making a tool for themselves and slaps on MIT or GPLv3 just because, hey, someone else might as well have it, then sure. Who cares if it's trash when the stakes are scraping the ground and the user base is expected to stay in the single digits.

But when you care about the reputation of your project, or if your project requires people trust it, then yeah for sure it's not appropriate to vibe/slop it.

I have ethical concerns about the realities of how this tech is used, mainly in what it's doing to the economic and power dynamics in society. But I don't have a problem with the tech itself. That said, I have to admit it may not be realistic to separate the tech from its inevitable impact. "Now I am become Death, the destroyer of worlds," and all that.

[–] shads@lemy.lol 2 points 6 hours ago

How do people gain the ability to make these major projects if not by cutting their teeth on the small ones, though? We cut the apprentice and journeyman stages of mastering a craft, replace them with slop, and then ten years from now we'll wonder why kids these days are so incapable of actually creating anything.

I have talked to kids who told me the assignments they got at school were so trivial they just ran them through ChatGPT rather than waste their time. When I pointed out that the reason the assignments were "trivial" was to give them the skills and confidence to do the big projects when the time came, I got, at best, blank looks.

I said it somewhere else: if you are using an LLM to generate unit tests, I find it hard to be terribly mad about that. If it's scaffolding documentation, meh, whatever. If it's generating the main body of your project, I have concerns. Plus I keep circling back to this: how can you open-source code that may have been lifted from a copyrighted work?

[–] chicken@lemmy.dbzer0.com 3 points 12 hours ago (1 children)

I'll argue that it is a tool, and object to automatic zealous hostility towards anyone using it, but that doesn't mean criticisms of how that tool is being used aren't valid. It seems like that is what people are focusing on here, and they definitely aren't Luddites for doing so.

[–] shads@lemy.lol 3 points 9 hours ago (2 children)

I think I can offer a good analogy: firearms. They have utility, but there are people who make them a lifestyle choice, people who make them their whole personality, and a lot of people just desperate for an excuse to use one. I grew up with a couple of farmers in the extended family, so I would never argue guns should be banned entirely, but I am so glad I live somewhere with sane laws around gun ownership. It would be nice if we had similar consideration around regulating LLMs.

The danger to open source as I see it is that LLMs degrade the quality and ability of developers while increasing their throughput. I have never once heard someone complain that open source lacks quantity, but I hear a lot of people complaining about its quality.

[–] PeriodicallyPedantic@lemmy.ca 2 points 8 hours ago (1 children)

I think that the problem, in both cases, is culture.

It's not that either of those is inherently bad, or bad for people in general; they're bad for people in this culture, in this society. It's how the tool and the culture intersect that is the problem.

It could be a tool that lifts up the worker or creative, but instead it's a tool to devalue the creative and extract power and wealth.
It highlights that people with power get a different set of rules and laws than the rest of us, and they're using that to further entrench and enrich themselves.

[–] shads@lemy.lol 2 points 6 hours ago

And it's so noisy. We are already losing bug bounties; it's swamping open source projects on GitHub with poor-quality or even counterproductive "work" submitted for recognition; it's drowning out the work of creatives; it's invading so many aspects of life (education, communication, research, public policy); and it's fundamentally a bad tool for so many of those areas.

I recently applied for a job and got some advice from a friend who works in HR in a different industry. His advice: see if you can find out which LLM they use and run your application through it. A lot of positions get huge numbers of applicants, so HR departments use LLMs to generate the interview shortlist. You could have the absolute perfect application, but because the LLM doesn't like the way you wrote it, you're thrown out of the pool without a human being ever seeing you. It's so insidious: by being "helpful", it reinforces its own necessity.

[–] chicken@lemmy.dbzer0.com 1 points 8 hours ago (1 children)

I will complain about quantity. In many areas where open source projects compete with closed-source commercial products, they haven't achieved feature parity or a comparable level of polish; quantity matters. So do, as someone else touched on, quality-of-life improvements to the process of writing code, like ease of acquiring and synthesizing information. That doesn't mean it's necessarily a worthwhile tradeoff, but how much is really being sacrificed depends on what exactly is being done with the LLM. To me, one part of what's described here that clearly goes too far is using it to automate communication with other people contributing to the project; there's no way that is worth it.

As for the gun thing, I will support entirely banning LLM powered weapons intended to kill people, that's an easy choice.

[–] shads@lemy.lol 2 points 7 hours ago (1 children)

I still don't think quantity is what's lacking, and when the quality is there, it's amazing how often open source becomes a de facto standard. How many video tools are just a shim over FFmpeg, for example?

Yet again, the problem I see is that LLMs are a seductive form of software cancer: it starts as a little help, and before you know it we have Booklore-like projects. If open source can't be better than this, it will be subsumed in slop.

Not disagreeing about LLMs as a weapon. In a functional society, the person who pulls the trigger on any weapon is responsible for the consequences of that action. I wonder how eager the CEOs of these "AI" companies would be to weaponise their creations if they were held personally accountable for every injury caused by their product. By a jury. Preferably with explicit laws stating they could not be indemnified or granted immunity.

[–] chicken@lemmy.dbzer0.com 1 points 6 hours ago* (last edited 6 hours ago) (1 children)

One example of a place where quantity is lacking is web browsers. Another might be mobile operating systems. I am glad projects like Firefox and GrapheneOS exist, but it's obvious that the volume of work needed to achieve broad compatibility and competitiveness for these types of software is a limiting factor.

As for the idea that any LLM use is a slippery slope: the way to avoid the slippery slope fallacy is to have compelling evidence, or at least a rationale, that any use really does lead naturally to problematic use. Without that, the argument could apply to basically any programming tool that has come to be associated with things done badly (e.g. Java). I don't think it's usually the case that a popular tool has genuinely no good or safe ways to be used, and I don't think that's true of AI.

[–] shads@lemy.lol 1 points 5 hours ago* (last edited 5 hours ago)

How many browsers would you like me to list? Yes, a lot of them are spins on the big incumbents, but there is much wider variety than you might credit. Rendering engines, on the other hand: yeah, there's not much variety there.

Mobile operating systems are something of a special case, I'm afraid: the telcos and incumbents have far too heavy a thumb on the scale, and if any newcomer looks like breaking the duopoly, it will be treated as an existential threat. It will be associated with paedophilic terrorists faster than you can blink.

Both, incidentally, are categories where I will never be happy with slopcode. But hey, if anyone wants to use a slop-coded browser, I just strongly suggest you never enter any passwords or personal information while using it.

We are actively building a history of cases where LLM usage correlates heavily with that slope you mentioned. But hey, that's OK; we aren't allowed to call things out before they happen, and judgement may only be passed once the damage is done, right?

Out of curiosity: we know that LLM usage can worsen cognitive deficits and has, in some cases, been linked to psychosis. How many fatalities would you say is an acceptable number before governments act? How degraded do we let our societies get before we rein it in?

At some point the bubble is going to burst, and we will see a number of countries bankrupted in the name of "AI". I'm really curious to see whether we learn our lessons at that point. Should be interesting.

[–] Vendetta9076@sh.itjust.works 3 points 15 hours ago* (last edited 15 hours ago) (1 children)

I find an LLM is a great way to shortcut the googling it'd take for me to parse random error message #506 when I'm learning a new language, but that's about it. I'm also in no way writing software meant for mass consumption.

[–] shads@lemy.lol 3 points 13 hours ago (1 children)

Ergo, it's a tool: a search engine replacement that we wouldn't need if search hadn't gone to shit through neglect and active internal sabotage.