I'm using heylogin. It's a German company with members or former members of the CCC (Chaos Computer Club) with servers in Germany.
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.
I fucking hate vibe coding and all that, but their usage of AI seems more like autocompletion and tooling around the code. So nothing really frightening from my point of view.
Even "generating boilerplate" isn't a good use case for AI. My coworker gave a presentation on how he used AI to "generate boilerplate" for a Go project and like 90% of the mountain of slop he generated was just not necessary. There's a snuck premise here that you need to generate a mountains of boilerplate, but that's not always the case. AI is cementing bad practices at my company.
Most of the people I see complaining about this kind of AI usage seem to have more of a coworker ability / practice issue than an actual problem with AI. Nothing requires you to accept AI slop (even for boilerplate), and it doesn't spare you from thorough reviews and fine-tuning of practices / code style. To me it's more like a bad intern that works really fast and doesn't learn much. So with good and precise directions you can achieve something; otherwise you can do it yourself faster. It can of course become an issue if your code review load increases too much due to people pushing AI-generated PRs.
Really?
This is literally where LLMs have probably the most advantageous use, with practically no downsides. Their devs aren't idiots who are suddenly vibe coding. Used like this, an LLM can be an invaluable tool.
Linux merged code that had some form of LLM input years ago.
It's not about whether or not you're using an LLM as part of your work process, it's more about whether or not you're submitting shitty code.
Even if you want an alternative for this reason, I can bet you that several PRs in Vaultwarden were probably looked over by someone's Claude chat while they were writing and testing them, or that someone straight up took generated code and edited it to their needs.
Hell I'd even bet Lemmy has PRs that have been touched by LLMs.
But muh purity!
I wish I could upvote this twice.
It's weird seeing so many people in a place literally called "Fuck AI" defending AI.
That's alarming.
Seriously, can it stop? I just switched to BW.
Faaaaackkkkkkk
SyncThing + KeePass, I've been using this setup for a long time. Requires setup and isn't automagically done for you, but you control everything about it + it's decentralized and local. I unfortunately don't have any good guides off-hand, but I can try to give some pointers if you're interested in knowing more about it.
On Linux, the only downside is you can't use the auto-type feature in Wayland, but there are browser plugins to make it less of an issue.
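To give a rough idea of the setup (a sketch, not a guide: the folder and file names are just placeholders, and I'm assuming KeePassXC plus Syncthing's default web UI port):

```sh
# Keep the KeePass database inside a folder that Syncthing shares between devices.
mkdir -p ~/Sync/passwords
mv ~/Documents/Passwords.kdbx ~/Sync/passwords/

# Add ~/Sync/passwords as a shared folder in the Syncthing web UI
# (http://localhost:8384 by default) and accept the share on your other devices.

# Then just open the synced database with KeePassXC on each machine.
keepassxc ~/Sync/passwords/Passwords.kdbx
```

Nothing fancy going on: Syncthing only moves the file around, and KeePassXC treats it like any other local database.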
Alternatively, if you are a self-hoster, you can still use the BitWarden local clients with an open source backend server that you control: https://github.com/dani-garcia/vaultwarden
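In case it saves someone a search, here's a minimal sketch of running it with Docker (not the project's official instructions; the domain and port are placeholders, and you'd want a reverse proxy in front for TLS):

```sh
# Vault data persists in ./vw-data; the container serves plain HTTP on port 80 internally.
docker run -d --name vaultwarden \
  --restart unless-stopped \
  -e DOMAIN="https://vault.example.com" \
  -v "$(pwd)/vw-data:/data" \
  -p 127.0.0.1:8080:80 \
  vaultwarden/server:latest
```

The official Bitwarden apps and browser extensions then work as usual, you just point the server URL at your own domain.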
You're going to have to stop using all software in the next five years or so if you want to keep up the LLM boycott.
Hopefully people who care will start (and for those that already are, continue) to contribute to open software projects that don't include this shit.
You say "this shit" while not understanding what they are doing. As per their listed guidelines, they use it for documentation, generating boilerplate and other kinds of repetitive tasks. Newsflash, that's how it is used in most companies I know, since chatGPT's inception. This is not vibe coding, this is using the tool as intended.
You do use autocomplete and autocorrect in your phone, right?
You say "this shit" while not understanding what they are doing.
Oh, I understand fine.
I'm a software developer, and the company I currently work for has mandated that we make AI coding tools (aka "this shit") part of our daily workflows. I've been using this shit every day for the last year and a half. I'm not a "vibe coder", either. I have 15 years of experience in this industry, and this shit has universally made my job worse. Even for simple or repetitive tasks it requires constant babysitting, and when it does actually produce functional code, it's always messy, verbose, and fails to match the style guidelines of our app, meaning I have to waste even more time cleaning THIS SHIT up (or prompting it through that cleanup process — which wastes my time AND my patience).
And most people I work with are a lot lazier with it than I am, which means now I have to spend twice as much time on code reviews to make sure that no one is pushing MORE OF THIS BROKEN FUCKING SHIT into our codebase. There have already been several major production outages at the company because of AI generated code committed by other teams, and in general the quality of our apps has fallen a lot.
Maybe AI tools are fine in isolation, I don't know. I've never asked one to build shitty node.js app #1743168... But if you set THIS SHIT loose on a mature codebase, that codebase immediately gets worse. It introduces bugs, makes the code harder to read, makes the code harder to maintain, and worst of all, it decreases the code literacy of all the developers using it.
When you write your own code, there's a self-reinforcement mechanism at play, the same as how taking notes in class helps you retain the information better than just passively listening. You don't get that when you just auto-generate and then passively review code, so we're starting to see a real "brain drain" where AI tools are harming developers' understanding of the apps they work on. This isn't hypothetical, I've seen this first hand. A year ago I could ask fellow developers to explain to me in detail the code they wrote three or four weeks ago and they could do it just fine. Now, devs can barely explain code from last week — which, as I'm sure you can imagine, greatly slows down the inevitable debugging that follows when the code they don't fully understand inevitably breaks.
So yeah, I understand perfectly well what's going on with this stupid, wasteful, tech-debt producing SHIT, and even though I can't avoid it in the software I write, I'm sure as shit going to avoid it in the software I use.
Shit.
Yep, I've noticed the people who have to use AI the most are usually the most noob people on the team. Reviewing the mountain of slop code they post is aggravating. Honestly, I don't even review it anymore. Fuck it. If you're not gonna take the time to write something good, I'm not gonna take the time to give you an honest review.
They'll reap what they sow. More bugs, more shit.
Ugh I really didn’t want to migrate password managers again 🤦‍♀️
Goddam it wtf
Would recommend VaultWarden
Why would you like undocumented LLM usage better than documented LLM usage? I also recommend vaultwarden, but not for this reason
Can you prove that Vaultwarden devs use LLM generated code?
Well fuck me I guess. Can't even use vaultwarden now
Fucking tools. Was gonna use Bitwarden but fuck that noise
Came across this the other day and considered setting it up to replace Vaultwarden. Definitely need to sit down and do that now, a vibe coded password manager sounds like an absolute fucking nightmare.
Read the guidelines posted there: using it as an autocomplete and a helper for docs is NOT vibe coding.
Thanks for the suggestion. That looks like a pretty cool option.
ew
I was looking into Vaultwarden, an open source Bitwarden-compatible system
keepassxc
Oh FFS... I was just in the process of migrating all of my family members from Dashlane to BW...
Vaultwarden + keyguard
Gross. Looks like I'm canceling my subscription.
Looks like I'll be setting up Vaultwarden, so long as they aren't doing the same.
Bloody hell