this post was submitted on 05 Mar 2026
29 points (75.4% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago

The guy replying to me is (as far as I can tell) the sole owner and moderator of .wtf, which is the instance I've been using up until this point. I kinda already knew they allowed AI slop, as there's nothing in the rules that says otherwise, but this interaction really sealed my decision. "Hey, person who makes music. If you don't like another musician using the fascist plagiarism machine, how about you offer to create art for them? After all, if people simply donated their time and effort, maybe they wouldn't have to resort to pissing in the face of their fellow artists of a different medium. Think about it."
Also, I think you can donate to the instance in crypto?

Fuck right off with that.

On another note: PeerTube itself uses Whisper for automatic subtitle generation. It's something I don't LIKE, but I approached the devs about it and they responded very thoughtfully. I'll admit I don't know all the differences between locally run, open source models that are used for accessibility and the horrible plagiarism machines we all despise the most. I suspect they're still built on exploitative tech / trained on stolen data and whatnot, and Whisper being a product of OpenAI doesn't inspire confidence, but Framasoft only uses it to detect speech, not create it. That's hardly "generative" at all, is it? It's just creating subtitles. Now, that doesn't mean the program itself is ethical given how it was likely created (as the devs acknowledge), and we SHOULD push for ethical, FLOSS methods of doing these sorts of things. I'm sure it can be done; it wasn't exploitative before the AI boom, right? This is where my knowledge ends and I ask for feedback. Any thoughts?
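For what it's worth, the "detect, not create" distinction is visible in the plumbing: a Whisper-style transcriber just emits timed text segments (start/end seconds plus the recognized text), and everything after that is plain formatting. A minimal sketch of that non-AI half of the pipeline, turning such segments into an .srt subtitle file (the segment data here is invented for illustration, and the exact field names vary by tool):

```python
def fmt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def segments_to_srt(segments) -> str:
    """Render a list of {'start', 'end', 'text'} dicts as SRT blocks."""
    blocks = []
    for i, seg in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n"
            f"{fmt_timestamp(seg['start'])} --> {fmt_timestamp(seg['end'])}\n"
            f"{seg['text'].strip()}\n"
        )
    return "\n".join(blocks)

# Made-up segments standing in for a transcriber's output:
segments = [
    {"start": 0.0, "end": 2.4, "text": "Hello and welcome."},
    {"start": 2.4, "end": 5.1, "text": "Today we talk about subtitles."},
]
print(segments_to_srt(segments))
```

Point being: the only machine-learning step is the audio-to-segments transcription; the subtitle file itself is deterministic bookkeeping.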

top 7 comments
[–] mrmaplebar@fedia.io 10 points 58 minutes ago

"you are more than welcome to offer your cover art services for free."

The gross entitlement to other people's labor just seems to come so easily to some people... It shows just how much of generative AI is based on little more than the desire to exploit people's work.

[–] bridgeenjoyer@sh.itjust.works 1 points 1 minute ago

If you're already making music how hard is it to make a fucking album cover??? Kids these days are lazy as fuck. I was making shitty album covers in GIMP when I was 6, and I'm not intelligent.

I blame parents.

[–] thisbenzingring@lemmy.today 8 points 1 hour ago (1 children)

one of my coworkers uses AI to generate his music from the lyrics he writes

It's shit, I fucking hate it. It's like Auto-Tune but so much worse. I told him I'd give it a listen if he actually performed something and auto-tuned it... hasn't happened

[–] cloudskater@piefed.blahaj.zone 2 points 49 minutes ago (1 children)

My friend did the same thing and didn't get why I wouldn't tolerate it, even tho he wrote the lyrics. Also, this is coming from someone who LOVES pitch correction as an effect. I adore synth pop and digitally perfected vocals as a choice.

[–] thisbenzingring@lemmy.today 3 points 31 minutes ago

for sure! it can be garbage when overdone, but auto-tune isn't without its uses.

[–] Zagorath@aussie.zone 2 points 36 minutes ago (1 children)

I'll admit I don't know all the differences between locally run, open source models that are used for accessibility and the horrible plagiarism machines we all despise the most

Fwiw 99.9% of the time when someone talks about an "open source" generative AI model, what they really mean is "open weight".

An open-source model has public training code and training dataset, allowing full reproduction

a random Reddit post I found when looking for a good definition to share

Some people (including the author of that definition) don't like the need for open source models to have an open source dataset. It's also not clear to me whether that definition is even supposed to mean the dataset is actually public domain, or just clearly defined (e.g. "we trained on all top 100 best-selling books from the period 2000–2020"). The former would obviously be very meaningfully different from closed models in terms of accusations of ethical problems in the training process.

Open-weight models basically just mean you can download it and make some slight tweaks and run it at home. It means the big AI companies aren't benefiting financially from your use and can't train on what you feed them for their next model, and because these are typically designed to be run locally rather than in a data centre the environmental impacts are lessened. But in terms of the training process it's no better than closed models.

[–] cloudskater@piefed.blahaj.zone 1 points 22 minutes ago

So this is what Whisper and whatnot are like? Framasoft responded tastefully to me and I understand their perspective, but now I'm realizing that Kdenlive has some similar, and potentially generative, features, and I'm trying to get testimony from the devs of that project, because istfg if they're using fucking generative AI just as I was getting used to and learning my way around their video editor, I'm going to lose it.