this post was submitted on 21 Jan 2024
Technology
top 22 comments
[–] ScaredDuck@sopuli.xyz 1 points 2 years ago

Won't this thing actually help the AI models in the long run? The biggest issue I've heard is the possibility of AI generated images getting into the training dataset, but "poisoned" artworks are basically guaranteed to be of human origin.

[–] HexesofVexes@lemmy.world 1 points 2 years ago

Ah, another arms race has begun. Just be wary, what one person creates another will circumvent.

[–] kromem@lemmy.world 1 points 2 years ago (2 children)

This doesn't work outside of laboratory conditions.

It's the equivalent of "doctors find cure for cancer (in mice)."

[–] bier@feddit.nl 1 points 2 years ago (1 children)

I like that example. Every time you hear about some discovery that X kills 100% of cancer cells in a petri dish, you always have to think: so does bleach.

[–] Worx@lemmynsfw.com 1 points 2 years ago

Nice, maybe we should try injecting bleach. I heard it also cures Covid!

[–] Wiz@midwest.social 0 points 2 years ago (1 children)

It hasn't worked much outside of the laboratory because they only just released it from the laboratory. They've already shown in their paper that it works with about 90% effectiveness.

[–] Meowoem@sh.itjust.works 0 points 2 years ago (1 children)

It's clever, really. People who don't like AI are very likely to also not understand the technology. If you're going to grift, it's a perfect set of rubes: tell them your magic code will defeat the evil magic code of the AI, and that's all they need to know. Fudge some numbers and they'll throw their money at you.

[–] Misconduct@lemmy.world 1 points 2 years ago

What's not clever is making stuff up and not really making a point after typing a whole paragraph lmao

[–] vsis@feddit.cl 1 points 2 years ago (2 children)

It's not FOSS and I don't see a way to review if what they claim is actually true.

It may just be a way to help differentiate legitimate human-made work from machine-generated work, thus helping AI training models.

Can't demonstrate that fact either, because its license expressly forbids software adaptations for other uses:

Edit, alter, modify, adapt, translate or otherwise change the whole or any part of the Software nor permit the whole or any part of the Software to be combined with or become incorporated in any other software, nor decompile, disassemble or reverse engineer the Software or attempt to do any such things

sauce: https://nightshade.cs.uchicago.edu/downloads.html

[–] JATtho@lemmy.world 1 points 2 years ago

I read the article enough to find that the Nightshade tool is under an EULA... :(

Because it is definitely not FOSS, use it with caution, preferably on a system not connected to the internet.

[–] nybble41@programming.dev 1 points 2 years ago (1 children)

The EULA also prohibits using Nightshade "for any commercial purpose", so arguably if you make money from your art—in any way—you're not allowed to use Nightshade to "poison" it.

[–] Nommer@sh.itjust.works 1 points 2 years ago

This is the part most people will ignore, but I get that it's mainly meant for big actors.

[–] General_Effort@lemmy.world 1 points 2 years ago

Explanation of how this works.

These "AI models" (meaning the free and open Stable Diffusion in particular) consist of different parts. The important parts here are the VAE and the actual "image maker" (U-Net).

A VAE (Variational AutoEncoder) is a kind of AI that can be used to compress data. In image generators, a VAE is used to compress the images. The actual image AI only works on the smaller, compressed image (the latent representation), which means it takes a less powerful computer (and uses less energy). That's what makes it possible to run Stable Diffusion at home.
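A rough sketch of how much the VAE shrinks the data (the 512×512 image size, 8× downsampling, and 4 latent channels here are the standard Stable Diffusion 1.x figures, stated as an assumption, not taken from the comment):

```python
# Toy illustration of the latent compression in Stable Diffusion 1.x:
# the VAE downsamples each spatial dimension by 8 and keeps 4 channels.
def latent_shape(height, width, downsample=8, channels=4):
    """Shape of the latent representation for an RGB image."""
    return (channels, height // downsample, width // downsample)

h, w = 512, 512
pixels = 3 * h * w                  # RGB values in the original image
c, lh, lw = latent_shape(h, w)
latent_values = c * lh * lw         # values the U-Net actually works on

print(latent_shape(h, w))           # (4, 64, 64)
print(pixels // latent_values)      # 48x fewer values to process
```

So the U-Net never sees pixels at all, only this much smaller tensor, which is why corrupting the pixel-to-latent mapping matters.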

This attack targets the VAE. The image is altered so that the latent representation is that of a very different image, but still roughly the same to humans. Say, you take images of a cat and of a dog. You put both of them through the VAE to get the latent representation. Now you alter the image of the cat until its latent representation is similar to that of the dog. You alter it only in small ways and use methods to check that it still looks similar for humans. So, what the actual image maker AI "sees" is very different from the image the human sees.
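The cat/dog optimization described above can be sketched as projected gradient descent on the latent distance. This is a toy stand-in: a random linear map plays the role of the VAE encoder, and a simple per-pixel clamp stands in for the perceptual-similarity check; a real attack would backpropagate through the actual encoder network.

```python
import numpy as np

rng = np.random.default_rng(0)
E = rng.standard_normal((16, 64)) / 8.0   # stand-in "encoder": 64-dim image -> 16-dim latent

cat = rng.standard_normal(64)             # image to poison
dog = rng.standard_normal(64)             # image whose latent we want to mimic
target = E @ dog                          # latent representation of the dog

x = cat.copy()
eps = 0.3                                 # budget: stay visually close to the cat
for _ in range(500):
    grad = 2.0 * E.T @ (E @ x - target)   # gradient of ||E(x) - target||^2
    x -= 0.01 * grad
    x = cat + np.clip(x - cat, -eps, eps) # project back into the "looks like a cat" box

# The poisoned image barely changes in pixel space...
print(np.max(np.abs(x - cat)) <= eps)
# ...but its latent has moved toward the dog's latent.
print(np.linalg.norm(E @ x - target) < np.linalg.norm(E @ cat - target))
```

The point of the sketch is only the structure of the attack: minimize latent distance to the target while constraining pixel distance to the original.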

Obviously, this only works if you have access to the VAE used by the image generator. So, it only works against open source AI; basically only Stable Diffusion at this point. Companies that use a closed source VAE cannot be attacked in this way.


I guess it makes sense if your ideology is that information must be owned and everything should make money for someone. I guess some people see a cyberpunk dystopia as a desirable future. I wonder if it bothers them that all the tools they used are free (e.g. the method to check that images still look similar to humans).

It doesn’t seem to be a very effective attack but it may have some long-term PR effect. Training an AI costs a fair amount of money. People who give that away for free probably still have some ulterior motive, such as being liked. If instead you get the full hate of a few anarcho-capitalists that threaten digital vandalism, you may be deterred. Well, my two cents.

[–] Zealousideal_Fox900@lemmy.world 1 points 2 years ago (1 children)

As an artist, nightshade is not something I will ever use. All my art is public domain, including AI. Let people generate as many pigeon pictures as they want I say!

[–] nightwatch_admin@feddit.nl 0 points 2 years ago (1 children)

That’s great for you, truly it is, but for others it’s not.

[–] Zealousideal_Fox900@lemmy.world 1 points 2 years ago (2 children)

Mind explaining what artists it isn't good for? I genuinely don't see why it is so hard to let others remix and remake.

[–] Drewelite@lemmynsfw.com 1 points 2 years ago

Yeah, same. Empowering people to be more creative has never struck me as something that needs to be gatekept. Tools have constantly improved, allowing more people to become artists. If it's the copying of styles you're worried about, I'd take it up with every artist that's learned from Picasso or Da Vinci.

[–] Mustard@lemmy.blahaj.zone 0 points 2 years ago (1 children)

Believe it or not I need to eat food. Crazy I know.

[–] Zealousideal_Fox900@lemmy.world 0 points 2 years ago (1 children)
[–] Mustard@lemmy.blahaj.zone 0 points 2 years ago (1 children)

Do you have a means of securely and reliably getting it? Cause I don't.

You really come across as coming from a place of privilege, whilst lamenting that the reason poor people are worried about this is that they're just not as nice as you.

[–] Zealousideal_Fox900@lemmy.world 1 points 2 years ago

Lmao I have never been rich, in my entire life. It isn't like my art is being directly copied.

[–] bonus_crab@lemmy.world 1 points 2 years ago

Big companies already have all your uncorrupted artwork; all this does is eliminate any new competition from cropping up.