Lee Duna@lemmy.nz to Technology@lemmy.world · English · 1 year ago

Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use

venturebeat.com · 278 comments

  • cross-posted to: [email protected]
The tool’s creators want to make it so that AI model developers must pay artists for uncorrupted data to train on.
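Per public reporting on the tool, Nightshade "poisons" training data by adding small, hard-to-notice perturbations to an image so that models trained on scraped copies learn skewed associations for the concepts it depicts. The snippet below is only a minimal sketch of that general bounded-perturbation idea, not Nightshade's actual algorithm (which optimizes the perturbation against a model's feature space); the filenames and the EPSILON value are illustrative placeholders.

```python
# Conceptual sketch only -- NOT Nightshade's actual algorithm.
# Illustrates the general idea behind image "poisoning": publish a copy that
# looks unchanged to a human viewer but carries a small, bounded perturbation
# meant to skew what a scraper-trained model learns.
import numpy as np
from PIL import Image

EPSILON = 8  # max per-pixel change on a 0-255 scale; small enough to be hard to notice


def poison_image(path_in: str, path_out: str, seed: int = 0) -> None:
    """Add a bounded pseudo-random perturbation to an image.

    Real poisoning tools compute the perturbation by optimizing against a
    target model's feature space; the random noise here just stands in for
    that optimized signal.
    """
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    delta = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)


if __name__ == "__main__":
    # "artwork.png" is a placeholder filename, not from the article.
    poison_image("artwork.png", "artwork_poisoned.png")
```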
  • Even_Adder@lemmy.dbzer0.com · 1 year ago (+27/-2)

    You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.

    A few quotes:

    First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much creative expression from the original.

    Second, even if a person makes a copy or a derivative work, the use is not infringing if it is a “fair use.” Whether a use is fair depends on a number of factors, including the purpose of the use, the nature of the original work, how much is used, and potential harm to the market for the original work.

    and

    Even if a court concludes that a model is a derivative work under copyright law, creating the model is likely a lawful fair use. Fair use protects reverse engineering, indexing for search engines, and other forms of analysis that create new knowledge about works or bodies of works. Here, the fact that the model is used to create new works weighs in favor of fair use as does the fact that the model consists of original analysis of the training images in comparison with one another.

    • gapbetweenus@feddit.de · 1 year ago (+11/-7)

      Yeah, that’s what I’m saying - our current copyright laws are insufficient to deal with AI art generation.

      • Even_Adder@lemmy.dbzer0.com · 1 year ago (+15/-4)

        They aren’t insufficient, they are working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn’t be trying to make that any worse.

        • Dkarma@lemmy.world · 1 year ago (+15/-4)

          Yep. Copyright should not include “viewing or analyzing the picture” rights. Artists want to start charging you, or software, to even look at art that they literally put out for free. If you don’t want your art seen by a person or an AI, then don’t publish it.

          • EldritchFeminity@lemmy.blahaj.zone · 1 year ago (+10/-7)

            Copyright should absolutely include analyzing when you’re talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non artists alike, but these kinds of people are ruining it for everybody.

            What artists are asking for is ethical sourcing for AI datasets. We’re talking paying a licensing fee or using free art that’s opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. Already the music industry has made it illegal to use songs in AI without the artist’s permission. You can’t just take songs and make your own synthesizer out of them, then sell it. If you want music for something you’re making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That’s what artists want.

            When an artist, who does art for a living, posts something online, it’s an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn’t want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what’s left? Just AI art made as a derivative of AI art that was made as a derivative of other art.

            • Even_Adder@lemmy.dbzer0.com · 1 year ago (+4/-3)

              You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.

              • EldritchFeminity@lemmy.blahaj.zone · 1 year ago (+5/-2)

                MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists’ work. The derivative works infringement is already happening right out in the open.

                • Even_Adder@lemmy.dbzer0.com · 1 year ago (+4/-1)

                  Something being derivative doesn’t mean it’s automatically illegal or improper.

                  First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much creative expression from the original.

                  Second, even if a person makes a copy or a derivative work, the use is not infringing if it is a “fair use.” Whether a use is fair depends on a number of factors, including the purpose of the use, the nature of the original work, how much is used, and potential harm to the market for the original work.

                  Even if a court concludes that a model is a derivative work under copyright law, creating the model is likely a lawful fair use. Fair use protects reverse engineering, indexing for search engines, and other forms of analysis that create new knowledge about works or bodies of works. Here, the fact that the model is used to create new works weighs in favor of fair use as does the fact that the model consists of original analysis of the training images in comparison with one another.

                  You are expressly allowed to mimic others’ works as long as you don’t substantially reproduce their work. That’s a big part of why art can exist in the first place. You should check out that article I linked.

                  • EldritchFeminity@lemmy.blahaj.zone · 1 year ago (+2)

                    I actually did read it; that’s why I specifically called out MidJourney here, as they’re one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright violations.

                    I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially creative commons licensing for works used in data sets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for use with a fee or make them openly available for use, however they see fit, while still ensuring that they have the ability to protect their job as an artist from stuff like what MidJourney is doing.

          • Even_Adder@lemmy.dbzer0.com · 1 year ago · edited (+4/-2)

            It’s sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.

            What they want would score a huge inadvertent home run for corporations and swing the doors open for them to hinder competition, stifle undesirable speech, and monopolize spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there’s nothing good that can be said about anyone trying to make them worse.

            Also, rest assured they’d collude with each other and only use their new powers to stamp out the little guy. It’ll be like American ISPs busting attempts at municipal internet all over again.
