Informed i’s Weekly Business Insights

Extractive summaries and key takeaways from articles carefully curated from TOP TEN BUSINESS MAGAZINES to promote informed business decision-making | Since 2017 | Week 376, November 22-28, 2024

Four ways to protect your art from AI 

By Melissa Heikkilä | MIT Technology Review | November 21, 2024

2 key takeaways from the article

  1. Since the start of the generative AI boom, artists have been worried about losing their livelihoods to AI tools.  Artists and writers have launched several lawsuits against AI companies, arguing that their work has been scraped into databases for training AI models without consent or compensation. Tech companies have responded that anything on the public internet falls under fair use. But it will be years until we have a legal resolution to the problem. 
  2. Unfortunately, there is little you can do if your work has been scraped into a data set and used in a model that is already out there. You can, however, take steps to prevent your work from being used in the future. Here are four ways to do that: mask your style; rethink where and how you share; opt out of scraping; and, if all else fails, add some poison.

Full Article

(Copyright lies with the publisher)

Topics:  Technology, Artificial Intelligence, Copyright, Scraping

Extractive Summary of the Article

Since the start of the generative AI boom, artists have been worried about losing their livelihoods to AI tools. There have been plenty of examples of companies replacing human labor with computer programs. Most recently, Coca-Cola sparked controversy by creating a new Christmas ad with generative AI. 

Artists and writers have launched several lawsuits against AI companies, arguing that their work has been scraped into databases for training AI models without consent or compensation. Tech companies have responded that anything on the public internet falls under fair use. But it will be years until we have a legal resolution to the problem. 

Unfortunately, there is little you can do if your work has been scraped into a data set and used in a model that is already out there. You can, however, take steps to prevent your work from being used in the future.  Here are four ways to do that. 

  1. Mask your style.  One of the most popular ways artists are fighting back against AI scraping is by applying “masks” to their images, which protect their personal style from being copied.  Tools such as Mist, Anti-DreamBooth, and Glaze add tiny changes to an image’s pixels that are invisible to the human eye, so that if and when images are scraped, machine-learning models cannot decipher them properly (a toy sketch of this idea appears after this list).  But defenses like these are never foolproof, and what works today might not work tomorrow.
  2. Rethink where and how you share.  Popular art profile sites such as DeviantArt and Flickr have become gold mines for AI companies searching for training data. And when you share images on platforms such as Instagram, its parent company, Meta, can use your data to build its models in perpetuity if you’ve shared it publicly.  One way to prevent scraping is by not sharing images online publicly, or by making your social media profiles private. But for many creatives that is simply not an option; sharing work online is a crucial way to attract clients.  It’s worth considering sharing your work on Cara, a new platform created in response to the backlash against AI. 
  3. Opt out of scraping.  Data protection laws might help you get tech companies to exclude your data from AI training. If you live somewhere that has these sorts of laws, such as the UK or the EU, you can ask tech companies to opt you out of having your data scraped. Meta, for example, publishes instructions for making this request. Unfortunately, opt-out requests from users in places without data protection laws are honored only at the discretion of tech companies.  The site Have I Been Trained, created by the artist-run company Spawning AI, lets you search to find out whether your images have ended up in popular open-source AI training data sets. For artists who host their own portfolio sites, a robots.txt sketch after this list shows one further opt-out lever.
  4. If all else fails, add some poison.  The University of Chicago researchers who created Glaze have also built Nightshade, a tool that lets you add an invisible layer of “poison” to your images. Like Glaze, it adds invisible changes to pixels, but rather than just making it hard for AI models to interpret images, it can break future iterations of these models and make them behave unpredictably.
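
As promised in the first item above, here is a toy illustration of the masking idea. To be clear, this is not the actual Glaze, Mist, or Anti-DreamBooth algorithm: those tools compute their perturbations by optimizing against a model's feature extractor. The sketch only shows the general principle the article describes, adding a small, tightly bounded change to every pixel so the image looks unchanged to a person. The file names and the epsilon budget are hypothetical.

```python
# Toy sketch of a bounded, near-invisible pixel perturbation.
# NOT the real Glaze/Mist algorithm, which optimizes the perturbation
# adversarially against a model; random noise here just demonstrates
# how small the per-pixel budget is.
import numpy as np
from PIL import Image

def mask_image(in_path: str, out_path: str, epsilon: int = 4) -> None:
    """Add a random perturbation of at most +/- epsilon (out of 255) per pixel."""
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    # A real tool would compute this noise adversarially; it is random here.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    masked = np.clip(img + noise, 0, 255).astype(np.uint8)
    # Save losslessly: JPEG compression could destroy the perturbation.
    Image.fromarray(masked).save(out_path, format="PNG")

mask_image("artwork.png", "artwork_masked.png")  # hypothetical file names
```

Random noise like this offers no real protection on its own; the point of Glaze-style tools is precisely that the perturbation is optimized against a model, which is also why, as the article notes, such defenses can stop working as models change.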
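
The opt-out step above covers platform-level requests. For artists who also host their own portfolio sites, one complementary measure (an assumption on my part, not something the article mentions) is a robots.txt file that disallows the AI crawlers that publicly document their user-agent tokens. Compliance with robots.txt is voluntary on the crawler's part, so treat this as a deterrent, not a guarantee.

```
# Hedged robots.txt sketch for a self-hosted portfolio site.
# Crawlers that ignore robots.txt will not be stopped by it.

# OpenAI's web crawler
User-agent: GPTBot
Disallow: /

# Google's token for opting content out of AI training
User-agent: Google-Extended
Disallow: /

# Common Crawl, whose dumps feed many training data sets
User-agent: CCBot
Disallow: /
```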
