This new data poisoning tool lets artists fight back against generative AI
October 23, 2023

MIT Technology Review: A new tool called Nightshade lets artists add invisible changes to the pixels in their art before they upload it online, so that if the work is scraped into an AI training set, the altered pixels can cause the resulting model to break in chaotic and unpredictable ways. Ben Zhao, a professor at the University of Chicago who led the team that created Nightshade, says the hope is that it will help tip the power balance back from AI companies toward artists by creating a powerful deterrent against disrespecting artists' copyright and intellectual property.
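Nightshade's actual attack optimizes its perturbation so that a poisoned image misleads a text-to-image model during training; the sketch below only illustrates the basic mechanism the article describes, a pixel-level change bounded tightly enough to stay invisible to humans. The function name, the epsilon bound, and the use of random noise in place of an optimized perturbation are all hypothetical.

```python
# Hypothetical sketch: a small, bounded pixel perturbation.
# This is NOT Nightshade's algorithm -- the real tool carefully
# optimizes the perturbation; random noise here merely stands in
# for that optimized change.
import numpy as np
from PIL import Image

def perturb_image(in_path: str, out_path: str, epsilon: int = 4) -> None:
    """Add a perturbation clipped to +/- epsilon per channel, small
    enough to be imperceptible to a human viewer."""
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    # An L-infinity bound of epsilon keeps every pixel within a few
    # intensity levels of the original image.
    delta = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    # Save losslessly so the perturbation survives the file format.
    Image.fromarray(poisoned).save(out_path, format="PNG")

# Example: perturb_image("artwork.png", "artwork_shaded.png")
```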
The team intends to integrate Nightshade into Glaze, its earlier tool that masks an artist's personal style from AI scrapers, and artists can choose whether or not to use the data-poisoning feature. The team is also making Nightshade open source.
Ben Zhao is a C3.ai DTI Principal Investigator on cybersecurity.
Read the Tech Review story here.
Read the paper, “Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models,” here.
Image by Stephanie Arnett/MITTR | Envato