A group of American researchers from the University of Chicago has developed a software tool called Nightshade, designed to prevent artists’ original works from being copied by artificial intelligence (AI). This was reported by TechCrunch.
Nightshade alters images using a technique called “shading.” It modifies the file data in such a way that the neural network trained on the “shaded” sample can no longer reproduce the source. Professor Ben Zhao, head of the Nightshade project, likened its development to adding hot sauce to your lunch to prevent it from being stolen from the office fridge.
Nightshade targets the relationship between an image and its text prompt: it subtly rearranges pixels so that an AI model interprets the image as something completely different from what a human viewer sees.
Models trained on enough altered data will misclassify the features of “shaded” files and begin to generate incorrect images. Fewer than 100 samples “poisoned” with Nightshade are enough to disrupt the output of Stable Diffusion (an AI image-generation model).
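The general idea can be sketched in a few lines. This is a toy illustration only, not Nightshade’s actual algorithm: where Nightshade optimizes a perturbation so models mislearn a concept, the sketch below simply applies random noise bounded so tightly that a human would not notice it, while the file’s data, and therefore what a model trains on, has changed. The function name `shade_image` and the `epsilon` bound are hypothetical, chosen for illustration.

```python
import numpy as np

def shade_image(image: np.ndarray, epsilon: int = 8, seed: int = 0) -> np.ndarray:
    """Toy stand-in for 'shading': perturb every pixel by at most
    `epsilon` brightness levels (out of 255), so the picture looks
    unchanged to a human but differs at the data level."""
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    shaded = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
    return shaded

# Stand-in for an artwork: a flat 64x64 RGB image.
original = np.full((64, 64, 3), 128, dtype=np.uint8)
shaded = shade_image(original)

# The perturbation is imperceptibly small and the file shape is unchanged,
# yet the training data no longer matches the original pixels exactly.
max_change = int(np.max(np.abs(shaded.astype(int) - original.astype(int))))
```

In a poisoning attack like Nightshade’s, such a perturbed image would be paired with its honest caption (say, “cow”), while the perturbation is crafted so the model’s learned features drift toward a different concept.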
“By manipulating and effectively distorting this relationship, you can make models think that cows have four round wheels, a bumper, and a trunk. And when they are asked to draw a cow, they will produce a large Ford truck instead of a cow,” Professor Zhao explained.
The tool’s creators say they are not trying to destroy businesses built on generative artificial intelligence; they simply want to stop model owners from using artists’ works for free to train neural networks.
It was previously reported that Google is developing an AI video creator.