Artists can now fight theft with Nightshade, a tool that corrupts image-generating AI models.


Computer scientists at the University of Chicago have developed a new tool called Nightshade that "poisons" digital artwork, making it harmful as training data for image-generating AI models that engage in intellectual property theft, such as DALL-E, Midjourney, and Stable Diffusion.

Where did Nightshade originate?
Nightshade was developed by computer scientists at the University of Chicago, under the direction of Professor Ben Zhao, as part of The Glaze Project. The group previously developed Glaze, a tool that confuses AI training algorithms by changing the way they perceive the style of digital art.

How does it work?
PyTorch, an open-source machine-learning framework, is used to tag images down to the pixel level. The tags aren't obvious to humans viewing the image, but AI models perceive them differently, adversely affecting how the images can be used for training.
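To make the idea concrete, here is a minimal toy sketch of pixel-level perturbation. This is not Nightshade's actual algorithm: the real tool computes carefully optimized perturbations (using PyTorch) so that models misinterpret the image's content, whereas this illustration just adds bounded random noise to show how a change can stay below what a human viewer would notice while still altering the values a model trains on.

```python
import numpy as np

def poison_image(image: np.ndarray, epsilon: int = 2, seed: int = 0) -> np.ndarray:
    """Add an imperceptible, bounded perturbation to an 8-bit RGB image.

    Toy illustration only: Nightshade computes *optimized* adversarial
    perturbations, not random noise. Here each pixel shifts by at most
    +/- epsilon intensity levels (out of 255), far below visible change.
    """
    rng = np.random.default_rng(seed)
    delta = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Clip back to the valid 8-bit range after perturbing.
    return np.clip(image.astype(int) + delta, 0, 255).astype(np.uint8)

# A flat gray 4x4 RGB image as a stand-in for an artwork.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = poison_image(img)

# The change is visually negligible, but the pixel data a model
# would train on is no longer the original.
max_change = int(np.abs(poisoned.astype(int) - img.astype(int)).max())
```

The key property, which Nightshade's optimized version shares, is that the perturbation budget (`epsilon` here) keeps the image looking unchanged to people while altering what a training pipeline ingests.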
