Nightshade Protects Artists’ Work from AI Theft – IOTW Report

venturebeat

It’s here: months after it was first announced, Nightshade, a new, free software tool that lets artists “poison” AI models seeking to train on their works, is now available for artists to download and use on any artworks they see fit.

Developed by computer scientists on the Glaze Project at the University of Chicago under Professor Ben Zhao, the tool essentially works by turning AI against AI. It uses the popular open-source machine learning framework PyTorch to identify what’s in a given image, then applies a tag that subtly alters the image at the pixel level, so that other AI programs see something totally different from what’s actually there.
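Nightshade’s actual algorithm is not described in this article, but the general idea it relies on, a pixel-level change that is nearly invisible to humans yet shifts a model’s prediction, can be sketched with a toy example. Everything below is illustrative and hypothetical: the linear “model,” the `poison` function, and the epsilon value are assumptions for the sketch, not Nightshade’s real implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model_score(image, weights):
    """Hypothetical linear scorer standing in for an image classifier:
    a higher score means the 'model' is more confident in its label."""
    return float(image.ravel() @ weights.ravel())

def poison(image, weights, epsilon=0.05):
    """FGSM-style perturbation (a standard adversarial-example technique,
    not Nightshade's published method): nudge each pixel by at most
    +/- epsilon in the direction that lowers the model's score, keeping
    the visual change negligible."""
    gradient = weights  # for a linear score, the gradient w.r.t. pixels is the weights
    perturbed = image - epsilon * np.sign(gradient)
    return np.clip(perturbed, 0.0, 1.0)  # keep pixels in a valid range

image = rng.random((8, 8))               # stand-in for an artwork
weights = rng.standard_normal((8, 8))    # stand-in for model parameters

clean_score = toy_model_score(image, weights)
poisoned = poison(image, weights)
poisoned_score = toy_model_score(poisoned, weights)

# Each pixel moved by at most epsilon, yet the model's score dropped:
print(float(np.max(np.abs(poisoned - image))), clean_score, poisoned_score)
```

The point of the sketch is the asymmetry the article describes: to a person the two 8×8 arrays look essentially identical, but the toy model scores them very differently.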
