VentureBeat
It’s here: months after it was first announced, Nightshade, a new, free software tool that lets artists “poison” AI models seeking to train on their works, is now available for artists to download and use on any artworks they see fit.
Developed by computer scientists on the Glaze Project at the University of Chicago under Professor Ben Zhao, the tool essentially works by turning AI against AI. It makes use of the popular open-source machine learning framework PyTorch to identify what’s in a given image, then applies a tag that subtly alters the image at the pixel level so other AI programs see something totally different than what’s actually there.
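For the curious, here is a very rough sketch of the general idea the article is gesturing at: a small, gradient-based pixel perturbation that nudges what a classifier "sees" without visibly changing the picture. To be clear, this is not Nightshade's actual algorithm; the model (resnet18), file names, step size, and decoy label below are all assumptions made purely for illustration, in PyTorch since that is the framework the article names.

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Off-the-shelf classifier standing in for "an AI program looking at the image".
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
img = preprocess(Image.open("artwork.jpg").convert("RGB")).unsqueeze(0)
img.requires_grad_(True)

# Hypothetical "decoy" label we want the model to lean toward instead
# of what is actually in the picture (arbitrary ImageNet class index).
decoy = torch.tensor([285])

# One small signed-gradient step (FGSM-style) toward the decoy label,
# kept tiny so the change is hard for a human to see.
loss = torch.nn.functional.cross_entropy(model(img), decoy)
loss.backward()
epsilon = 2.0 / 255.0
poisoned = (img - epsilon * img.grad.sign()).clamp(0.0, 1.0).detach()

T.ToPILImage()(poisoned.squeeze(0)).save("artwork_poisoned.png")

A single step like this is only a toy demonstration that imperceptible pixel changes can steer a model's prediction; a real poisoning attack on text-to-image training is considerably more involved.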
B U L L S H I T
“AI programs see something totally different than what’s actually there.”
Yeah, liquor will do the same thing re women at closing time. Ask me how I know.
Considering what passes for art nowadays… https://www.youtube.com/watch?v=YooFwA7xj18