Armitage Archive

Highlight from UL NO. 405: My AI Bill Deep-Dive, AI Poisoning, an IR Prep Checklist, and Discovery++

Artists are fighting back against AI with Nightshade, a new tool that 'poisons' the training data AI models are built on. Developed by researchers at the University of Chicago, Nightshade alters an image's pixels in ways that are invisible to the human eye but confusing to AI models. A model trained on enough of these 'poisoned' images learns incorrect associations, for instance seeing a dog as a cat.
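Roughly speaking, attacks in this family optimize a small, bounded perturbation so an image's feature-space embedding drifts toward a different concept. The sketch below is not Nightshade's actual algorithm; it's a minimal illustration of that underlying idea, assuming PyTorch/torchvision, a stock ResNet-18 as a stand-in feature extractor, and random tensors as placeholders for real dog and cat photos.

```python
import torch
import torch.nn.functional as F
import torchvision.models as models

# Pretrained encoder as a stand-in feature extractor (input normalization
# omitted for brevity; a real attack would match the encoder's preprocessing).
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # expose the 512-d penultimate features
encoder.eval()
for p in encoder.parameters():
    p.requires_grad_(False)       # only the perturbation gets gradients

def embed(x: torch.Tensor) -> torch.Tensor:
    return F.normalize(encoder(x), dim=-1)

# Placeholders for real photos; swap in actual images in practice.
dog = torch.rand(1, 3, 224, 224)
cat = torch.rand(1, 3, 224, 224)

eps = 4 / 255        # L-infinity budget: keeps the change imperceptible
lr, steps = 1 / 255, 100

with torch.no_grad():
    target = embed(cat)  # the "cat" anchor in feature space

delta = torch.zeros_like(dog, requires_grad=True)
for _ in range(steps):
    # Pull the perturbed dog's embedding toward the cat anchor.
    loss = 1 - F.cosine_similarity(embed(dog + delta), target).mean()
    loss.backward()
    with torch.no_grad():
        delta -= lr * delta.grad.sign()               # PGD-style signed step
        delta.clamp_(-eps, eps)                       # stay within the budget
        delta.copy_((dog + delta).clamp(0, 1) - dog)  # keep pixels valid
    delta.grad.zero_()

poisoned = (dog + delta).detach()  # looks like a dog to us, embeds near "cat"
```

Nightshade itself goes further, per the researchers: it crafts prompt-specific poison samples efficient enough that a relatively small number of them can skew what a large text-to-image model generates for a concept, but the bounded feature-space optimization sketched above is the core mechanic.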