Tech vs tech, for better tech.
How an ancient toxic substance is meant to improve a world overrun by AI.
Nightshade
This poison.
Juliet’s famous, fictional, faked death was built on it.
Russian czars like Ivan the Terrible suffered from it.
Agatha Christie used it more than once in her twisty murders with Poirot.
The Borgia family allegedly finished off some pesky rivals with it.
Now, the historically and fictionally famous poison might well be coming for our beloved new AI friends.
Well, kinda.
Nightshade is a tool developed to help fight the good battle against the evil corporations who are stealing creators' work without their consent. Except, of course, that the battle is murky (aren't they all), and the good/evil divide is not quite sharp, to most.
Developed by researchers at the University of Chicago, Nightshade will carry out its nefarious/noble aim by:
…changing the pixels of images in subtle ways that are invisible to the human eye, but manipulate machine-learning models to interpret the image as something different from what it actually shows.
Poisoned data samples can manipulate models into learning, for example, that images of hats are cakes, and images of handbags are toasters. The poisoned data is very difficult to remove, as it requires tech companies to painstakingly find and delete each corrupted sample.
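To make the idea concrete, here is a minimal conceptual sketch of a poisoned training sample. This is not Nightshade's actual method (which optimizes perturbations against a model's feature extractor so the change is targeted, not random); the function name, epsilon bound, and random noise are illustrative assumptions only. The point it shows: pixels shift by a tiny bounded amount a human would never notice, while the label attached to the sample is deliberately wrong.

```python
import numpy as np

def poison_sample(image, wrong_label, epsilon=2):
    """Conceptual sketch of a poisoned training sample.

    Adds a small bounded pixel perturbation (random here; Nightshade
    instead optimizes it against the target model) so the image still
    looks unchanged to a human eye, then pairs it with a deliberately
    mismatched label.
    """
    rng = np.random.default_rng(0)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    poisoned = np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)
    return poisoned, wrong_label

# A stand-in 64x64 RGB "hat" image, relabeled as "cake" when poisoned.
hat = np.full((64, 64, 3), 128, dtype=np.uint8)
poisoned_img, label = poison_sample(hat, wrong_label="cake")

# The perturbation is bounded: no pixel moved by more than epsilon.
max_shift = int(np.abs(poisoned_img.astype(int) - hat.astype(int)).max())
```

A model trained on enough such hat-labeled-cake samples starts associating the two, which is exactly the hats-are-cakes confusion described above; finding and deleting each corrupted sample afterwards is the painstaking part.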
This can be an innovative way for the wider creative community to push back against having their work exploited without consent or compensation. It sounds like a mid-term play, while AI companies spend time in court over the many cases in this space. But beyond pesky hat-cake problems, deterrence is the true game.
the hope is that it will help tip the power balance back from AI companies towards artists, by creating a powerful deterrent against disrespecting artists’ copyright and intellectual property.
Many certainly await and expect legal frameworks & licensing practices to evolve, and compensation models to emerge to bridge this gap. While the march of AI research is all but a given, respect for artists' rights cannot be collateral damage.
Tools like Nightshade look to shift leverage to creators.
Are there potential ethical issues and risks of abuse? Certainly, but not at scale yet, we are told.
attackers would need thousands of poisoned samples to inflict real damage on larger, more powerful models, as they are trained on billions of data samples.
Humans battling tech, using tech to stand against Tech.
Code to defeat code.
Programs to defeat programs.
I think someone might have been in touch with The Oracle, that lovely kindly lady baking cookies. Save me some?


