I tried Glaze, so you don’t have to.

I found it on ARTnews: “New Data ‘Poisoning’ Tool Enables Artists To Fight Back Against Image Generating AI”.

I had to start it somewhere, so it started there.

The story talks about a tool, …

… called Nightshade, that enables artists to add invisible pixels to their art prior to being uploaded online. These data samples “poison” the massive image sets used to train AI image-generators such as DALL-E, Midjourney, and Stable Diffusion, destabilizing their outputs in chaotic and unexpected ways as well as disabling “its ability to generate useful images”, reports MIT Technology Review.

Reportedly, the team behind Nightshade plans to add it to their existing tool, Glaze, “which disrupts the ability for AI image generators to scrape images and mimic a specific artist’s personal style”.

Both Glaze and Nightshade sound really interesting, so I downloaded Glaze to try it out. The initial download was only 207 MB, but upon opening the app for the first time, it proceeded to download another 4.8 GB of additional resources. These get dumped into a hidden .glaze folder on your system; if you want to uninstall the app, you have to find and delete this folder manually, though the website’s F.A.Q. does tell you where the assets live.
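If you do want to reclaim that space, something like the rough Python sketch below would do it. I’m assuming here that the folder sits at ~/.glaze in your home directory, so check the F.A.Q. for the actual location on your platform before running anything destructive.

```python
# Rough cleanup sketch: removes the hidden .glaze resource folder.
# Assumption: the folder lives at ~/.glaze -- confirm the real location
# in the Glaze F.A.Q. for your platform before deleting anything.
import shutil
from pathlib import Path

glaze_dir = Path.home() / ".glaze"

if glaze_dir.is_dir():
    # Report how much space the downloaded resources are taking up.
    size_gb = sum(f.stat().st_size for f in glaze_dir.rglob("*") if f.is_file()) / 1e9
    print(f"Removing {glaze_dir} ({size_gb:.1f} GB)")
    shutil.rmtree(glaze_dir)
else:
    print(f"No {glaze_dir} found; nothing to remove")
```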

I then “glazed” my first image, a 463 KB JPEG file. At the default Medium settings, the app took 42 minutes to process this one file. The resultant image was 1.7 MB, and suffered noticeable and damning artifacting.

Click these images to view the full-size JPGs.

Source image:

Glazed image:
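If you’d rather measure the artifacting than squint at it, a quick difference image makes the added perturbation obvious. This is only a sketch: the filenames are placeholders, and it assumes you have Pillow installed.

```python
# Sketch: visualise how far the glazed JPEG has drifted from the source.
# Assumes Pillow is installed; "source.jpg" / "glazed.jpg" are placeholder names.
from PIL import Image, ImageChops

source = Image.open("source.jpg").convert("RGB")
glazed = Image.open("glazed.jpg").convert("RGB")

# Bring both images to the same dimensions in case the output was resized.
glazed = glazed.resize(source.size)

diff = ImageChops.difference(source, glazed)
print("Per-channel difference extrema:", diff.getextrema())

# Exaggerate the differences so the cloaking pattern is easy to see.
diff.point(lambda px: min(255, px * 8)).save("diff_x8.png")
```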

In conclusion, my findings are thus:

  • the glazed images are unusable from an artistic standpoint;
  • the process of glazing an image is too time-consuming to be useful in a photographic workflow (a rough sense of what that means in practice follows below).
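On that second point, the back-of-the-envelope arithmetic is bleak. The 42 minutes per image comes from my single run at the Medium preset, and the 200-image shoot is just an illustrative assumption, not a benchmark.

```python
# Back-of-the-envelope estimate: how long would glazing a typical shoot take?
# 42 min/image is my one measured run at Medium; 200 images is an assumed shoot size.
MINUTES_PER_IMAGE = 42
IMAGES_PER_SHOOT = 200

total_hours = MINUTES_PER_IMAGE * IMAGES_PER_SHOOT / 60
print(f"{IMAGES_PER_SHOOT} images x {MINUTES_PER_IMAGE} min = {total_hours:.0f} hours")
# -> 200 images x 42 min = 140 hours, i.e. several days of continuous processing
```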

It’s still early days for this technology, though, and it remains conceptually interesting. I hope it might evolve to the point where the results are less destructive to the image, and the process less disruptive to workflows. And maybe Nightshade, when it’s released to the public, will yield better results? We’ll have to wait to find out.