Glaze 2: new version of anti-AI scraping tool for artists launches, video defense planned

A shield decorated with paint splotches and brushes to resemble an artist's palette.
Credit: VentureBeat made with OpenAI DALL-E 3

Back in February 2023, a small team of researchers at the University of Chicago studying under computer science professor Ben Zhao released Glaze, a free software tool that uses machine learning to subtly alter the pixels of an artwork provided by a user, changing the way its style is perceived by any AI art generator model that scrapes and trains on that artwork. In other words: an image done in a hand-drawn style could appear to an AI to be a watercolor painting or CGI artwork, something entirely different, while still appearing normal to the human viewer.

The goal of Glaze was simple: “to help artists disrupt AI models trying to mimic their artistic style, without adversely impacting their own artwork,” according to a research paper by its creators Shawn Shan, Jenna Cryan, Emily Wenger, Haitao Zheng, Rana Hanocka, and Ben Y. Zhao, which was peer reviewed and published at USENIX Security 2023, where it won multiple awards, including a Distinguished Paper Award and the Internet Defense Prize.

Why do this? As the Glaze Project team stated on their website:

“For artists whose styles are intentionally copied, not only do they see loss in commissions and basic income, but low quality synthetic copies scattered online dilute their brand and reputation. Most importantly, artists associate their styles with their very identity. Seeing the artistic style they worked years to develop taken to create content without their consent or compensation is akin to identity theft.”


Basically, an artist could create a still image by hand or with digital programs, then, before uploading it to the web for public display or sale, download Glaze for free from the University of Chicago website and run it to alter the artwork, preventing AI models from successfully training on it and, in turn, from giving AI users the ability to mimic the original artist’s style. To a human eye, the “glazed” artwork would look very similar to the original, but to an AI it would be radically different, and attempting to emulate the style would produce something else entirely. It is a way for artists to stop their unique style from being emulated by AI, though it only works on artworks that haven’t already been scraped in the AI dragnet.
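
For readers who want to picture what “subtly altering the pixels” means in practice, here is a rough conceptual sketch in PyTorch of the general style-cloaking idea: optimize a small, bounded perturbation that pulls an image’s features, as seen by a vision model, toward a different style while leaving the pixels visually close to the original. This is not the Glaze implementation or its actual optimization; the ResNet feature extractor, filenames, step count, and pixel budget below are placeholder assumptions for illustration only (the real tool targets the image encoders used by generative models and constrains changes with a perceptual similarity metric).

    # Conceptual sketch only -- NOT the Glaze code. Illustrates the general idea of
    # "style cloaking": nudge an image's features toward a different target style
    # while keeping the pixel-level change small enough to be hard to notice.
    import torch
    import torch.nn.functional as F
    from torchvision import models, transforms
    from PIL import Image

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # A generic pretrained classifier stands in for a generative model's image
    # encoder (an assumption made purely for illustration).
    backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
    feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-1]).to(device).eval()
    for p in feature_extractor.parameters():
        p.requires_grad_(False)

    preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

    def features(x):
        return feature_extractor(x).flatten(1)

    # Hypothetical filenames: the artist's original piece and an image in the style
    # the cloak should push the features toward.
    original = preprocess(Image.open("my_artwork.png").convert("RGB")).unsqueeze(0).to(device)
    target_style = preprocess(Image.open("target_style.png").convert("RGB")).unsqueeze(0).to(device)

    delta = torch.zeros_like(original, requires_grad=True)  # the "cloak" perturbation
    optimizer = torch.optim.Adam([delta], lr=0.01)
    budget = 0.03  # max per-pixel change, so the result still looks like the original

    with torch.no_grad():
        target_feat = features(target_style)

    for step in range(200):
        cloaked = (original + delta).clamp(0, 1)
        loss = F.mse_loss(features(cloaked), target_feat)  # pull features toward the target style
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)  # keep the change visually subtle

The end result, original plus the optimized perturbation, is the kind of image an artist would post publicly: it looks essentially unchanged to people, but a model training on it picks up misleading style features.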

Many artists flocked to the tool, with more than 2.3 million downloads as of March 2024, as well as to the team’s hit follow-up open source program, Nightshade, which seeks to “poison” AI models training on artists’ works without consent.

But now, more than a year later, the University of Chicago Glaze Project team is back with a new version of their first offering: Glaze 2, which they say is faster for artists to use and provides more protection against newer AI models, including Stable Diffusion XL (SDXL), an open source text-to-image model that users can fine-tune to emulate a specific artist’s style.

How Glaze 2 improves over the original

When Glaze was first released, the original research paper noted that “it takes an average of 1.2 mins on Titan RTX GPU and 7.3 mins on a single Intel i7 CPU” to glaze a single artwork.

At the same time, it proved highly effective at disrupting AI’s ability to mimic an artist’s style, achieving over 92% success in preventing style mimicry under typical conditions and 85% against adaptive countermeasures designed to overcome Glaze’s protections.

Glaze 2, however, offers a 50% to 500% increase in speed when modifying images, depending on the user’s individual computer hardware.

“Computation speed up is significant,” wrote Glaze Project team leader Ben Zhao, in an email to VentureBeat. “It generally means Glaze 2 runs nearly twice as fast as Glaze 1.1. Some older GPUs go from 4 mins to 2 mins per image. Others go from 50 seconds to 30 seconds per image.”

The biggest speedup is for Macs running M1-M3 processors, which can see up to a 5X (500%) improvement, according to the documentation on the Glaze Project’s website.
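
As a quick back-of-the-envelope check, the per-image times Zhao quoted map onto that percentage range roughly as follows (the tiny Python helper is purely illustrative; the times come from his email):

    # Convert the per-image times quoted above into speedup figures.
    def speedup(old_seconds: float, new_seconds: float) -> str:
        factor = old_seconds / new_seconds
        return f"{factor:.1f}x as fast ({(factor - 1) * 100:.0f}% faster)"

    print("Older GPU, 4 min -> 2 min:", speedup(4 * 60, 2 * 60))  # 2.0x as fast (100% faster)
    print("Newer GPU, 50 s -> 30 s:", speedup(50, 30))            # 1.7x as fast (67% faster)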

As for how Glaze 2 protects against SDXL and why the additional protection is needed, the Glaze 2 download page states that it now protects “smooth surface art (e.g. anime, cartoon).” Zhao also elaborated, in email responses to VentureBeat’s questions, that “Glaze 2 provides even stronger protection (more disruptive artifacts when trying to mimic an artist whose work has been protected by Glaze 2 than with Glaze 1).”

Asked whether Glaze 2 would protect artists against style mimicry by other AI art generators such as OpenAI’s DALL-E 3 or Midjourney, Zhao explained that the focus was on Stable Diffusion, since it is the leading open source AI art generator model and, as such, the one most open to being fine-tuned by a user to mimic a particular artist’s style.

“Stable Diffusion is by far the primary way that people mimic artists,” Zhao said. “As a tool to disrupt mimicry, Glaze’s main focus is Stable Diffusion’s models.  There is no easy way for people to use [OpenAI’s] Dalle 3, or Firefly or Gemini to mimic an artist…”

Video protections planned

The Glaze Project team at the University of Chicago is all-in on Glaze 2: Zhao told VentureBeat in his email that Glaze 1 would no longer be made available for download because “There is no point in supporting Glaze 1. Glaze 2 is a drop-in replacement that should be better than Glaze 1 in every way.”

Yet the team is also focused on offering more protection against unauthorized AI scraping for artists in a new medium: video.

“In addition to Glaze2.0, we are working on a project extending Glaze-like protection to short videos and animations,” posted the Glaze Project team from their account on X. “(Our prior user study led to a paper to be presented at CCS 2024),” referencing the ACM Conference on Computer and Communications Security.

The Glaze researchers also followed this up with a call for user participation in a survey to judge how well AI can replicate frames of a video, and how well Glaze’s new version can disrupt the mimicry.

If the video tool performs anything like Glaze 1 and 2 appear to, it is likely to be another big hit with artists seeking to protect their creations from unauthorized AI scraping.