OpenAI has added a new tool to detect if an image was made with its DALL-E AI image generator, as well as new watermarking methods to more clearly flag content it generates.
In a blog post, OpenAI announced that it has begun developing new provenance methods to track content and determine whether it was AI-generated. These include a new image detection classifier that uses AI to assess whether an image was AI-generated, as well as a tamper-resistant watermark that can tag content like audio with invisible signals.
The classifier predicts the likelihood that a picture was created by DALL-E 3. OpenAI claims the classifier works even if the image is cropped or compressed, or the saturation is changed. While the tool can detect if images…