Adobe proposes a way to protect artists from AI ripoffs

Adobe launches tools to protect digital artistry authenticity.

Adobe aims to combat AI-driven content misuse by launching its Content Authenticity web app in beta in early 2025. The app combines digital fingerprinting, invisible watermarking, and cryptographically signed metadata so that creative works remain attributed to their original creators across different formats.

Additionally, Adobe plans to release a Content Authenticity Chrome extension and an Inspect tool on its website, making content credentials easier to view and verify. The company is collaborating with industry players including camera manufacturers, Microsoft, and OpenAI, though whether those partners will integrate Adobe's credentials into their own products remains under consideration.

Adobe says its own generative AI tool, Firefly, is trained only on content it has permission to use. The company is also partnering with Spawning to let artists control whether their work is included in AI training datasets, part of a broader push for content authenticity and transparency that has drawn significant user engagement and industry support.