Adobe aims to develop a robots.txt-style indicator for images used in AI training
Adobe's tool lets creators signal that their images should not be used for AI training.

Adobe has embarked on an initiative to create, for images, a signal akin to the robots.txt file that websites use to steer web crawlers away from content they want excluded from AI training datasets. Named the Adobe Content Authenticity App, the tool allows creators to append content credentials to as many as 50 JPG or PNG files at once. The effort responds to creators' frequent concerns about unauthorized use of their work in generative AI training. The tool not only lets creators assert originality and ownership through their credentials, including their name and social media links, but also includes an option to signal that an image should not be used to train AI models.
In partnership with LinkedIn, Adobe has incorporated verification services to ensure that credentials are attached by verified identities, providing a layer of trust for content creators using the tool. Andy Parsons, Senior Director of Adobe's Content Authenticity Initiative, stressed the importance of giving creators control over how their content is used, particularly in the realm of AI, where copyright legislation remains fragmented globally. Adobe combines digital fingerprinting, open-source watermarking, and cryptographic metadata so that even modified images retain their creator credentials. However, the endeavor remains challenging without formal agreements from major AI companies to respect these credentials.
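To make the "do not train" signal concrete, the sketch below builds a simplified, C2PA-style manifest fragment pairing creator identity with a training opt-out. The field labels loosely follow the shape of the published C2PA training-and-data-mining assertion, but the claim-generator string, helper name, and creator details are hypothetical, and the real workflow also cryptographically signs the manifest and embeds it in the image file, steps omitted here.

```python
import json

def build_do_not_train_manifest(creator_name: str, profile_url: str) -> dict:
    """Sketch of a C2PA-style manifest fragment: creator identity plus a
    'do not train' signal. Labels approximate the C2PA training-and-data-
    mining assertion; signing and embedding are intentionally left out."""
    return {
        "claim_generator": "example-authenticity-app/1.0",  # hypothetical
        "assertions": [
            {
                # Creator identity, expressed as schema.org CreativeWork data.
                "label": "stds.schema-org.CreativeWork",
                "data": {
                    "author": [
                        {"@type": "Person",
                         "name": creator_name,
                         "url": profile_url}
                    ]
                },
            },
            {
                # Per-use training permissions, each set to allowed/notAllowed.
                "label": "c2pa.training-mining",
                "data": {
                    "entries": {
                        "c2pa.ai_training": {"use": "notAllowed"},
                        "c2pa.ai_generative_training": {"use": "notAllowed"},
                    }
                },
            },
        ],
    }

manifest = build_do_not_train_manifest("Jane Doe", "https://example.com/janedoe")
print(json.dumps(manifest, indent=2))
```

A crawler that honors the signal would parse this metadata from the image, find the training-mining assertion, and skip the file, much as a polite crawler skips paths disallowed by robots.txt.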
The tool's functionality is complemented by a Chrome extension that lets users identify images carrying content credentials on major platforms such as Instagram. This matters because many platforms do not natively display Adobe's content credentials. Even so, Adobe's solution faces hurdles similar to those of previous initiatives, such as Meta's AI labels on images, which drew resistance from users and were frequently misinterpreted.
Adobe's app is part of a broader effort under the Coalition for Content Provenance and Authenticity (C2PA), which aims to standardize methods for asserting content authenticity. Adobe and Meta, both C2PA participants, currently take slightly different approaches to handling content authenticity, reflecting the complex landscape of digital rights management in the era of AI. Parsons clarified that Adobe's purpose is not to dictate the nature of art through AI but to give creators a symbolic signature indicating ownership and authorship, without necessarily altering intellectual property rights.
Although initially targeting images, Adobe foresees extending this tool to encompass video and audio formats in the future. With the continuous evolution of AI, managing ethical and rights-based challenges has become crucial. The tool reflects an intersection of technology and ethics, underscoring Adobe's strategy to ensure transparency and respect for creators' rights in the digital arena.
Sources: TechCrunch, C2PA, Adobe.