In the 1990s, when mass food production had introduced hundreds of novel ingredients and industrial processes, people lost significant visibility into what they were eating. Nutrition labels emerged as a transparency measure, helping consumers make informed choices about what they consume.
Today, digital content creation is experiencing its own industrial revolution. Canva logged 16 billion AI-powered feature uses last year 1, Midjourney surpassed 21 million users by March 2025 2, and 71% of businesses and 83% of creators report using AI tools in their content workflows 3. This shift has introduced new intermediaries and processes that make it harder to trace the origins of digital content. Without transparency, it is nearly impossible for artists and platforms to maintain creative control and proper attribution, or for audiences to connect with increasingly intermediated and automated creators.
The music industry, for one, is facing a tipping point: copyright infringement from unauthorised model training, artificially inflated play counts, and fake AI tracks swamping streaming platforms 4. The likes of Universal Music Group, Warner, and Sony are pushing for the technology to serve artists and enhance their creativity instead of replacing them. As Jeremy Uzan of Universal Music Group says, “AI can be used to assist the artist” as the label pursues an artist-centric approach: for example, using AI to translate Brenda Lee’s iconic ‘Rockin’ Around the Christmas Tree’ into Spanish, or to enhance archival Beatles audio, demonstrating how human creative input combined with AI assistance can work commercially and legally. However, the current AI tools intended to foster new creative opportunities often lack the granular attribution controls and auditability needed to give creators this level of supply-chain transparency.
Like nutrition labels before them, the Content Authenticity Initiative and the Creator Assertions Working Group (CAWG) are trying to restore the transparency that mass automation has obscured. Such efforts, deployed at scale, would give audiences the information they need to make informed choices about the media they consume, without dictating their decisions.
The Foundational Infrastructure
The C2PA Content Credentials Specification 5 acts as the foundation for this media transparency. It cryptographically binds an origin ‘label’ to a digital asset, recording how the asset was created. The CAWG 6 builds on C2PA with a framework for attaching the ‘who’ and ‘why’ to an asset as ‘Content Credentials’. As Scott Perry, Co-Chair of CAWG and Conformance Program Administrator of C2PA, puts it, CAWG metadata brings “human claims to digital media”.
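To make the ‘label’ concrete, here is a minimal Python sketch of the shape of a Content Credential, assuming a simplified JSON layout. Real manifests are CBOR/JUMBF structures embedded in the asset and signed with COSE, and the field names here only approximate the C2PA and CAWG specifications.

```python
import hashlib
import json

def sketch_manifest(asset_bytes: bytes) -> dict:
    """Build a simplified, illustrative Content Credential for an asset."""
    return {
        "claim_generator": "ExampleEditor/1.0",  # the tool that made the claim
        "assertions": [
            {
                # C2PA records *how* the asset came to be...
                "label": "c2pa.actions",
                "data": {"actions": [{"action": "c2pa.created",
                                      "when": "2025-03-01T12:00:00Z"}]},
            },
            {
                # ...while the CAWG identity assertion attaches the *who*.
                "label": "cawg.identity",
                "data": {
                    "signer_payload": {
                        "referenced_assertions": ["c2pa.actions"],
                        "sig_type": "cawg.identity_claims_aggregation",
                    },
                    "signature": "<creator credential signature>",
                },
            },
        ],
        # The credential is cryptographically bound to the exact bytes:
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
    }

print(json.dumps(sketch_manifest(b"example image bytes"), indent=2))
```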
In the real world, this looks like Google’s Pixel 10 shipping with C2PA conformance built in, YouTube tagging select videos as “captured with a real camera”, and LinkedIn marking images with a “Cr” symbol to show when they carry Content Credentials. These tags relay information such as whether AI was used to generate or edit part of the content, which entity created the Content Credential, and when the credential was created. But, as Eric Scouten, Co-Chair of CAWG and Identity Standards Architect at Adobe, stresses, “one of the biggest misconceptions about [CAWG] is that it is something like Snopes or Politifact, out to say what’s true and what’s not, and that’s not the case.” CAWG does not arbitrate truth. It does not fact-check or attach political judgments. Instead, it provides signals about who created a piece of content, when, how, and in what context. The decision to ‘believe’ the content remains with the viewer.
When FKA Twigs created her own AI clone for fan interactions, she demonstrated the difference between artist-controlled AI use and unauthorised exploitation 7. Once people knew that she stood behind her AI clone, the work felt legitimate, and trust flowed from the person to the work. With provenance infrastructure, fans could verify which AI interactions were officially sanctioned by Twigs herself and which were not. Not because Content Credentials determine what’s ‘real’, but because they provide verifiable provenance information about how a digital asset was created, including an authorisation trail from creation to consumption.
The Challenges with Creator Identities
The FKA Twigs example hints at a much larger challenge ahead. Apps like Character AI already see ~9 million users per day, and as AI clones, virtual personas, and agentic creators proliferate, the range of online ‘identifiers’ grows significantly more complex.
Even today, navigating the multiple digital identifiers of creators, from professional personas to social media handles and artist pseudonyms, is a fragmented journey. This plurality of creator identities creates a fundamental mismatch with existing identity verification systems. Large organisations like newsrooms and major labels rely on X.509 certificates and PKI systems that fit enterprise workflows and secure their own supply chains, for example by disincentivising pre-release leaks. But individual creators don’t operate in that world. For them, identity lives in social handles and personal websites, where identifiers are informal and often platform-bound.
CAWG’s framework bridges this gap by accepting both ends of the spectrum. Its Identity Claims Aggregator mechanism verifies these disparate identity signals through trusted third parties and issues a single verifiable credential that binds the creator’s chosen identity to their content. This gives creators a direct, human voice in the history of their work, rather than only recording what the device or app logged along the way. As Eric explains, “the point of the identity assertion is that it is a framework that allows a lot of different things to be plugged into it.” The design is deliberately credential-agnostic, giving creators the flexibility to bring their own chosen identity signals. Future versions of the CAWG identity framework will likely add support for generic W3C Verifiable Credentials and self-controlled credentials such as those being developed by the First Person Project.
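As a rough illustration, the credential such an aggregator issues might look like the sketch below, shaped after the W3C Verifiable Credentials data model. The field names (`verifiedIdentities`, `cawg.social_media`, and so on) loosely follow the CAWG identity assertion draft, and the DIDs and timestamps are placeholders; check the current specification before relying on any of them.

```python
import json

# Placeholder DIDs and values; field names loosely follow the CAWG
# identity assertion draft and the W3C VC data model.
aggregated_credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "IdentityClaimsAggregationCredential"],
    "issuer": "did:web:aggregator.example",       # the trusted aggregator
    "credentialSubject": {
        "id": "did:example:creator-123",          # the creator's chosen identity
        "verifiedIdentities": [
            {   # each entry is one independently verified identity signal
                "type": "cawg.social_media",
                "username": "@example_artist",
                "uri": "https://social.example/@example_artist",
                "verifiedAt": "2025-01-15T09:00:00Z",
            },
            {
                "type": "cawg.web_site",
                "uri": "https://example-artist.com",
                "verifiedAt": "2025-01-15T09:05:00Z",
            },
        ],
    },
}
print(json.dumps(aggregated_credential, indent=2))
```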
Major labels and organisations like Universal Music Group and the Recording Industry Association of America are already exploring ISNI (International Standard Name Identifier) for artist identities. In practice, this lets labels and managers attach industry-recognised identifiers to digital assets, protecting an artist’s image and likeness in their content. But the approach still has its challenges. ISNI faces the perennial problem of universal standards adoption, and, as with most industries, there is no single, publicly resolvable identifier for creators today. Scott takes a pragmatic approach to the universal identifier problem: “each industry should publish its best practices alongside the normative standard - i.e. saying this is the state of play in music right now. This is the best you can do, do it this way, we're working on it. Then as that evolves, as it updates, you have one place to go for anyone who wants to know how to identify music.”
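For a flavour of what machine-checkable identifiers buy you, here is a small sketch validating an ISNI check digit with ISO 7064 MOD 11-2, the checksum scheme the standard uses. The sample value is illustrative; a production system would also resolve the identifier against the ISNI registry.

```python
def isni_is_valid(isni: str) -> bool:
    """Check an ISNI's structure and ISO 7064 MOD 11-2 check character."""
    chars = isni.replace(" ", "").replace("-", "")
    if len(chars) != 16:
        return False
    total = 0
    for ch in chars[:15]:                  # first 15 characters are digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    check = (12 - total % 11) % 11         # ISO 7064 MOD 11-2 check value
    expected = "X" if check == 10 else str(check)
    return chars[15].upper() == expected

print(isni_is_valid("0000 0001 2103 2683"))   # True: check digit matches
```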
CAWG’s strength lies in anticipating this plurality of identity and evolution. The framework is designed to incorporate new credential types as they emerge, from today’s ISNI and social accounts to tomorrow’s W3C Verifiable Credentials and even agentic identity systems.
This adaptability is particularly critical for media industries because digital content can be discovered and consumed decades after creation. Provenance data needs to persist across the content’s entire lifecycle, requiring what Eric calls “archival-quality identity”. Unlike transactional systems that only need authorisation at the point of use, such as purchasing an item online, media attribution can become more valuable over time as content gains cultural significance or commercial success. Sample clearances, royalty disputes, and copyright claims can arise years later, demanding granular, persistent attribution records that today’s token-based identity models like OAuth don’t provide.
As Eric explains, “if I produce a piece of content today, and you happen to find it in 2030 or 2040, I would like you to be able to understand that it was me that produced that, and to have confidence that you correctly attribute it to me. But that sort of lasting, archival quality identity, is shaky. I think the AI systems are especially shaky on that front.”
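The contrast is easy to see in code. The sketch below, assuming the Python `cryptography` package, shows why a signature bound to content has no expiry the way an OAuth token does. The genuinely hard part, preserving the public key and the credential that binds it to a person for decades, is the archival-identity problem Eric describes.

```python
from datetime import datetime, timezone
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def token_still_valid(expires_at: datetime) -> bool:
    # Transactional model: authorisation exists only until expiry,
    # then the token is useless as evidence of anything.
    return datetime.now(timezone.utc) < expires_at

# Attribution model: sign once at creation time...
creator_key = Ed25519PrivateKey.generate()
content = b"master recording bytes"
signature = creator_key.sign(content)

# ...and verify at any later date; no expiry is involved. What must
# survive is the public key and its binding to the creator.
public_key = creator_key.public_key()
try:
    public_key.verify(signature, content)
    print("attribution verifiable, regardless of elapsed time")
except InvalidSignature:
    print("content or signature was altered")
```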
But what if every track could carry its creative history? A kind of musical DNA that travels with the content, recording not just what was made, but who made it, how, and under what authority.
The Complexity of Agentic Identity
This kind of content DNA becomes essential with agentic AI systems. Unlike generative AI tools that simply transform an input into an output, agents pursue goals over time, coordinating multiple tools and delegating to other agents. When a music producer delegates post-production to an AI agent that then assigns harmonisation to one agent and mastering to another, that non-determinism means every delegation, agent version, and training input must be recorded in case of future disputes.
This creates a fundamental distinction in attribution requirements. Tools are deterministic; their provenance is handled by C2PA, which can reliably attest to what happened inside a capture device or editing suite. Agents are non-deterministic, making autonomous choices and passing work along delegation chains. CAWG addresses this by developing persistent, verifiable identifiers that survive across delegations and allow authorisation chains to be traced.
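What a traceable delegation chain might record can be sketched simply. The record fields below are hypothetical, not from any CAWG schema, but they capture the minimum a future dispute would need: who delegated, to which agent, at which version, for what task, and when.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DelegationRecord:
    delegator: str        # who handed the work off (person or agent)
    agent_id: str         # persistent identifier of the receiving agent
    agent_version: str    # agents are non-deterministic; versions matter
    task: str
    timestamp: str

def trace_authority(chain: List[DelegationRecord]) -> str:
    """Render a delegation chain as a readable authorisation trail."""
    hops = [f"{r.delegator} -> {r.agent_id}@{r.agent_version} ({r.task})"
            for r in chain]
    return "\n".join(hops)

chain = [
    DelegationRecord("producer:alice", "agent:postprod", "2.1",
                     "post-production", "2025-06-01T10:00:00Z"),
    DelegationRecord("agent:postprod", "agent:harmony", "1.4",
                     "harmonisation", "2025-06-01T10:02:00Z"),
    DelegationRecord("agent:postprod", "agent:master", "3.0",
                     "mastering", "2025-06-01T10:15:00Z"),
]
print(trace_authority(chain))
```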
In media industries, the complexity extends beyond identity into rights management and remuneration. The JPEG Trust Initiative, an ISO standards effort collaborating with CAWG, is standardising how usage permissions and commercial terms travel with content. Together, C2PA, CAWG, and JPEG Trust form a layered trust stack, proving what happened, who did it, and under what rights.
This infrastructure enables critical use cases for the agentic web, such as the following (the first is sketched in code after the list):
- AI Disclosure Granularity: Moving beyond binary “AI or not AI” labels to capture the spectrum of AI involvement.
- Copyright Protection: Recording the types and quantities of input from humans, agents, instruments, or other sources to establish legal protection for mixed human-AI works, since fully AI-generated works cannot be copyrighted.
- Platform Identification: Indicating content boundaries and licensing restrictions while maintaining creator control over broader commercial use.
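As an example of the first item, the sketch below shows how the IPTC digital source type vocabulary, which C2PA action assertions can reference, already expresses a spectrum rather than a binary. The mapping from plain-English descriptions to codes is illustrative, not normative.

```python
# Moving past binary "AI or not" labels with the IPTC digital source
# type vocabulary. The URIs are real IPTC NewsCodes; the mapping of
# descriptions to codes is an illustrative choice, not a standard.
IPTC = "http://cv.iptc.org/newscodes/digitalsourcetype/"

SPECTRUM = {
    "camera capture, no edits":     IPTC + "digitalCapture",
    "human work, minor AI cleanup": IPTC + "algorithmicallyEnhanced",
    "human/AI composite":           IPTC + "compositeWithTrainedAlgorithmicMedia",
    "fully AI-generated":           IPTC + "trainedAlgorithmicMedia",
}

def disclosure_assertion(description: str) -> dict:
    """Build a hypothetical action assertion carrying the granular label."""
    return {
        "label": "c2pa.actions",
        "data": {"actions": [{
            "action": "c2pa.created",
            "digitalSourceType": SPECTRUM[description],
        }]},
    }

print(disclosure_assertion("human/AI composite"))
```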
These capabilities can unlock automated royalty distribution, combat unauthorised training data use, and support new discovery mechanisms between creators and fans. Alongside these market-driven opportunities, regulatory pressure is simultaneously accelerating Content Credential adoption across industries.
The Drivers: From Compliance to Opportunity
Steps towards mandating content labelling have already begun. California has passed legislation that will fine platforms $5,000 a day for failing to label AI content, with enforcement targeted for 2026 8. The EU’s AI Act similarly introduces disclosure requirements for AI-assisted and AI-generated content from 2026 9. These disclosure laws will catalyse C2PA adoption, because platforms will need infrastructure that records content provenance and AI involvement in order to comply.
Some may see these laws as a regulatory burden, or worry that they will create surveillance infrastructure forcing creators to expose more than they wish, but the technical reality is different. C2PA alone records only the tools used and when, allowing complete creator anonymity. CAWG likewise gives creators full control over what they disclose. The architecture enables privacy by letting creators choose which identity signals amplify their message or serve their attribution goals. There’s no requirement to tie your entire identity to one piece of content.
To further increase creator flexibility, CAWG is developing a new mechanism called ‘identity hooks’ that delays attribution decisions until creators know which identity signals they need. When a creator is authenticated in a phone or editing tool, that system can both sign via C2PA and attest that the creator was logged in during the creation process. This establishes a stable anchor at the time of creation that creators can hook back into later, when they need to attach a relevant persona or credential. As Andrew Dworschak, Co-founder of Yakoa, says, “[Identity hooks] bring flexibility so that a creator can have maximum optionality down the line when they realise they need [attribution] to support their content flow”.
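Since identity hooks are still being specified, the following Python sketch is purely hypothetical: it only illustrates the idea of a stable anchor minted at creation time, to which a creator can bind a chosen persona or credential much later.

```python
import hashlib
import secrets
from datetime import datetime, timezone

# Hypothetical illustration of the identity-hook idea; none of these
# field names come from a published CAWG schema.

def create_hook(asset_bytes: bytes) -> dict:
    """At creation: the tool attests a logged-in session and mints an anchor."""
    return {
        "anchor_id": secrets.token_hex(16),               # stable anchor
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "session_attestation": "creator-was-authenticated",
        "created_at": datetime.now(timezone.utc).isoformat(),
    }

def attach_identity_later(hook: dict, credential_ref: str) -> dict:
    """Months later: hook the chosen persona or credential back in."""
    return {
        "anchor_id": hook["anchor_id"],   # ties back to the original anchor
        "credential": credential_ref,     # e.g. a VC for the chosen persona
        "attached_at": datetime.now(timezone.utc).isoformat(),
    }

hook = create_hook(b"track master bytes")
binding = attach_identity_later(hook, "did:example:stage-persona")
print(binding["anchor_id"] == hook["anchor_id"])   # True: same anchor
```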
Andrew’s company builds digital rights protection technology for creators, and he sees even broader opportunities in Content Credentials: for example, “allowing people to connect with each other in new ways, come to new agreements, and share revenue in ways that are appropriate to them.” Even today, Yakoa’s AI monitoring tool can help identify where creators haven’t received proper attribution across platforms, shifting the conversation from reactive compliance to proactive rights-management infrastructure.
The Future of Creative Infrastructure
The infrastructure for content authenticity is already evolving. Regulations on content disclosure take effect as soon as 2026. Major technology providers and platforms have begun adopting the tools. The question isn’t whether this happens, but who will shape how it develops.
CAWG’s open, credential-agnostic approach creates infrastructure that serves all creators regardless of size or affiliation. The specifications continue to be written as new technologies emerge, and new provenance data types continue to be developed. Altogether, the ecosystem is enabling creative control while embracing AI’s collaborative potential.
For creative industries facing AI transformation, engaging with the working groups now means influencing the attribution systems that will eventually be as commonplace as nutrition labels.
Join the conversation:
- Participate in DIF’s Creator Assertions Working Group: Help shape the solutions that allow content creators to express individual and organisational intent about their content.
- Test the Tools: Experiment with identity assertions and provenance using Contentauthenticity.adobe.com
- Join the Content Authenticity Initiative: Help the growing cross-industry ecosystem that is restoring trust and transparency online.
Endnotes
- McGill, Justin. 2025. Midjourney Statistics. Brandwell. https://brandwell.ai/blog/midjourney-statistics/.
- McGill, Justin. 2025. Midjourney Statistics. Brandwell. https://brandwell.ai/blog/midjourney-statistics/.
- Singla, A., et al. 2025. The State of AI: How organizations are rewiring to capture value. QuantumBlack, AI by McKinsey.
- Forde, Eamonn. 2025. AI, bot farms and innocent indie victims: how music streaming became a hotbed of fraud and fakery. The Guardian.
- C2PA. C2PA Specifications.
- CAWG. CAWG Specifications.
- Youngs, Ian. 2024. FKA Twigs uses AI to create deepfake of herself. BBC News.
- CalMatters. SB 942: California AI Transparency Act.
- European Union. AI Act: Regulation (EU) 2024/1689.
A huge thank you to Eric Scouten, Scott Perry, Andrew Dworschak, Jeremy Uzan, and Erik Passoja for their time and insights in preparing this article.