When Seeing is No Longer Believing
What artists need to know about C2PA and content credentials
Support ARTiculate for as little as the price of a coffee
This is a reader-supported newsletter. If ARTiculate has ever pointed you towards something useful, introduced you to an artist you didn't know, or given you something to think about, consider supporting it with a small donation via Ko-fi or PayPal. Even a one-off contribution helps keep it going.
It was a joy to be stopped in the street twice in one day by people commenting on the newsletter, and to hear from someone who applied for an opportunity they found through ARTiculate and got it. Knowing that what's shared here is making a practical difference means everything. Thank you to everyone who supports it, in whatever way you do. 🫶🏾
If you use Photoshop's healing brush, you are creating with AI. If you use digital tools that clean up audio recordings, that's AI too. It's been part of creative software for decades, and for most of us, that's not the problem. Where many find a problem is with generative AI: the systems that produce images, video, text and audio from a prompt, trained on existing material that has mainly come from humans.

You've probably seen the fruit and vegetable soap operas on TikTok and Instagram (and I'm not going to lie, I am awaiting part 3 of the Bananita and Mangito story, and I do stop and enjoy watching that guy respond to those cat videos with guilt. I'll stop soon). It's all surreal AI-generated slop that clogs our feeds and seeps into our everyday lives. As enticing as some of it may be, let's be honest: it devalues visual creativity.

Remember the Willy Wonka experience in Glasgow? An event sold on AI-generated promotional imagery and an AI-generated script that was massively disappointing in real life and left paying families furious (but gave us a lot of memes). There is also the growing number of reports about AI models built by scraping artists' work without permission, credit or payment.
It's right to be concerned about AI, and artist sentiment reflects that. According to Artsy's inaugural AI Survey 2026, released just last week and gathering responses from more than 300 gallery professionals, a third of artists working with galleries are critical of AI due to ethical concerns such as data scraping, and 31% are opposed to it entirely. Only 9% of gallery professionals consider AI-generated art a legitimate new medium, and 25% see it as a destabilising force for authorship and value. Collector appetite is limited — 41% of galleries say AI rarely comes up with collectors, and 16% report that collectors actively avoid work made with AI assistance. That's not surprising. But generative AI isn't going away, whether we like it or not. Instead of asking how we eradicate generative AI, the more useful questions may be how we protect our work, and how we hold ourselves and others to account when AI is involved. For artists thinking about their market and their reputation, transparency about how work is made is only going to become more important.
That's what I found myself thinking after attending a talk at CDI Collective, a community connecting creative, digital and tech businesses with academics, institutions and funders across the East Midlands. The speaker was Matt Ford, Founder and Creative Director of Origen Story, a storytelling platform creating films and interactive media that illuminate complex global challenges through ethical AI-enhanced visual storytelling and interactive experiences. His talk was titled "When Seeing is No Longer Believing." I registered expecting a conversation about spotting AI fakes, and although there was a fun game of spot-the-AI-video towards the end, what I left with was knowledge of C2PA, a global standard for content provenance. I couldn't help but think about how this would impact artists, and even myself as a curator and commissioner of art. As generative AI becomes more commonplace (and harder to distinguish) in creative work, I think it is going to become a cornerstone of how the creative industries operate.
C2PA stands for the Coalition for Content Provenance and Authenticity. Founded in 2021 by Adobe, Microsoft, the BBC, Intel, Arm and Truepic, it's a global open standard that allows any piece of digital media to carry a verifiable, tamper-evident record of its history. C2PA gives your digital work a trail, built in from the moment of creation.
The practical output is called Content Credentials, a verifiable record embedded directly into your file. Unlike ordinary metadata, if anyone tampers with a credentialled file, the system flags it. Content Credentials can record who created the work, when and where, what software was used, what edits were made, and whether AI tools were involved at any stage. They can also carry a Do Not Train signal, a machine-readable instruction telling AI companies not to use your file for model training. It's not yet legally enforced everywhere, but it puts your preference on record, and that will matter as legislation hopefully develops.
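For the technically curious, the "tamper-evident" idea can be sketched in a few lines of code. This is a toy illustration only, not the real C2PA mechanism (C2PA embeds a signed manifest backed by certificates, not a shared secret key; the key, field names and functions below are all hypothetical): a signature is computed over the file plus its metadata, so any later change to either one breaks verification.

```python
import hashlib
import hmac
import json

# Hypothetical signing key for the sketch; real C2PA uses
# certificate-based signatures, not a shared secret.
SECRET_KEY = b"demo-signing-key"

def attach_credentials(image_bytes: bytes, metadata: dict) -> dict:
    """Bundle a file with metadata and a signature over both."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    signature = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return {"image": image_bytes, "metadata": metadata, "signature": signature}

def verify_credentials(record: dict) -> bool:
    """Recompute the signature; any edit to image or metadata breaks it."""
    payload = record["image"] + json.dumps(record["metadata"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

record = attach_credentials(b"...image bytes...",
                            {"creator": "Jane Artist", "ai_used": False})
print(verify_credentials(record))   # True: the record is intact

record["metadata"]["ai_used"] = True  # tamper with the recorded history
print(verify_credentials(record))   # False: the tampering is detected
```

The point of the sketch is the second half: quietly changing the recorded history (here, flipping `ai_used`) is exactly what a credentialled file makes detectable.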
How C2PA Applies to Your Practice:
Painters and makers of physical work
C2PA applies from the moment you document your work digitally. When you photograph or scan a painting, you can attach Content Credentials at that point, recording your authorship, the date, and the device or software used. If you edit or colour-correct in Photoshop or Lightroom before sending images to a gallery, publisher or licensing agency, you can embed credentials at export. This creates a verifiable chain between the physical work and its digital representation, which matters when images of your work circulate online without context.
Photographers
This is where C2PA is most developed. Cameras from Leica, Sony and Nikon already support it natively, meaning credentials can be signed at the moment of capture. Adobe Lightroom and Photoshop allow you to apply and export Content Credentials with your images. If you license your work, this creates a clear record from shutter to client.
Film and video artists
Sony has released the first camcorder with C2PA support. For artists working with moving image, particularly in documentary or archival contexts, credentials can now travel with video files, recording what was captured, when, and what was edited in post.
Sound artists and musicians
As voice cloning and AI-generated audio become more common, attaching provenance data to recordings is becoming a meaningful protection, particularly for artists whose voice or sonic identity is central to their practice.
Graphic designers and those with commercial practice
Content Credentials can be embedded in documents and exported files, including PDFs. For designers working with brands, advertising agencies or editorial clients, being able to demonstrate authorship and a clear edit history is increasingly relevant, both for protecting your work and for meeting the transparency expectations of clients. Major organisations including the Associated Press, Reuters and the BBC are already part of the C2PA ecosystem, and this is likely to filter into commissioning and delivery expectations.
Artists working with AI tools
If generative AI is part of your practice, Content Credentials allow you to be transparent about what was human-made and what wasn't. This is something publishers and licensing agreements are increasingly asking about, and I don't think it will be long before galleries and commissioners do too. With 28% of galleries (according to the Artsy survey) describing AI art as an evolving category whose market value remains unclear, being able to document exactly how your work was made puts you in a much stronger position. Transparency is massively important in this context.
C2PA is growing but it isn't everywhere yet. Some platforms, including Instagram, don't yet display credentials. Screenshots and re-saves through unsupported software can break the record. The small CR symbol you might now see on LinkedIn means a file has provenance data attached. It does not mean the content is AI-generated, which is a common misconception worth clearing up.
I'm still learning about this myself, so I don't have all the answers right now. However, the resources below will take you further:
contentauthenticity.org — Adobe's Content Authenticity Initiative.
contentcredentials.org/verify — Free tool to check any image's Content Credentials.
c2pa.org — The official C2PA site.
c2pa.wiki — Community-built resource guide.
If you've made it to this part — thank you for reading. I hope there's something in here that resonates or encourages you to keep going. And if you know someone who might benefit from these newsletters or sessions, feel free to forward this on and encourage them to subscribe.