FAIA – Fair AI Attribution
The FAIA project (Fair AI Attribution) develops an open, structured framework for disclosing the role of artificial intelligence in content creation. FAIA is developed in collaboration with partner organisations and addresses a growing need for transparency in digital publishing and research.
As AI tools become more common in writing, publishing, and media production, transparency is no longer optional: it is essential for trust, credibility, and compliance. With the increasing integration of generative AI tools, ranging from writing assistants to full content generators, creators, publishers, and academic researchers need a consistent and verifiable way to indicate whether and how AI has contributed to a piece of content. FAIA provides that mechanism.
Implemented as a plugin in the Liccium software, FAIA allows creators to document and share how AI contributed to their work.
FAIA enables creators and rightsholders to:
Disclose how AI was involved in the creation or editing of content.
Align with regulatory requirements like the EU AI Act (Art. 50).
Promote provenance, reproducibility, and trust in digital publishing.
The framework is implemented as a Liccium plugin, allowing users to flag AI involvement directly within Liccium's declaration and signing interface. These flags are machine-readable, interoperable, and persistently linked to the content.
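The exact schema of these machine-readable flags is defined by the plugin. Purely as an illustration, a declaration could be represented as a small structured record along the following lines; all field names and values here are hypothetical sketches, not the actual FAIA schema:

```python
# Illustrative sketch of a machine-readable AI-involvement declaration.
# Field names and values are hypothetical; the actual FAIA schema may differ.
from datetime import datetime, timezone

declaration = {
    "iscc": "<ISCC code of the content>",       # placeholder content identifier
    "ai_involvement": "compositeWithTrainedAlgorithmicMedia",  # controlled-vocabulary term
    "tool": {"name": "ChatGPT-4", "role": "summarization"},    # optional usage context
    "declared_at": datetime.now(timezone.utc).isoformat(),     # declaration timestamp
    "declared_by": "did:example:creator-123",   # hypothetical creator identifier
}
```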
This short video introduces the FAIA initiative. It highlights the problem of AI opacity, emerging regulation like the EU AI Act, and a practical solution for certifying content with verifiable AI involvement.
FAIA introduces a controlled vocabulary for AI contribution, modelled after standards such as the IPTC Digital Source Type, which can be used for visual content, and aligned with proposals such as the STM Association's classification system.
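As a rough illustration of what such a controlled vocabulary looks like, the sketch below enumerates a few terms borrowed from the IPTC Digital Source Type vocabulary; the terms FAIA actually adopts may differ:

```python
# Illustrative controlled vocabulary, loosely based on IPTC Digital Source Type terms.
# The actual FAIA vocabulary may use different or additional terms.
from enum import Enum

class AIContribution(Enum):
    DIGITAL_CAPTURE = "digitalCapture"                    # no AI involvement
    ALGORITHMICALLY_ENHANCED = "algorithmicallyEnhanced"  # AI-assisted enhancement of human work
    COMPOSITE_WITH_TRAINED_ALGORITHMIC_MEDIA = "compositeWithTrainedAlgorithmicMedia"  # human work with AI-generated elements
    TRAINED_ALGORITHMIC_MEDIA = "trainedAlgorithmicMedia" # fully AI-generated from a trained model
```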
Using the FAIA plugin, users can:
Select AI involvement types during the declaration process.
Optionally provide further details, such as the tool version and usage context (e.g., "ChatGPT-4, used for summarization").
Bind this information cryptographically to the content's ISCC code, ensuring it travels with the media.
All declarations are verifiable, timestamped, and publicly resolvable via Liccium registries.
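How the binding and signing works internally is up to the Liccium implementation and is not specified here. Purely as a sketch of the general idea, a declaration could be serialised together with the content's ISCC code, hashed, and signed; the example below uses an Ed25519 key from the `cryptography` package for illustration only, and all function and field names are assumptions:

```python
# Sketch only: binds a declaration to a content identifier by signing both together.
# This is NOT the actual Liccium/FAIA signing scheme; Ed25519 via the "cryptography"
# package is used purely to illustrate the concept of a verifiable, signed declaration.
import json
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def bind_declaration(iscc_code: str, declaration: dict, key: Ed25519PrivateKey) -> dict:
    """Serialise the declaration with the ISCC code, hash it, and sign the payload."""
    payload = json.dumps({"iscc": iscc_code, "declaration": declaration}, sort_keys=True).encode()
    return {
        "iscc": iscc_code,
        "declaration": declaration,
        "sha256": hashlib.sha256(payload).hexdigest(),
        "signature": key.sign(payload).hex(),
    }

# Example usage with a freshly generated key (in practice the creator's own key would be used).
key = Ed25519PrivateKey.generate()
record = bind_declaration(
    "<ISCC code of the content>",
    {"ai_involvement": "compositeWithTrainedAlgorithmicMedia", "tool": "ChatGPT-4"},
    key,
)

# Anyone holding the public key can verify that the declaration has not been altered
# (verify() raises an exception if the signature does not match the payload).
payload = json.dumps({"iscc": record["iscc"], "declaration": record["declaration"]}, sort_keys=True).encode()
key.public_key().verify(bytes.fromhex(record["signature"]), payload)
```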
FAIA promotes ethical publishing practices by:
Helping creators and researchers avoid misrepresentation.
Supporting disclosure requirements from journals, funding bodies, and institutions.
Enabling platforms and users to filter or verify content based on transparency metadata.
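For example, a platform that has resolved a set of declarations could filter or flag content by the disclosed involvement type. A minimal sketch, reusing the hypothetical record fields from the earlier examples:

```python
# Minimal sketch: filtering resolved declarations by disclosed AI involvement.
# Record fields are hypothetical and match the illustrative examples above.
def fully_ai_generated(declarations: list[dict]) -> list[dict]:
    """Return declarations whose disclosed type marks the content as fully AI-generated."""
    return [d for d in declarations if d.get("ai_involvement") == "trainedAlgorithmicMedia"]

declarations = [
    {"iscc": "<ISCC A>", "ai_involvement": "trainedAlgorithmicMedia"},
    {"iscc": "<ISCC B>", "ai_involvement": "algorithmicallyEnhanced"},
]
print(fully_ai_generated(declarations))  # only the fully AI-generated record remains
```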
It is designed for cross-sector use but initially focuses on academic and trade publishing workflows.