Transparency and Explainability

Purpose: To ensure that AI systems and their outputs are understandable, traceable and open to scrutiny by those who use them, are affected by them or are responsible for their governance.

Organisational / Technical Measure
A. Openness about tool use and purpose
ORG There is clear guidance on the existence, purpose and intended use of the AI tool
ORG Information about why the tool is used and how it may affect individuals is made available in accessible non-technical language
ORG Responsibility for governance and oversight of the tool/system is clearly communicated
ORG Obligations are reviewed whenever the tool's scope or use changes
B. Accessibility of information
BOTH Information about the tool's functioning, inputs, outputs and safeguards is accessible to all relevant users
ORG Communication channels and formats are adapted to different roles, languages, levels of digital literacy and accessibility needs
ORG Users know where to find information and whom to contact with concerns about the tool
BOTH Transparency information is communicated clearly in non-technical language, without overwhelming the reader
C. Explainability of outputs and decisions
TECH The system provides explanations of its outputs at a level appropriate to their context of use
BOTH Explanations enable users to understand key factors, limitations and uncertainty associated with outputs
ORG Explanations support meaningful review, contestation or justification of AI-supported decisions
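The explainability items above can be made concrete in code. The sketch below is a minimal, hypothetical illustration (not part of the source checklist) of packaging a model output together with the key factors, uncertainty and limitations that the TECH and BOTH items call for; the toy linear scorer, feature names and weights are assumptions chosen so that per-feature contributions are exact.

```python
# Hypothetical sketch: an AI output bundled with the explanation fields
# (key factors, uncertainty, limitations) that section C asks for.
# The model, features and weights below are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ExplainedOutput:
    prediction: float
    key_factors: dict          # per-feature contribution to the score
    uncertainty: float         # e.g. width of a confidence interval
    limitations: list = field(default_factory=list)

def predict_with_explanation(features: dict) -> ExplainedOutput:
    # Toy linear scorer: each feature's contribution is weight * value,
    # so the "key factors" reported to the user are exact, not approximated.
    weights = {"income": 0.4, "tenure_years": 0.2, "missed_payments": -0.6}
    contributions = {name: weights[name] * features[name] for name in weights}
    return ExplainedOutput(
        prediction=sum(contributions.values()),
        key_factors=contributions,
        uncertainty=0.1,  # placeholder; a real system would estimate this
        limitations=["Illustrative model only; not trained on real data"],
    )

out = predict_with_explanation(
    {"income": 1.0, "tenure_years": 2.0, "missed_payments": 1.0}
)
print(out.prediction)   # contributions sum to approximately 0.2
print(out.key_factors)
```

Returning the explanation alongside the prediction, rather than as a separate lookup, is what lets a reviewer contest or justify an individual AI-supported decision (the last ORG item) without re-running the system.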

Source: AIOLIA deliverable 3.1