Chapter 16

Annex A Controls: Information for Interested Parties (A.8)

Detailed guidance on implementing Annex A controls for AI transparency and communication (A.8), covering stakeholder communication, user documentation, AI interaction disclosure, and explainability across four controls.

15 min read

Chapter Overview

This chapter covers the Information for Interested Parties domain (A.8), which ensures organizations communicate appropriately about AI systems to stakeholders. It addresses transparency, user documentation, AI interaction disclosure, and explainability, and contains four controls.

A.8 Information for Interested Parties of AI Systems

Transparency and communication are essential for building trust in AI systems.

Why Transparency Matters

Stakeholders need information about AI systems to:
• Make informed decisions
• Understand how they're affected
• Exercise their rights
• Trust the organization
• Comply with regulations (the EU AI Act requires transparency)

A.8.2 Communication of Information to Interested Parties

Attribute | Details
Control | Relevant information about AI systems shall be communicated to interested parties.
Purpose | Ensure stakeholders have necessary information
Related Clause | 7.4 (Communication)

Implementation Guidance

  • Identify information needs of different stakeholders
  • Determine appropriate communication channels
  • Develop communication materials
  • Ensure accessibility of information
  • Maintain communication records
  • Update communications when AI systems change

Stakeholder Information Needs

Stakeholder | Information Needs | Channels
Users | How to use, limitations, support contacts | User guides, help systems, training
AI Subjects | That AI is being used, how decisions are made, rights | Notices, disclosures, privacy policies
Regulators | Compliance information, technical details | Reports, documentation, audit access
Customers | AI capabilities, limitations, data practices | Contracts, documentation, websites
Public | AI governance approach, responsible AI commitment | Website, annual reports, press

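
A stakeholder information matrix like the one above can be kept as structured data, so that communications stay consistent and traceable across teams. A minimal sketch, assuming a simple in-code register (the names and fields are illustrative, not prescribed by the standard):

```python
# Hypothetical stakeholder-communication register mirroring the matrix above.
STAKEHOLDER_COMMS = {
    "users": {
        "needs": ["how to use", "limitations", "support contacts"],
        "channels": ["user guides", "help systems", "training"],
    },
    "ai subjects": {
        "needs": ["AI is being used", "how decisions are made", "rights"],
        "channels": ["notices", "disclosures", "privacy policies"],
    },
    "regulators": {
        "needs": ["compliance information", "technical details"],
        "channels": ["reports", "documentation", "audit access"],
    },
}

def channels_for(stakeholder: str) -> list[str]:
    """Look up the agreed communication channels for a stakeholder group."""
    entry = STAKEHOLDER_COMMS.get(stakeholder.lower())
    return entry["channels"] if entry else []
```

In practice this register would live in a document or configuration store rather than code; the point is that each stakeholder group maps to defined needs and channels, which directly supports the "identify information needs" and "maintain communication records" guidance above.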
Audit Questions - A.8.2

• What information do you communicate about AI systems?
• How do you identify stakeholder information needs?
• Show me communications for [specific stakeholder group]
• How do you ensure communications are accessible?
• How do you update communications when AI systems change?

A.8.3 User Documentation

Attribute | Details
Control | Documentation for users shall be provided according to defined requirements.
Purpose | Enable effective and appropriate use of AI systems
Related Clause | 7.5 (Documented information)

Implementation Guidance

  • Define documentation requirements for each AI system
  • Create user-appropriate documentation
  • Include usage instructions, limitations, and warnings
  • Provide training materials where needed
  • Keep documentation current
  • Make documentation accessible

User Documentation Content

Section | Content
Purpose | What the AI system does and its intended use
Instructions | How to use the system correctly
Inputs/Outputs | What inputs are needed, what outputs are produced
Limitations | What the system cannot do, known constraints
Warnings | Potential risks, misuse scenarios to avoid
Support | How to get help, report issues
Updates | How users are notified of changes

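
A completeness check against the sections above can flag missing documentation before an AI system ships. A sketch, assuming documentation drafts are represented as a section-to-text mapping (the function and representation are hypothetical):

```python
# Required user-documentation sections, per the content table above.
REQUIRED_SECTIONS = [
    "purpose", "instructions", "inputs/outputs",
    "limitations", "warnings", "support", "updates",
]

def missing_sections(doc_sections: dict[str, str]) -> list[str]:
    """Return required sections that are absent or empty in a documentation draft."""
    present = {name.lower() for name, body in doc_sections.items() if body.strip()}
    return [s for s in REQUIRED_SECTIONS if s not in present]
```

A draft containing only a filled-in Purpose section would be flagged for the remaining six sections; a draft with all seven sections populated returns an empty list.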
Documentation Best Practices

Effective user documentation:
• Is written for the target audience (not overly technical)
• Includes practical examples
• Clearly states limitations
• Is easy to find and access
• Is kept up to date
• Is available in appropriate languages

Audit Questions - A.8.3

• What user documentation exists for AI systems?
• Show me documentation for [specific AI system]
• How do you determine documentation requirements?
• How do you keep documentation current?
• How do users access documentation?

A.8.4 Information Regarding AI Interaction

Attribute | Details
Control | Interested parties interacting with or subject to AI system decisions shall be informed of their interaction with the AI system, as appropriate.
Purpose | Ensure people know when AI is involved
Related Clause | 7.4 (Communication)

Implementation Guidance

  • Identify where AI interaction disclosure is needed
  • Determine appropriate disclosure methods
  • Implement disclosure mechanisms
  • Ensure disclosures are clear and understandable
  • Document disclosure practices
  • Monitor compliance with disclosure requirements

AI Interaction Disclosure Scenarios

Scenario | Disclosure Approach
Chatbot/Virtual Assistant | "You are chatting with an AI assistant"
Automated Decision | "This decision was made using automated processing"
AI-Generated Content | Label indicating AI generation
AI Recommendation | "This recommendation is AI-generated"
Biometric AI | Notice of AI-based biometric processing

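
Keeping approved disclosure wording in a single source of truth helps every interface use the same text and makes unapproved scenarios fail loudly. A minimal sketch, assuming the scenario keys and wording above (both are illustrative):

```python
# Approved disclosure texts keyed by interaction scenario (illustrative wording).
DISCLOSURES = {
    "chatbot": "You are chatting with an AI assistant.",
    "automated_decision": "This decision was made using automated processing.",
    "generated_content": "This content was generated by AI.",
    "recommendation": "This recommendation is AI-generated.",
}

def disclosure_for(scenario: str) -> str:
    """Return the approved disclosure text, refusing unknown scenarios.

    Failing loudly is deliberate: silently showing no notice is exactly
    the compliance gap this control is meant to prevent.
    """
    try:
        return DISCLOSURES[scenario]
    except KeyError:
        raise ValueError(f"No approved disclosure for scenario: {scenario!r}")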
Regulatory Requirements

The EU AI Act requires disclosure when people are:
• Interacting with AI systems (e.g., chatbots)
• Subject to emotion recognition or biometric categorization
• Exposed to AI-generated/manipulated content (deepfakes)

Disclosure must be "clear and distinguishable" at the point of interaction.

Audit Questions - A.8.4

• How do you inform people when they interact with AI?
• Show me AI interaction disclosures
• What triggers the need for disclosure?
• How do you ensure disclosures are clear?
• How do you handle regulatory disclosure requirements?

A.8.5 Information for Achieving Explainability

Attribute | Details
Control | Information for achieving explainability of AI system outputs and the AI system's functioning shall be documented according to defined requirements.
Purpose | Enable understanding of how AI systems work and make decisions
Related Clause | 7.5 (Documented information)

Implementation Guidance

  • Define explainability requirements for each AI system
  • Determine appropriate level of explanation
  • Implement explainability mechanisms
  • Document how explanations are generated
  • Validate explanations are accurate and useful
  • Make explanations accessible to intended audience

Levels of Explainability

Level | Description | Audience
System-level | Overall system purpose and approach | General stakeholders
Model-level | How the model works, key factors | Technical reviewers
Decision-level | Why a specific output was produced | Users, affected individuals
Feature-level | Which inputs influenced the output | Technical users, auditors

Explainability Techniques

Technique | Description | Use Case
Feature Importance | Rank inputs by influence on output | Understanding key drivers
SHAP Values | Contribution of each feature to a prediction | Individual decision explanation
LIME | Local interpretable model-agnostic explanations | Explaining specific predictions
Counterfactuals | What would change the outcome | Actionable insights
Decision Trees | Rule-based explanation extraction | Simplified logic representation
Attention Maps | What the model focused on | Image/text model explanations

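
To make "feature importance" concrete: for a simple linear model, each feature's global influence can be approximated as |weight| scaled by the feature's spread, since a large weight on a feature that barely varies contributes little to the output. A self-contained sketch (the model, weights, and data are illustrative, not a production explainer):

```python
import statistics

def feature_importance(weights, X):
    """Rank features of a linear model by |weight| x feature standard deviation.

    Returns [(feature_index, importance), ...] sorted most influential first.
    """
    columns = list(zip(*X))  # transpose rows -> per-feature columns
    scores = [abs(w) * statistics.pstdev(col) for w, col in zip(weights, columns)]
    return sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True)

# Illustrative data: feature 1 has the larger weight but almost no variance,
# so feature 0 ranks first despite its smaller weight.
X = [[1.0, 5.0], [3.0, 5.1], [5.0, 4.9], [7.0, 5.0]]
weights = [0.5, 3.0]
print(feature_importance(weights, X))
```

Techniques such as SHAP and LIME generalize this idea to non-linear models by attributing each individual prediction to its input features; the documentation requirements above apply regardless of which technique is chosen.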
Explainability Documentation

Document for each AI system:
• Explainability requirements (who needs what level)
• Explainability approach and techniques used
• Limitations of explanations
• How explanations are generated
• How explanations are presented to users
• Validation of explanation accuracy

Audit Questions - A.8.5

• What are your explainability requirements?
• How do you explain AI decisions?
• Show me explanation mechanisms for [AI system]
• How do you validate explanations are accurate?
• What explainability techniques do you use?
• How do you document explainability?

Control Implementation Summary

Control | Key Evidence | Common Gaps
A.8.2 Communication | Communication materials, stakeholder analysis | Ad-hoc communication only
A.8.3 User Documentation | User guides, help systems, training materials | Incomplete or outdated documentation
A.8.4 AI Interaction | Disclosure notices, labeling mechanisms | No disclosure when AI is used
A.8.5 Explainability | Explainability documentation, technique evidence | Black-box systems with no explanations

Key Takeaways - A.8

1. Communication should be tailored to different stakeholder groups
2. User documentation must include limitations and warnings
3. People must be informed when interacting with AI (regulatory requirement)
4. Explainability requirements vary by audience and context
5. Explanations must be documented, accurate, and accessible
6. Transparency builds trust and supports compliance
