Chapter 8

Clause 8: Operation

Operational planning and control, AI risk assessment execution, risk treatment implementation, and AI system impact assessment.

Chapter Overview

Clause 8 is where the AIMS becomes operational. It covers the execution of risk assessments, the implementation of risk treatments, and, critically, the AI system impact assessment (8.4), a requirement unique to ISO 42001 and not found in other management system standards.

Clause Structure

| Sub-clause | Title | Focus |
| --- | --- | --- |
| 8.1 | Operational planning and control | Planning and controlling AIMS processes |
| 8.2 | AI risk assessment | Conducting risk assessments |
| 8.3 | AI risk treatment | Implementing risk treatment plans |
| 8.4 | AI system impact assessment | Assessing impacts on individuals and society |

8.1 Operational Planning and Control

Requirement

The organization shall plan, implement, and control the processes needed to meet requirements by:

  • Establishing criteria for the processes
  • Implementing control of the processes in accordance with the criteria
  • Keeping documented information to have confidence that processes have been carried out as planned

The organization shall control planned changes and review the consequences of unintended changes, taking action to mitigate any adverse effects.

The organization shall ensure that outsourced processes are controlled.

Key Concept

Operational control means having defined processes with clear criteria, implementing them consistently, and maintaining evidence that they work as intended. This applies to all AI-related activities within your AIMS scope.

Operational Controls for AI

| Process Area | Control Examples |
| --- | --- |
| AI Development | Development standards, code review, testing requirements |
| Data Management | Data quality checks, provenance tracking, bias testing |
| Model Training | Training protocols, validation procedures, documentation |
| Deployment | Approval gates, deployment checklists, rollback procedures |
| Monitoring | Performance metrics, drift detection, alert thresholds |
| Change Management | Change requests, impact assessment, approval workflow |
| Incident Response | Incident procedures, escalation paths, communication plans |
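
The monitoring row above (performance metrics, drift detection, alert thresholds) can be sketched as a minimal control check. This is an illustrative example, not a prescribed implementation; the metric name and the 5-point threshold are assumptions chosen for the sketch.

```python
# Illustrative operational monitoring control: flag performance drift when a
# model's recent accuracy drops more than an alert threshold below baseline.
# The threshold value is an assumption for this sketch, not an ISO 42001 figure.

def check_drift(baseline_accuracy: float, recent_accuracy: float,
                alert_threshold: float = 0.05) -> bool:
    """Return True when accuracy has degraded beyond the alert threshold."""
    return (baseline_accuracy - recent_accuracy) > alert_threshold

# A 7-point drop against a 5-point threshold would trigger an alert.
drift_alert = check_drift(baseline_accuracy=0.92, recent_accuracy=0.85)
```

In practice the alert would feed the incident response and change management controls listed in the same table, so that drift leads to a documented, approved corrective action rather than an ad hoc fix.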

Outsourced Processes

If you outsource AI-related processes (cloud AI services, model development, data labeling), you must:

  • Define requirements for outsourced processes
  • Include AI governance requirements in contracts
  • Monitor supplier performance
  • Retain accountability for outsourced activities

8.2 AI Risk Assessment

Requirement

The organization shall perform AI risk assessments at planned intervals, or when significant changes are proposed or occur, taking account of the criteria established in 6.1.2.

The organization shall retain documented information of the results of the AI risk assessments.

When to Conduct Risk Assessments

  • Planned intervals: annual review of all AI systems
  • New AI systems: before deployment approval
  • Significant changes: model updates, new data sources, expanded use cases
  • Incidents: after AI-related incidents or near-misses
  • External changes: new regulations, technology changes

Risk Assessment Execution

| Step | Activities | Outputs |
| --- | --- | --- |
| 1. Preparation | Define scope, gather information, assemble team | Assessment plan |
| 2. Risk Identification | Identify risks across all categories and lifecycle stages | Risk list |
| 3. Risk Analysis | Assess likelihood and consequence for each risk | Risk ratings |
| 4. Risk Evaluation | Compare against criteria, prioritize risks | Prioritized risk register |
| 5. Documentation | Record results, recommendations, decisions | Risk assessment report |
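
Steps 3 and 4 above can be sketched in a few lines: score each risk, compare against acceptance criteria, and sort into a prioritized register. The 1-5 scales and the acceptance threshold are illustrative assumptions; your criteria come from 6.1.2, not from this sketch.

```python
# Illustrative risk analysis and evaluation: rating = likelihood x consequence
# on assumed 1-5 scales; risks at or below the acceptance threshold are
# tentatively acceptable. Threshold and scales are assumptions for the sketch.

def prioritize(risks: list[dict], acceptance_threshold: int = 6) -> list[dict]:
    """Score each risk and return the list sorted highest-rating first."""
    for risk in risks:
        risk["rating"] = risk["likelihood"] * risk["consequence"]
        risk["acceptable"] = risk["rating"] <= acceptance_threshold
    return sorted(risks, key=lambda r: r["rating"], reverse=True)

register = prioritize([
    {"risk": "training data bias", "likelihood": 4, "consequence": 4},
    {"risk": "model drift", "likelihood": 3, "consequence": 2},
])
```

The sorted output is the prioritized risk register of step 4; recording it, with the rationale for each rating, satisfies the documented-information requirement of step 5.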

AI System Lifecycle Risk Considerations

| Lifecycle Stage | Risk Considerations |
| --- | --- |
| Design | Requirements gaps, ethical considerations, feasibility |
| Data Collection | Bias, privacy, consent, representativeness |
| Development | Model selection, training issues, security vulnerabilities |
| Testing | Inadequate testing, edge cases, adversarial inputs |
| Deployment | Integration issues, user readiness, rollback capability |
| Operation | Performance drift, misuse, unintended consequences |
| Monitoring | Detection gaps, alert fatigue, response delays |
| Retirement | Data retention, knowledge loss, transition risks |

8.3 AI Risk Treatment

Requirement

The organization shall implement the AI risk treatment plan.

The organization shall retain documented information of the results of the AI risk treatment.

Risk Treatment Implementation

This clause requires you to actually implement the controls and actions identified in your risk treatment plan (created under 6.1.3). It is not enough to have a plan; you must execute it and document the results.

Treatment Implementation Process

  1. Review approved risk treatment plan
  2. Assign implementation responsibilities
  3. Allocate resources (budget, personnel, tools)
  4. Implement controls according to plan
  5. Document implementation evidence
  6. Verify control effectiveness
  7. Update risk register with residual risk levels
  8. Report to risk owners

Evidence of Risk Treatment

| Treatment Type | Evidence Examples |
| --- | --- |
| Technical controls | System configurations, test results, monitoring dashboards |
| Process controls | Procedures, work instructions, process records |
| Training | Training records, competence assessments |
| Documentation | Policies, guidelines, user documentation |
| Third-party controls | Contracts, SLAs, audit reports |

8.4 AI System Impact Assessment

Requirement

The organization shall conduct an AI system impact assessment for AI systems, taking into account the potential consequences of the AI system for individuals, groups of individuals, and societies.

The organization shall retain documented information of the results of the AI system impact assessment.

Critical: Unique ISO 42001 Requirement

Clause 8.4 is unique to ISO 42001; it is not found in ISO 27001 or other management system standards. It requires assessing impacts beyond traditional risk assessment, focusing specifically on how AI systems affect people and society.

Impact Assessment vs Risk Assessment

| Aspect | Risk Assessment (8.2) | Impact Assessment (8.4) |
| --- | --- | --- |
| Focus | Risks to organization and AIMS | Impacts on individuals and society |
| Perspective | Organization-centric | Human-centric |
| Scope | All risk categories | Consequences for people |
| Output | Risk register, treatment plan | Impact assessment report |

Impact Categories

Impacts on Individuals

| Category | Examples |
| --- | --- |
| Rights & Freedoms | Privacy, freedom of expression, non-discrimination |
| Safety & Health | Physical safety, mental health, wellbeing |
| Economic | Employment, financial decisions, access to services |
| Autonomy | Decision-making ability, informed consent, choice |
| Dignity | Respect, fair treatment, human oversight |

Impacts on Society

| Category | Examples |
| --- | --- |
| Social | Social cohesion, inequality, discrimination at scale |
| Economic | Labor market disruption, economic concentration |
| Democratic | Misinformation, manipulation, surveillance |
| Environmental | Energy consumption, resource use, e-waste |
| Cultural | Cultural bias, homogenization, loss of diversity |

Impact Assessment Process

  1. Identify AI System: Define the AI system being assessed
  2. Describe Functionality: What does the AI system do?
  3. Identify Affected Parties: Who is affected by the AI system?
  4. Assess Individual Impacts: Evaluate impacts on individuals
  5. Assess Societal Impacts: Evaluate broader societal impacts
  6. Evaluate Severity: Rate the severity of identified impacts
  7. Identify Mitigations: Define measures to reduce negative impacts
  8. Document Results: Create impact assessment report
  9. Review and Approve: Obtain appropriate approvals
  10. Monitor: Ongoing monitoring of actual impacts
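
Steps 6 and 7 above can be sketched as a scoring rule that maps the worst residual impact to a go/no-go recommendation for the report. The 1-5 scales and the decision cut-offs are illustrative assumptions; the standard does not prescribe specific values.

```python
# Illustrative severity evaluation: score each impact as likelihood x severity
# on assumed 1-5 scales and derive the report's recommendation from the worst
# score. Cut-off values are assumptions for this sketch.

def recommend(impacts: list[dict]) -> str:
    """Map the worst impact score to one of the template's recommendations."""
    worst = max(i["likelihood"] * i["severity"] for i in impacts)
    if worst >= 20:
        return "do not proceed"
    if worst >= 10:
        return "proceed with conditions"
    return "proceed"

impacts = [
    {"category": "privacy", "likelihood": 2, "severity": 4},
    {"category": "non-discrimination", "likelihood": 3, "severity": 4},
]
decision = recommend(impacts)
```

A "proceed with conditions" outcome would then be paired with the mitigation measures identified in step 7 and carried into the approval and monitoring steps.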

Template: AI System Impact Assessment

1. AI SYSTEM IDENTIFICATION
• System Name:
• System ID:
• Business Owner:
• Assessment Date:
• Assessor(s):

2. SYSTEM DESCRIPTION
• Purpose and objectives:
• AI type (ML, NLP, computer vision, etc.):
• Input data:
• Outputs and decisions:
• Autonomy level:
• Scale of use:

3. AFFECTED PARTIES
• Direct users:
• Subjects of AI decisions:
• Indirectly affected groups:
• Vulnerable groups:

4. INDIVIDUAL IMPACT ASSESSMENT
For each impact category, assess: Description, Likelihood (1-5), Severity (1-5), Affected Groups, Existing Mitigations, Residual Impact Level

5. SOCIETAL IMPACT ASSESSMENT
For each societal category, assess: Description, Scale, Severity, Existing Mitigations, Residual Impact Level

6. MITIGATION MEASURES
• Additional controls required:
• Human oversight measures:
• Transparency measures:
• Monitoring requirements:

7. CONCLUSIONS
• Overall impact rating:
• Recommendation (proceed/proceed with conditions/do not proceed):
• Conditions or requirements:

8. APPROVAL
• Assessed by: [Name, Date]
• Reviewed by: [Name, Date]
• Approved by: [Name, Date]
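
One way to make completed assessments storable and queryable is to mirror the template above as a structured record. The field names here follow the template's sections but are otherwise assumptions for illustration, as are the example values.

```python
# Illustrative structured record mirroring the impact assessment template.
# Field names and example values are assumptions for this sketch.

from dataclasses import dataclass, field

@dataclass
class ImpactAssessment:
    system_name: str
    system_id: str
    business_owner: str
    affected_parties: list[str] = field(default_factory=list)   # section 3
    individual_impacts: list[dict] = field(default_factory=list)  # section 4
    societal_impacts: list[dict] = field(default_factory=list)    # section 5
    mitigations: list[str] = field(default_factory=list)          # section 6
    overall_rating: str = ""                                      # section 7
    recommendation: str = ""                                      # section 7

assessment = ImpactAssessment("Loan Scoring Model", "AI-007", "Head of Credit")
assessment.recommendation = "proceed with conditions"
```

Storing assessments in a consistent structure also makes it easier to demonstrate the retained documented information that 8.4 requires.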

Documented Information Requirements

Mandatory Documents - Clause 8

Required:
• Operational planning documentation (8.1)
• AI risk assessment results (8.2)
• AI risk treatment results (8.3)
• AI system impact assessment results (8.4)

Recommended:
• Operational procedures for AI activities
• Change management records
• Outsourced process controls
• Impact assessment methodology

Sample Audit Questions

Auditor Questions - Clause 8

8.1 Operational Planning:
• How do you control AI-related operational processes?
• What criteria have you established for AI processes?
• How do you manage changes to AI systems?
• How do you control outsourced AI processes?

8.2 AI Risk Assessment:
• Show me a completed AI risk assessment
• How often do you conduct risk assessments?
• What triggers a new risk assessment?
• How do you ensure assessments cover the full AI lifecycle?

8.3 AI Risk Treatment:
• Show me evidence that risk treatments have been implemented
• How do you verify that controls are effective?
• What is the current status of your risk treatment plan?

8.4 AI Impact Assessment:
• Show me an AI system impact assessment
• How do you assess impacts on individuals?
• How do you consider societal impacts?
• Who approves impact assessments?
• How do you handle AI systems with significant negative impacts?

Common Nonconformities

| Type | Nonconformity | How to Avoid |
| --- | --- | --- |
| Major | No AI system impact assessments conducted | Implement impact assessment process |
| Major | Risk assessments not covering AI lifecycle | Use lifecycle-based assessment approach |
| Major | No evidence of risk treatment implementation | Document implementation evidence |
| Minor | Impact assessments missing societal considerations | Include societal impact categories |
| Minor | Operational criteria not documented | Document process criteria |
| Minor | Outsourced processes not controlled | Include AI requirements in contracts |
| Minor | Changes not systematically managed | Implement change management process |

Key Takeaways - Clause 8

1. Operational control requires defined criteria, implementation, and evidence
2. Risk assessments must be conducted at planned intervals and when changes occur
3. Risk treatment plans must be implemented and documented
4. AI system impact assessment (8.4) is unique to ISO 42001
5. Impact assessment focuses on individuals and society, not just organizational risk
6. Outsourced AI processes must be controlled

Exam Tips - Clause 8

• Know that 8.4 (AI system impact assessment) is unique to ISO 42001
• Understand the difference between risk assessment and impact assessment
• Remember that risk assessments must cover the entire AI lifecycle
• Know that outsourced processes must be controlled
• Impact assessment considers individuals AND society
• All four sub-clauses require documented information
