The global skills and competency framework for the digital world

Applying SFIA skills to AI governance and assurance capabilities

This guidance keeps the generic SFIA meaning intact, shows how each skill applies in AI governance practice, and helps AI governance specialists and AI engineering/operations teams recognise their work using SFIA as a common language.

    Using SFIA to describe AI governance and assurance capabilities


    Governance (GOVN)

    SFIA definition (generic)
    Defining and operating frameworks for decision-making, risk management, stakeholder relationships and compliance with organisational and regulatory obligations.

    Illustration of how this looks in AI governance

    • defining the organisation’s AI governance framework aligned to regulatory obligations (e.g. high-risk AI classification rules)
    • establishing AI decision forums and clarifying accountability between engineering, legal and risk
    • determining what constitutes acceptable AI use under organisational values
    • setting reporting structures for AI incidents and model risk
    • acting as the primary contact for regulators on AI governance matters

    This is not simply “writing a policy”.
    It is about shaping how the organisation approaches AI decision-making and authority.


    Risk management (BURM)

    SFIA definition (generic)
    Planning and implementing processes for managing risk across the enterprise, aligned with organisational strategy and governance frameworks.

    Illustration of how this looks in AI governance

    • defining how AI model risks are identified, categorised and prioritised
    • integrating bias risk, drift risk and misuse risk into enterprise risk frameworks
    • establishing risk thresholds aligned to board-approved risk appetite
    • ensuring risk reporting from model monitoring feeds into enterprise dashboards
    • advising executives when AI risk exposure exceeds defined tolerance

    This helps connect AI-specific risks to enterprise risk language.


    Quality assurance (QUAS)

    SFIA definition (generic)
    Assuring, through ongoing and periodic assessments and reviews, that the organisation’s quality objectives are being met.

    Illustration of how this looks in AI governance

    • reviewing whether AI projects have completed required impact assessments
    • examining evidence that testing, validation and monitoring controls have occurred
    • identifying non-conformances against AI governance standards
    • ensuring corrective actions are assigned and tracked
    • reporting systemic quality weaknesses in AI control implementation

    This can help to provide confidence that governance is being followed in practice.


    Quality management (QUMG)

    SFIA definition (generic)
    Defining and operating a management framework of processes and working practices to deliver the organisation's quality objectives.

    Illustration of how this looks in AI governance

    • establishing an AI governance management system
    • defining metrics for governance performance (e.g. assessment coverage, incident resolution times)
    • embedding AI assurance checkpoints into lifecycle processes
    • aligning AI governance standards with ISO, sector standards or regulatory frameworks
    • continuously improving governance processes based on audit findings

    This is about building a repeatable system, rather than only reviewing outputs.
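    The governance performance metrics mentioned above can be made concrete. The following is a minimal sketch, assuming hypothetical record shapes for projects and incidents; the field names are invented for illustration and do not come from SFIA:

```python
from datetime import date

# Hypothetical sketch: computing two governance performance metrics --
# assessment coverage and incident resolution time. Data shapes and
# field names are illustrative assumptions.

projects = [
    {"name": "credit-scoring", "impact_assessment_done": True},
    {"name": "chatbot", "impact_assessment_done": False},
    {"name": "forecasting", "impact_assessment_done": True},
]

incidents = [
    {"opened": date(2024, 3, 1), "resolved": date(2024, 3, 4)},
    {"opened": date(2024, 3, 10), "resolved": date(2024, 3, 11)},
]

def assessment_coverage(projects):
    """Share of AI projects with a completed impact assessment."""
    done = sum(1 for p in projects if p["impact_assessment_done"])
    return done / len(projects)

def mean_resolution_days(incidents):
    """Average days from incident opening to resolution."""
    total = sum((i["resolved"] - i["opened"]).days for i in incidents)
    return total / len(incidents)

print(f"coverage: {assessment_coverage(projects):.0%}")              # 67%
print(f"mean resolution: {mean_resolution_days(incidents):.1f} days")  # 2.0 days
```

    Even a simple calculation like this makes "continuously improving governance processes" measurable rather than anecdotal.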


    AI and data ethics (AIDE)

    SFIA definition (generic)
    Implementing and promoting ethical practices in the design, development, deployment and use of AI and data technologies.

    Illustration of how this looks in AI governance

    • conducting or overseeing AI ethical impact assessments
    • defining fairness, transparency and accountability criteria
    • advising engineering teams on ethical trade-offs
    • responding to AI ethical incidents
    • setting direction for responsible AI principles aligned to industry standards

    This helps organisations consider how AI governance aligns with both compliance obligations and organisational values.


    Information and data compliance (PEDP)

    SFIA definition (generic)
    Implementing and promoting compliance with information and data management legislation.

    Illustration of how this looks in AI governance

    • ensuring AI systems comply with data protection legislation
    • defining privacy-by-design expectations for model training and deployment
    • maintaining records of regulated data used in AI systems
    • investigating data-related breaches involving AI
    • acting as organisational contact for regulators regarding data use in AI

    This helps anchor discussions of AI governance in legal accountability.


    How the generic and specific combine

    Using this pattern

    • SFIA defines the professional capability
    • The AI context defines the domain application

    For example:

    • Risk management (BURM) does not become “AI risk management”
    • Governance (GOVN) does not become “AI governance”

    Instead, the professional skill is applied in an AI domain. This preserves portability.


    Why using generic SFIA skills helps employers and industry bodies

    1. Consistency across domains

    AI governance uses:

    • governance
    • risk management
    • quality assurance
    • compliance
    • ethics

    These are not new capabilities. They are established professional skills.

    Using SFIA avoids inventing fragmented AI-specific taxonomies.


    2. Clear levelling of responsibility

    SFIA levels clarify:

    • contributing to governance (Level 4)
    • defining and leading governance (Level 5)
    • shaping organisational governance strategy (Level 6–7)

    This helps employers distinguish:

    • practitioner
    • governance lead
    • executive risk owner

    3. Portability and mobility

    A governance professional can move between professional domains such as:

    • cyber security governance
    • data governance
    • AI governance
    • enterprise risk

    This is possible because the underlying SFIA skills are consistent.

    This supports career mobility and workforce planning.


    4. Regulatory defensibility

    Regulators often look for:

    • defined governance structures
    • integrated risk management
    • quality assurance mechanisms
    • compliance accountability

    SFIA already describes these capabilities clearly.

    Using generic SFIA skills can help organisations articulate:

    • professional maturity
    • structured capability
    • recognised competence definitions

    rather than relying on ad hoc AI labels.


    5. Avoiding “AI inflation”

    Not every AI governance role requires:

    • data science expertise
    • model engineering skills

    Using SFIA keeps focus on:

    • responsibilities
    • accountability
    • decision authority

    rather than technical proximity.


    Summary

    AI governance and assurance does not need to be seen as an entirely new profession.
    It is the disciplined application of established professional skills — governance, risk, quality, ethics and compliance — in an AI context.

    SFIA provides:

    • a shared language
    • consistent levelling
    • portability across sectors
    • defensible professional definitions

    The AI domain provides the context.

    Together, they can describe the role clearly without the need to invent a new taxonomy.

    SFIA supports structured discussion about roles and responsibilities, but does not replace professional judgement or determine compliance outcomes.


    Using SFIA to describe AI engineering / operations capabilities

    AI engineering and operations roles build, deploy and run AI systems reliably at scale. The SFIA skills describe the professional capabilities required — the AI context defines where and how they are applied.


    Data engineering (DENG)

    SFIA definition (generic)
    Designing, building, operationalising, securing and monitoring data pipelines, stores and real-time processing systems for scalable and reliable data management.

    Illustration of how this looks in AI engineering

    • designing training and inference data pipelines
    • building batch and real-time ingestion pipelines
    • implementing feature stores
    • enforcing data validation and integrity rules
    • embedding encryption, access control and lineage tracking
    • operationalising DataOps practices
    • monitoring pipeline health, latency and failure modes

    At higher levels:

    • defining standards for ML data environments
    • selecting tooling for secure and scalable AI data platforms
    • supporting compliance with governance and regulatory requirements

    This is a foundational component of trustworthy AI systems: without stable data pipelines, model quality can degrade.
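    As a concrete illustration of the data validation bullet above, here is a minimal, self-contained sketch; the schema and field names are invented for illustration, not a specific organisation's rules:

```python
# Hedged sketch of rule-based data validation in a pipeline. A record
# is checked against a declared schema of (type, required) rules and
# any violations are returned for the pipeline to act on.

def validate_record(record, schema):
    """Return a list of rule violations for one pipeline record."""
    errors = []
    for field, (ftype, required) in schema.items():
        if field not in record:
            if required:
                errors.append(f"missing required field: {field}")
            continue
        if not isinstance(record[field], ftype):
            errors.append(f"{field}: expected {ftype.__name__}")
    return errors

SCHEMA = {
    "customer_id": (str, True),
    "age": (int, True),
    "income": (float, False),  # optional field
}

good = {"customer_id": "c-101", "age": 42, "income": 55000.0}
bad = {"age": "forty-two"}

print(validate_record(good, SCHEMA))  # → []
print(validate_record(bad, SCHEMA))   # two violations
```

    Production pipelines typically route failing records to quarantine and raise metrics, but the core idea is the same: integrity rules are explicit and enforced before data reaches a model.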


    Machine learning (MLNG)

    SFIA definition (generic)
    Developing systems that learn from data and experience, improving performance, accuracy and adaptability in dynamic environments.

    Illustration of how this looks in AI engineering

    • selecting appropriate algorithms for business problems
    • engineering features and training models
    • validating models for performance, robustness and bias
    • implementing MLOps for deployment and retraining
    • monitoring model drift and degradation
    • diagnosing performance anomalies

    At higher levels:

    • setting organisational ML standards
    • establishing responsible ML practices
    • defining lifecycle and traceability expectations

    This goes beyond local experimentation to production-grade machine learning engineering.
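    The drift monitoring bullet above can be sketched with the population stability index (PSI), a common way to compare a feature's live distribution against its training baseline. The bin fractions and the 0.2 alert threshold are widely used conventions, not fixed rules:

```python
import math

# Hedged sketch of drift detection via the population stability index.
# PSI sums (actual - expected) * ln(actual / expected) over bins;
# larger values mean the live distribution has moved further from
# the training baseline.

def psi(expected_fracs, actual_fracs, eps=1e-6):
    """PSI between two binned distributions (fractions summing to 1)."""
    total = 0.0
    for e, a in zip(expected_fracs, actual_fracs):
        e, a = max(e, eps), max(a, eps)  # avoid log(0)
        total += (a - e) * math.log(a / e)
    return total

baseline = [0.25, 0.25, 0.25, 0.25]  # training distribution per bin
live = [0.40, 0.30, 0.20, 0.10]      # observed production distribution

score = psi(baseline, live)
if score > 0.2:  # a common "significant drift" threshold
    print(f"drift alert: PSI={score:.3f}")  # → drift alert: PSI=0.228
```

    In an MLOps setting this check would run on a schedule, with alerts feeding the retraining and incident processes described above.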


    Data science (DATS)

    SFIA definition (generic)
    Applying mathematics, statistics, data mining and predictive modelling techniques to gain insights, predict behaviours and generate value from data.

    Illustration of how this looks in AI engineering

    • forming hypotheses for predictive models
    • exploring data patterns and anomalies
    • evaluating model outputs for business impact
    • refining models based on observed performance
    • translating analytic findings into deployable model logic

    Data science provides the insight layer that feeds into ML pipelines and production systems.
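    "Evaluating model outputs for business impact" often means choosing an operating point by expected cost rather than raw accuracy. Here is a small sketch under assumed, illustrative costs and scores:

```python
# Hedged sketch: pick a classification threshold by total business
# cost. The costs, scores and labels below are invented assumptions.

COST_FALSE_POSITIVE = 5.0    # e.g. cost of an unnecessary review
COST_FALSE_NEGATIVE = 100.0  # e.g. cost of a missed fraud case

# (model_score, is_actually_fraud) pairs from a held-out sample
scored = [(0.9, True), (0.8, False), (0.6, True), (0.4, False),
          (0.3, True), (0.2, False), (0.1, False)]

def cost_at_threshold(scored, threshold):
    """Total business cost if every score >= threshold is flagged."""
    cost = 0.0
    for score, is_fraud in scored:
        flagged = score >= threshold
        if flagged and not is_fraud:
            cost += COST_FALSE_POSITIVE
        elif not flagged and is_fraud:
            cost += COST_FALSE_NEGATIVE
    return cost

# Search a few candidate thresholds for the cheapest operating point.
best = min((cost_at_threshold(scored, t), t)
           for t in (0.1, 0.3, 0.5, 0.7, 0.9))
print(best)  # → (10.0, 0.3)
```

    The point is the translation step: analytic findings become deployable model logic only once they are expressed in business terms like these.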


    Systems design (DESN)

    SFIA definition (generic)
    Designing systems to meet specified requirements and agreed systems architectures.

    • A system is a structured arrangement of components, both physical and digital, designed to work together to meet specific requirements, considering factors such as scalability, security, human interaction and adherence to organisational and regulatory standards.

    Illustration of how this looks in AI engineering

    In AI engineering, this skill treats AI systems as engineered systems rather than isolated models: operational integration is intentional, non-functional requirements are designed in rather than retrofitted, and resilience and safety are system-level concerns. Framing AI this way can help organisations articulate regulatory and engineering considerations.

    • designing end-to-end AI systems spanning data ingestion, model training, deployment, monitoring and feedback loops
    • defining how AI components interact with legacy systems and human workflows
    • designing fallback, override and human-in-the-loop mechanisms
    • specifying traceability between inputs, models, outputs and decisions
    • modelling failure modes, resilience strategies and recovery pathways
    • balancing functional performance with non-functional requirements such as reliability, explainability, privacy and safety
    • coordinating cross-domain trade-offs across software, infrastructure, security and data

    This goes beyond model design or code-level architecture, focusing on the AI system as a complete operational capability.
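    The fallback, override and human-in-the-loop bullets above can be sketched as a routing decision. The confidence floor, health check and return values here are illustrative assumptions:

```python
# Hedged sketch of human-in-the-loop routing: a decision is automated
# only when the model is healthy and confident; otherwise it goes to
# a person. Returning the reason supports the traceability bullet.

CONFIDENCE_FLOOR = 0.85  # assumed, set per use case and risk appetite

def route_decision(prediction, confidence, model_healthy):
    """Return (who acts, why) for one decision, for audit trails."""
    if not model_healthy:
        return ("human", "model unhealthy: fallback path")
    if confidence < CONFIDENCE_FLOOR:
        return ("human", f"confidence {confidence:.2f} below floor")
    return ("auto", f"automated: {prediction}")

print(route_decision("approve", 0.97, True))   # → ('auto', 'automated: approve')
print(route_decision("approve", 0.60, True))   # routed to a human reviewer
print(route_decision("approve", 0.99, False))  # health check forces fallback
```

    Designing this routing at the system level, rather than inside any one model, is what makes override and recovery pathways dependable.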

    Software design (SWDN)

    SFIA definition (generic)
    Architecting and designing software to meet specified requirements, ensuring adherence to established standards and principles.

    Illustration of how this looks in AI engineering

    • designing scalable inference services
    • architecting model-serving APIs
    • designing secure interfaces between models and downstream systems
    • evaluating architectural trade-offs (latency vs accuracy vs cost)
    • modelling failure modes and fallback behaviours
    • ensuring security and resilience by design

    AI engineering extends beyond modelling to include software architecture at scale.
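    The failure-mode and fallback bullets above can be sketched in the serving layer: if the primary model call fails or exceeds its latency budget, the service degrades to a simple rule. All names and the budget value are illustrative assumptions:

```python
import time

# Hedged sketch of fallback behaviour in a model-serving wrapper.
# If the primary model raises or runs over budget, a conservative
# rule-based answer is returned instead, with the reason recorded.

LATENCY_BUDGET_S = 0.200  # assumed per-request budget

def predict_with_fallback(primary_model, fallback_rule, features):
    start = time.monotonic()
    try:
        result = primary_model(features)
        # Design choice: a late answer is treated as no answer, so the
        # caller always gets a response within a predictable envelope.
        if time.monotonic() - start > LATENCY_BUDGET_S:
            return fallback_rule(features), "fallback: over latency budget"
        return result, "primary"
    except Exception:
        return fallback_rule(features), "fallback: primary failed"

def flaky_model(features):
    raise RuntimeError("model backend unavailable")

def rule_of_thumb(features):
    return "review"  # conservative default decision

print(predict_with_fallback(flaky_model, rule_of_thumb, {"amount": 900}))
# → ('review', 'fallback: primary failed')
```

    Real serving stacks enforce timeouts inside the call rather than after it, but the architectural trade-off (latency vs accuracy vs cost) is the same one named in the bullets above.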


    Solution architecture (ARCH)

    SFIA definition (generic)
    Developing and communicating a multi-dimensional solution architecture to deliver agreed business outcomes.

    Illustration of how this looks in AI engineering

    • designing end-to-end AI solutions spanning data, model, infrastructure and monitoring
    • aligning model deployment with enterprise architecture standards
    • balancing cost, scalability and reliability in cloud AI services
    • managing architectural deviations
    • coordinating architecture across multiple AI initiatives

    This ensures AI systems fit into the broader technology landscape.


    Systems and software lifecycle engineering (SLEN)

    SFIA definition (generic)
    Establishing and deploying an environment for developing, continually improving and securely operating software and systems products and services.

    Illustration of how this looks in AI engineering

    • embedding CI/CD for ML models
    • automating testing, validation and deployment
    • implementing monitoring and alerting for model and data health
    • integrating risk, quality and security into lifecycle processes
    • improving deployment pipelines through feedback loops
    • managing rollback and remediation processes

    This is the operational discipline that prevents AI systems becoming fragile prototypes.
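    The automated testing and deployment bullets above often take the form of a promotion gate in the model CI/CD pipeline. The metric names and thresholds below are illustrative assumptions:

```python
# Hedged sketch of a deployment gate: a candidate model is promoted
# only if it passes every declared validation check; otherwise the
# pipeline blocks and reports which checks failed.

def deployment_gate(candidate_metrics, checks):
    """Return (ok, failed_check_names) for a candidate model."""
    failures = [name for name, check in checks.items()
                if not check(candidate_metrics)]
    return (len(failures) == 0, failures)

CHECKS = {
    "accuracy_floor": lambda m: m["accuracy"] >= 0.90,
    "no_regression": lambda m: m["accuracy"] >= m["prod_accuracy"] - 0.01,
    "latency_budget": lambda m: m["p95_latency_ms"] <= 150,
}

candidate = {"accuracy": 0.93, "prod_accuracy": 0.92, "p95_latency_ms": 120}
ok, failures = deployment_gate(candidate, CHECKS)
print("promote" if ok else f"block: {failures}")  # → promote
```

    Encoding risk, quality and security expectations as explicit checks like these is how they become part of the lifecycle rather than an afterthought.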


    Infrastructure operations (ITOP)

    SFIA definition (generic)
    Provisioning, deploying, configuring, operating and optimising technology infrastructure across physical, virtual and cloud-based environments.

    Illustration of how this looks in AI engineering

    • provisioning GPU/compute environments
    • managing containerised inference services
    • monitoring performance, availability and security posture
    • automating infrastructure with IaC
    • responding to infrastructure incidents affecting AI services

    Reliable AI depends on reliable infrastructure.
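    The monitoring bullet above can be sketched as alert rules evaluated against metric samples. The metric names and thresholds are assumptions, not a specific monitoring product's API:

```python
# Hedged sketch of infrastructure alerting for AI services: each rule
# maps a metric name to a breach condition, and the latest samples
# are checked against all of them.

ALERT_RULES = {
    "gpu_utilisation_pct": lambda v: v > 95,    # sustained saturation
    "inference_error_rate": lambda v: v > 0.01,
    "replicas_available": lambda v: v < 2,      # below redundancy floor
}

def evaluate_alerts(samples, rules):
    """Return the names of metrics whose latest sample breaches a rule."""
    return [name for name, rule in rules.items()
            if name in samples and rule(samples[name])]

samples = {"gpu_utilisation_pct": 97,
           "inference_error_rate": 0.002,
           "replicas_available": 3}

print(evaluate_alerts(samples, ALERT_RULES))  # → ['gpu_utilisation_pct']
```

    In practice these rules live in a monitoring platform rather than application code, but the discipline is the same: thresholds are explicit, versioned and reviewed.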


    Infrastructure design (IFDN)

    SFIA definition (generic)

    Designing technology infrastructure to meet business requirements, ensuring scalability, reliability, security and alignment with strategic objectives.

    Illustration of how this looks in AI engineering

    • designing cloud-based ML platforms
    • defining storage and compute topology for model training
    • designing for resilience under load
    • incorporating data protection and regulatory considerations
    • evaluating new infrastructure technologies for AI workloads

    This supports the platform’s ability to evolve with AI demands.


    Supplier management (SUPP)

    SFIA definition (generic)
    Aligning the organisation’s supplier performance objectives and activities with sourcing strategies and plans, balancing costs, efficiencies and service quality.

    Illustration of how this looks in AI engineering

    • managing relationships with cloud AI and ML platform providers
    • monitoring service performance and reliability of hosted AI or ML services
    • assessing risks associated with third-party models or APIs
    • managing risks related to external data providers and data supply chains
    • understanding and managing AI capabilities embedded within purchased software and digital services
    • ensuring supplier contracts support security, privacy and regulatory obligations

    Context in AI systems

    AI capabilities are increasingly embedded within commercial software platforms and digital services. Organisations may deploy AI indirectly through HR systems, CRM platforms, analytics tools, security products or other enterprise software. Effective supplier management helps maintain visibility of these embedded AI capabilities and manage associated risks, performance expectations and compliance obligations.


    Sourcing (SORC)

    SFIA definition (generic)
    Managing, or providing advice on, the procurement or commissioning of products and services.

    Illustration of how this looks in AI engineering

    • evaluating build-versus-buy decisions for AI components and services
    • selecting cloud AI and machine learning platforms
    • assessing technical, commercial and operational trade-offs between vendors
    • defining procurement criteria for software and services that incorporate AI capabilities
    • negotiating contracts that reflect technical, regulatory and governance requirements

    Context in AI systems

    Many AI capabilities enter organisations through procurement decisions rather than internal development. Sourcing processes therefore play an important role in identifying where AI is embedded within purchased software and services, and ensuring that supplier selection aligns with governance, regulatory and risk management expectations.


    The combined picture

    AI engineering / operations roles apply:

    • engineering discipline (DESN, SWDN, ARCH)
    • data and ML capability (DENG, DATS, MLNG)
    • operational reliability practices (SLEN, ITOP, IFDN)
    • commercial and ecosystem management (SUPP, SORC)

    These are established professional capabilities. AI is the domain context.


    Why using generic SFIA skills helps employers and industry bodies

    1. Avoids narrow or buzzword “AI engineer” labels

    Instead of relying on newly created job titles, SFIA shows:

    • what capability is required
    • at what level of responsibility
    • across which domains

    2. Supports career progression

    For example:

    • Level 3 ML engineer
    • Level 5 ML architect
    • Level 6 ML strategy lead

    The difference lies in responsibility, rather than job title labels.


    3. Enables cross-domain mobility

    An infrastructure architect in cloud services can transition into AI platform design because the underlying SFIA skills are portable.


    4. Connects engineering to governance

    Because governance roles also use SFIA, organisations can:

    • align engineering skills with oversight skills
    • identify gaps
    • clarify handoffs between first and second lines

    5. Strengthens workforce planning

    Employers can:

    • map roles to consistent professional skills
    • compare AI roles with cyber, data or platform roles
    • plan capability development
    • avoid duplication and scope creep

    Summary

    AI engineering / operations is not “just building models”. It can involve the integrated application of:

    • data engineering
    • systems design
    • software architecture
    • lifecycle engineering
    • infrastructure reliability
    • supplier and sourcing management

    SFIA provides a structured language to describe these capabilities clearly, consistently and portably. SFIA supports structured discussion of professional capability, but does not replace professional judgement or determine organisational or regulatory outcomes.

    The AI domain supplies the context. SFIA supplies the professional definition.