Model Governance
The policies, roles, and controls that ensure AI models are developed, approved, and used in line with organizational standards and regulatory requirements.
Definition
The overarching framework that specifies model-risk policies, defines stakeholder responsibilities (owners, validators, operators), prescribes approval workflows (impact assessments, ethics reviews), and enforces controls (versioning, access restrictions). Model governance ensures consistency, accountability, and regulatory compliance across all models, from prototype through retirement.
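The stakeholder roles and approval gates described above are often tracked as structured metadata in a model registry. The sketch below is a minimal, hypothetical schema (the field names and the `can_deploy` gate are illustrative assumptions, not a standard), showing how an approval workflow can be enforced programmatically:

```python
from dataclasses import dataclass
from enum import Enum


class ApprovalStatus(Enum):
    DRAFT = "draft"
    APPROVED = "approved"
    RETIRED = "retired"


@dataclass
class ModelRecord:
    """Hypothetical governance metadata tracked for each model."""
    name: str
    version: str
    owner: str                      # accountable stakeholder
    validator: str                  # independent reviewer who signed off
    status: ApprovalStatus = ApprovalStatus.DRAFT
    ethics_reviewed: bool = False   # outcome of the ethics review gate


def can_deploy(record: ModelRecord) -> bool:
    """Deployment gate: a model may enter production only if it is
    approved and has passed ethics review."""
    return record.status is ApprovalStatus.APPROVED and record.ethics_reviewed


# A candidate that has cleared both gates passes; a draft does not.
candidate = ModelRecord("churn-model", "2.1.0", owner="risk-team",
                        validator="model-validation",
                        status=ApprovalStatus.APPROVED, ethics_reviewed=True)
print(can_deploy(candidate))  # True
```

In practice this kind of gate would be wired into a CI/CD pipeline or registry API so that deployment is blocked, not merely flagged, when a record fails the check.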
Real-World Example
A global insurer enforces model governance by requiring: (1) every new model to complete a Governance Approval Form; (2) a quarterly audit of production models for policy adherence; and (3) automatic deprecation of models lacking a current approval, so that only compliant models remain in production.
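The automatic-deprecation control in the example can be sketched as a periodic job that compares each model's last approval date against a validity window. All names and the one-year window below are assumptions for illustration:

```python
from datetime import date, timedelta

# Assumed policy: an approval lapses one year after it is granted.
APPROVAL_VALIDITY = timedelta(days=365)


def approval_expired(approved_on: date, today: date) -> bool:
    """True if the approval granted on `approved_on` has lapsed by `today`."""
    return today - approved_on > APPROVAL_VALIDITY


def models_to_deprecate(approvals: dict[str, date], today: date) -> list[str]:
    """Return the production models whose approval has lapsed and that
    should therefore be automatically deprecated."""
    return [name for name, approved_on in approvals.items()
            if approval_expired(approved_on, today)]


# Example run of the quarterly check.
approvals = {
    "pricing-model": date(2023, 1, 15),   # lapsed
    "fraud-model": date(2024, 6, 1),      # still valid
}
print(models_to_deprecate(approvals, today=date(2024, 9, 1)))
# ['pricing-model']
```

A real deployment would read approval dates from the registry and trigger the deprecation workflow (notification, traffic cutover, archival) rather than just returning a list.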