Formal Verification

Mathematically proving that AI algorithms satisfy specified correctness properties; most often applied in safety-critical systems.

Definition

Formal verification applies formal methods (model checking, theorem proving) to verify properties such as invariants, absence of runtime errors, or safety constraints. It is resource-intensive and therefore best suited to critical components (e.g., control systems in automotive and avionics). Governance requires defining formal specifications, selecting suitable proof tools, and maintaining proofs alongside code updates.
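
As a rough illustration of the model-checking side (a toy sketch, not taken from the source), the following Python snippet exhaustively explores the reachable states of a small hypothetical state machine and reports whether an invariant holds; the counter model, its limit, and the function names are all illustrative assumptions:

```python
from collections import deque

def check_invariant(initial_states, successors, invariant):
    """Explicit-state model checking: explore every reachable state and
    return a violating state, or None if the invariant always holds."""
    seen = set(initial_states)
    queue = deque(initial_states)
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state                      # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None                               # invariant holds everywhere reachable

# Toy model (assumed for illustration): a bounded counter the controller
# must keep strictly below its limit.
LIMIT = 10
def successors(c):
    # increment while below the limit, otherwise reset to zero
    return [c + 1] if c + 1 < LIMIT else [0]

violation = check_invariant([0], successors, lambda c: c < LIMIT)
print("invariant holds" if violation is None else f"violated at state {violation}")
```

Real model checkers (e.g., SPIN, NuSMV, TLA+) work on far richer specification languages, but the underlying idea is the same: enumerate or symbolically cover all modeled behaviors and check each against the stated property.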

Concrete example

An aerospace firm uses formal verification on its autonomous-drone collision-avoidance module. Engineers specify safety invariants ("the distance to obstacles must never fall below 2 meters") and use a model checker to prove that the flight-control software respects these invariants under all modeled flight conditions, supporting certification readiness.
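
As a hedged sketch of how such an invariant could be checked for one control step, the snippet below uses the z3-solver Python bindings on a simplified one-dimensional model; the control period, the controller rule, and the variable names are illustrative assumptions, not the firm's actual flight software:

```python
from z3 import Real, Solver, unsat

# State of one control step in a simplified 1-D model (assumed for illustration)
d      = Real('d')       # current distance to obstacle (m)
v      = Real('v')       # commanded closing speed (m/s)
dt     = Real('dt')      # control period (s)
d_next = Real('d_next')  # distance after one step

s = Solver()
s.add(dt == 0.1)                 # fixed control period (assumption)
s.add(v >= 0)                    # only closing motion is modeled
s.add(v * dt <= d - 2)           # assumed controller rule: never command a step that crosses 2 m
s.add(d_next == d - v * dt)      # discrete transition relation

# Inductive step: search for a counterexample where the invariant (d >= 2)
# holds before the step but is violated after it.
s.add(d >= 2, d_next < 2)

result = s.check()
print("invariant preserved" if result == unsat else "counterexample exists")
```

If the solver returns unsat, no modeled behavior can violate the invariant in a single step; a sat result would come with a concrete counterexample that engineers can trace back to the specification or the controller model.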