Model Retraining

The process of updating an AI model with new or refreshed data to maintain performance and compliance as data distributions evolve.

Definition

A scheduled or trigger-based pipeline that ingests new labeled data (e.g., recent transactions), retrains the model on the refreshed dataset, validates the updated version against current benchmarks, and deploys it. Governance defines retraining frequency, approval gates (automated tests, validation reviews), rollback protocols, and documentation requirements so that updates remain controlled.
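
A minimal sketch of such a retraining cycle, in Python. The RetrainingPolicy fields, the registry and data_store objects, and all of their methods are hypothetical illustrations of the governance pieces named above (approval gates, reviewer sign-off, rollback and documentation), not a real library API; thresholds are placeholder values.

from dataclasses import dataclass
from datetime import datetime

# Hypothetical governance policy: approval gates and review requirement.
@dataclass
class RetrainingPolicy:
    min_accuracy: float = 0.90      # validation gate against current benchmarks
    max_drift_score: float = 0.15   # data-drift gate (e.g., a PSI-style score)
    require_human_review: bool = True

def retraining_run(policy: RetrainingPolicy, registry, data_store):
    """One scheduled or trigger-based retraining cycle (illustrative only)."""
    # 1. Ingest new labeled data (e.g., recent transactions).
    dataset = data_store.fetch_labeled_since(registry.last_trained_at())

    # 2. Retrain a candidate model on the refreshed data.
    candidate = registry.current_model().fit(dataset.features, dataset.labels)

    # 3. Validate the candidate against current benchmarks (automated gates).
    metrics = registry.evaluate(candidate, data_store.benchmark_set())
    gates_passed = (
        metrics["accuracy"] >= policy.min_accuracy
        and metrics["drift_score"] <= policy.max_drift_score
    )

    # 4. Deploy only if the gates pass and, per governance, a reviewer approves;
    #    otherwise document the rejected run and keep the current version live.
    approved = not policy.require_human_review or registry.approved_by_reviewer(candidate)
    if gates_passed and approved:
        registry.deploy(candidate, trained_at=datetime.utcnow(), metrics=metrics)
    else:
        registry.log_rejected_run(candidate, metrics=metrics)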

Real-World Example

A logistics company retrains its demand-forecast model monthly on the past 90 days of shipment data, runs automated validations (accuracy, drift, fairness), and deploys the updated model during off-peak hours. If post-deployment metrics drop, the system rolls back to the previous version.
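
A hedged sketch of the post-deployment guard described in this example. The registry object, its rollback_to_previous and log_incident methods, the metric names, and the 5% degradation tolerance are all illustrative assumptions, not a specific product's interface.

# Illustrative post-deployment check: compare live metrics to the baseline
# recorded at deployment time and roll back if performance degrades too far.
DEGRADATION_TOLERANCE = 0.05  # assumed 5% relative drop; set by governance policy

def check_and_rollback(registry, live_metrics: dict, baseline_metrics: dict) -> bool:
    """Return True if a rollback was triggered (hypothetical registry interface)."""
    for name, baseline in baseline_metrics.items():
        live = live_metrics.get(name, 0.0)
        if baseline > 0 and (baseline - live) / baseline > DEGRADATION_TOLERANCE:
            # The metric dropped beyond tolerance: restore the previously
            # approved version and record the event for audit documentation.
            registry.rollback_to_previous()
            registry.log_incident(metric=name, baseline=baseline, observed=live)
            return True
    return False

# Example: accuracy falling from 0.91 to 0.82 (~10% relative drop) triggers rollback:
# check_and_rollback(registry, {"accuracy": 0.82}, {"accuracy": 0.91})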