AI Drift Detection

Definition

The monitoring and detection of changes in AI system behavior over time. Drift can manifest as model drift (performance degradation), data drift (input distribution changes), concept drift (changes in the relationship between inputs and outputs), and context drift (multi-turn conversations in which the model's understanding diverges from reality). In agentic AI, state drift refers to the system state changing between the time an action is evaluated and the time it is executed.
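Data drift, for instance, can be detected by comparing the input distribution seen in production against a reference distribution. A minimal sketch using the population stability index (PSI); the function, bucket count, and thresholds here are illustrative conventions, not part of any specific product:

```python
import math
from collections import Counter

def psi(reference, production, buckets=10):
    """Population Stability Index between two numeric samples.
    Values above ~0.2 are commonly treated as significant drift."""
    lo = min(min(reference), min(production))
    hi = max(max(reference), max(production))
    width = (hi - lo) / buckets or 1.0  # avoid zero width for constant data

    def bucketize(sample):
        counts = Counter(min(int((x - lo) / width), buckets - 1) for x in sample)
        n = len(sample)
        # Small epsilon avoids log(0) for empty buckets.
        return [max(counts.get(b, 0) / n, 1e-6) for b in range(buckets)]

    ref, prod = bucketize(reference), bucketize(production)
    return sum((p - r) * math.log(p / r) for r, p in zip(ref, prod))

# Identical distributions yield a PSI near zero; a shifted one does not.
baseline = [i / 100 for i in range(100)]
shifted = [i / 100 + 0.5 for i in range(100)]
print(psi(baseline, baseline) < 0.01)  # True
print(psi(baseline, shifted) > 0.2)    # True
```

Run periodically over a sliding window of production inputs, a check like this turns gradual data drift into an explicit alert rather than a silent accuracy loss.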

Why It Matters

Drift is insidious because it is gradual and invisible. A model that was 99% accurate at deployment can degrade to 95% over months without triggering any alerts. In agentic systems, state drift between evaluation and execution creates time-of-check-to-time-of-use (TOCTOU) vulnerabilities. Context drift in multi-turn conversations can cause the model to make decisions based on stale or contradictory information.

How Exogram Addresses This

Exogram detects state drift through SHA-256 state hashing (comparing the state at evaluation against the state at commit), detects context drift through conflict detection across sessions, and maintains temporal tracking to flag expired or stale facts. Confidence decay functions automatically reduce the authority of aging information.
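A state-hash check and a decay function along these lines can be sketched as follows. The function names, canonicalization scheme, and half-life constant are illustrative assumptions, not Exogram's actual API:

```python
import hashlib
import json
import math

def state_hash(state: dict) -> str:
    """Canonicalize the state and hash it, so any change between
    evaluation and commit produces a different digest."""
    canonical = json.dumps(state, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def commit_allowed(evaluated_hash: str, current_state: dict) -> bool:
    """Refuse the commit if the state drifted since evaluation (TOCTOU guard)."""
    return state_hash(current_state) == evaluated_hash

def decayed_confidence(initial: float, age_seconds: float,
                       half_life: float = 86400.0) -> float:
    """Exponential confidence decay: a fact loses half its authority
    every `half_life` seconds (illustrative decay function)."""
    return initial * math.exp(-math.log(2) * age_seconds / half_life)

# Evaluation time: snapshot the state the decision was based on.
h = state_hash({"balance": 100, "owner": "alice"})

# An unchanged state commits; a drifted state is rejected.
print(commit_allowed(h, {"balance": 100, "owner": "alice"}))  # True
print(commit_allowed(h, {"balance": 90, "owner": "alice"}))   # False

# A day-old fact at 0.9 confidence decays to half its authority.
print(round(decayed_confidence(0.9, 86400.0), 2))  # 0.45
```

Sorting keys before hashing matters: without canonicalization, two semantically identical states could serialize differently and be falsely flagged as drift.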

Key Takeaways

  • Drift detection is part of the broader AI governance landscape
  • Production AI requires multiple layers of protection; drift monitoring is one of them
  • Deterministic enforcement, such as hash comparison at commit time, provides zero-error-rate guarantees
