Enterprise AI Architecture

Preventing AI Agent Double-Spends

How Exogram uses cryptographic execution idempotency to guarantee that agents never execute the same payload twice during network retries.

01. The Architectural Threat

  • AI agents interact with production systems over unreliable networks. Network timeouts happen.
  • When a timeout occurs, agents often retry the exact same execution payload, assuming the first attempt failed.
  • If the downstream API is not perfectly idempotent, the agent might charge a credit card twice, send duplicate emails, or provision duplicate resources.
  • Relying on LLMs to self-correct during a retry loop is completely probabilistic and actively dangerous.
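The failure mode above can be reproduced in a few lines. This is a minimal sketch, not Exogram code: `FlakyPaymentApi` and `naive_agent_retry` are hypothetical names standing in for a non-idempotent downstream API and a retry-on-timeout agent loop.

```python
class FlakyPaymentApi:
    """Hypothetical non-idempotent downstream API: every call charges the card,
    but the response can be lost in transit (simulated by raising a timeout)."""
    def __init__(self):
        self.charges = []
        self._calls = 0

    def charge(self, amount):
        self.charges.append(amount)              # the side effect happens first ...
        self._calls += 1
        if self._calls == 1:
            raise TimeoutError("response lost")  # ... then the response is dropped

def naive_agent_retry(api, amount, attempts=3):
    """Retries the exact same call on timeout, as many agent loops do."""
    for _ in range(attempts):
        try:
            return api.charge(amount)
        except TimeoutError:
            continue

api = FlakyPaymentApi()
naive_agent_retry(api, 25.00)
print(len(api.charges))  # 2 -- the card was charged twice for one intended payment
```

The first attempt succeeds on the downstream side but its response is lost, so the retry charges the card again. Nothing in the agent loop can distinguish "failed" from "succeeded but unconfirmed."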

02. The Exogram Resolution

  • Exogram implements a hard, infrastructure-level idempotency lock on every execution path.
  • Every tool call payload is hashed and bound to a unique evaluation token.
  • The database strictly tracks `EVALUATED` versus `EXECUTED` state.
  • If an agent retries an execution that is already sealed, Exogram instantly returns an HTTP 409 Conflict, halting the double-spend before it ever reaches the external API.
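The lock described above can be sketched as follows. This is an illustrative in-memory model, not Exogram's implementation: the class name, the `execute` method, and the dict-backed store are assumptions; the real system persists state in a database.

```python
import hashlib
import json

class ExecutionLock:
    """Sketch of an infrastructure-level idempotency lock: each payload is
    hashed to a key, and a second execution with the same key is rejected."""
    def __init__(self):
        self._sealed = set()  # keys of already-sealed executions

    @staticmethod
    def key_for(payload: dict) -> str:
        # Hash the canonicalised tool-call payload to derive a stable key.
        canonical = json.dumps(payload, sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()

    def execute(self, payload: dict):
        key = self.key_for(payload)
        if key in self._sealed:
            return 409, "Conflict: execution already sealed"
        self._sealed.add(key)
        # ... call the external API here ...
        return 200, "executed"

lock = ExecutionLock()
payload = {"tool": "charge_card", "amount": 25.00}
print(lock.execute(payload))  # (200, 'executed')
print(lock.execute(payload))  # (409, 'Conflict: execution already sealed')
```

Because the key is derived from the payload itself, a byte-identical retry always maps to the same key and is stopped at the boundary.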

Technical Implementation Blueprint

// The Exogram execution lock process:

1. Agent proposes an action.
2. Exogram evaluates it, generating an evaluation_id and an idempotency_key.
3. Exogram locks the transaction in Supabase as EVALUATED.
4. If the agent retries the same idempotency_key, Exogram hits the DB unique constraint and halts.
5. Once the action executes downstream, the commit endpoint flips the status to EXECUTED.
6. The state is cryptographically sealed. Replay attacks are neutralized.
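The six steps above can be modeled end to end. Here sqlite3 stands in for the Supabase/Postgres table, and the schema, column names, and error strings are illustrative assumptions, not Exogram's actual schema; the point is the UNIQUE constraint doing the halting in step 4.

```python
import sqlite3
import uuid

# sqlite3 stands in for the Supabase table described in step 3;
# this schema is a sketch, not Exogram's real one.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE executions (
    evaluation_id   TEXT PRIMARY KEY,
    idempotency_key TEXT UNIQUE NOT NULL,
    status          TEXT NOT NULL CHECK (status IN ('EVALUATED', 'EXECUTED'))
)""")

def evaluate(idempotency_key: str) -> str:
    """Steps 2-4: lock the transaction as EVALUATED; a retry with the same
    key trips the UNIQUE constraint and is halted."""
    evaluation_id = str(uuid.uuid4())
    try:
        db.execute("INSERT INTO executions VALUES (?, ?, 'EVALUATED')",
                   (evaluation_id, idempotency_key))
    except sqlite3.IntegrityError:
        raise RuntimeError("409 Conflict: duplicate idempotency_key")
    return evaluation_id

def commit(evaluation_id: str) -> None:
    """Step 5: the commit endpoint flips the status to EXECUTED."""
    db.execute("UPDATE executions SET status = 'EXECUTED' WHERE evaluation_id = ?",
               (evaluation_id,))

eid = evaluate("key-abc123")
commit(eid)
try:
    evaluate("key-abc123")   # agent retry with the same key (step 4)
except RuntimeError as e:
    print(e)                 # 409 Conflict: duplicate idempotency_key
```

Putting the uniqueness check in the database rather than in application code means a concurrent retry racing the first attempt is also rejected: the constraint is enforced atomically at insert time.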

Frequently Asked Questions

Why not just make the downstream API idempotent?

You should. But most legacy APIs are not idempotent. Exogram provides an idempotency boundary so you don't have to rewrite your entire backend.

Can the agent bypass the idempotency key?

No. The idempotency key is deterministically generated by Exogram based on the payload hash and session context. The LLM does not generate it.
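Deterministic derivation can be sketched as a pure function of the payload hash and session context. The function name and the exact key material below are hypothetical; the source only states that Exogram, not the LLM, derives the key from these inputs.

```python
import hashlib
import json

def derive_idempotency_key(payload: dict, session_id: str) -> str:
    """Hypothetical sketch: the key is a pure function of the payload and
    session context, so the LLM never chooses (or can vary) it."""
    material = json.dumps({"payload": payload, "session": session_id},
                          sort_keys=True).encode()
    return hashlib.sha256(material).hexdigest()

p = {"tool": "send_email", "to": "ops@example.com"}
# The same payload in the same session always yields the same key ...
assert derive_idempotency_key(p, "sess-1") == derive_idempotency_key(p, "sess-1")
# ... while a different session yields a different key.
assert derive_idempotency_key(p, "sess-1") != derive_idempotency_key(p, "sess-2")
```

Since the model never sees or emits the key, it has no channel through which to vary it and slip a duplicate past the lock.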
