AI Writes Your Code. Who Watches for Drift?

Hey Cursor community,

We built a CLI tool that monitors your development signals and flags when patterns shift — the kind of gradual drift many of us have noticed over longer Cursor sessions.

**The problem**: Cursor is great at writing code, but over a series of commits, subtle shifts can accumulate — more files touched than usual, dependencies growing, CI slowing down. Each individual change looks fine, but the pattern drifts.

**What Evolution Engine does**: It builds a statistical baseline from your git history and alerts you when metrics deviate significantly. Think of it as a drift alarm for your project.
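The core idea is plain statistics: compare today's value of a metric against its historical distribution. Here's a minimal sketch of that idea — not Evolution Engine's actual implementation; the metric, sample values, and threshold below are illustrative:

```python
from statistics import mean, stdev

def drift_score(history, current):
    """Z-score of the current value against the historical baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return (current - mu) / sigma

# Hypothetical metric: files touched per commit over recent history.
files_touched = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
score = drift_score(files_touched, 11)
if abs(score) > 3:  # flag deviations beyond 3 standard deviations
    print(f"drift detected (z = {score:.1f})")
```

The tool correlates many such signals rather than alerting on a single z-score, but this is the flavor of "deviates significantly."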

**How it works with Cursor**:

1. Run `evo analyze .` after a coding session

2. If something drifted, it generates an investigation prompt

3. Paste that prompt into Cursor’s chat — it already has your project context

4. Cursor explains what shifted and whether it was intentional

No AI APIs on our end — your code stays local. Cursor does the investigation using your own setup.

```bash
pip install evolution-engine
evo analyze .
```

Also works as a GitHub Action or git hook for continuous monitoring.
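For the GitHub Action route, a minimal workflow might look like the sketch below. This is illustrative, not an official action — the file name, trigger, and Python version are your choice; only the two commands at the end come from the tool:

```yaml
# .github/workflows/drift.yml (illustrative sketch)
name: drift-check
on: [push]
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # full history, so a baseline can be built
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install evolution-engine
      - run: evo analyze .
```

Note `fetch-depth: 0` — a shallow checkout would leave too little git history to form a baseline.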

Signals monitored:

- Git: files touched, dispersion, change locality, co-change novelty

- CI: duration, failure patterns

- Dependencies: count, depth changes

- Deployments: release cadence

- Testing: failure rates, skip rates

- Coverage: line/branch rate changes
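To make one of the git signals concrete, here's a rough sketch (again illustrative, not the tool's code) of extracting "files touched per commit" from `git log --numstat` output:

```python
def files_touched_per_commit(log_text):
    """Count files changed per commit in `git log --numstat` output.

    numstat lines look like "12<TAB>3<TAB>path/to/file.py"; a new commit
    starts at each line beginning with "commit ".
    Real usage would feed in: subprocess.run(["git", "log", "--numstat"], ...)
    """
    counts, current = [], 0
    for line in log_text.splitlines():
        if line.startswith("commit "):
            if current:
                counts.append(current)
            current = 0
        elif len(line.split("\t")) == 3:  # added<TAB>deleted<TAB>path
            current += 1
    if current:
        counts.append(current)
    return counts

sample = (
    "commit abc123\n\n"
    "3\t1\tsrc/a.py\n"
    "10\t2\tsrc/b.py\n"
    "commit def456\n\n"
    "1\t0\tREADME.md\n"
)
print(files_touched_per_commit(sample))  # → [2, 1]
```

Feed a series like this into a baseline and you have the simplest version of the "files touched" signal; the dispersion and co-change signals build on the same per-commit file lists.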

Would love feedback from Cursor users — does this match problems you’ve run into?

- GitHub: alpsla/evolution-engine — drift detection for AI-assisted codebases. Correlates signals across git, CI, dependencies, and deployments. Your code never leaves your machine.