What I Built
Cortex is a plug-and-play persistent memory system for AI agents. It gives any AI application the ability to remember, learn, and build context over time: from chatbots to multi-agent systems to RAG pipelines.
Website: cortexmemory.dev
Documentation: docs.cortexmemory.dev
GitHub: github.com/SaintNick1214/Project-Cortex
The Problem I Solved
Every AI developer building agents faces the same challenge: AI has no memory.
You can use vector databases (Pinecone, Weaviate), but they only handle embeddings, not complete memory systems. You can use simple storage (Redis, PostgreSQL), but it lacks semantic search. Framework memory (LangChain's built-in) is tightly coupled and basic.
I needed memory that is:
- Flexible: Remember anything without predefined schemas
- Searchable: Semantic search, not just keyword matching
- Isolated: Multi-tenant with per-user/per-agent boundaries
- Fast: Sub-second retrieval at scale
- Framework-agnostic: Works with any LLM
So I built it.
Feature Highlights
What you get with Cortex:
- Flexible Memory: Remember anything without hardcoded topics or schemas
- Infinite Context: Up to 99% token reduction via semantic retrieval
- Hive Mode: Multiple AI tools share ONE memory space (Cursor + Claude + custom tools)
- Semantic Search: AI-powered retrieval with multi-strategy fallback
- Automatic Versioning: Updates preserve history (temporal queries supported)
- Fact Extraction: LLM-powered extraction for 60-90% storage savings
- Belief Revision: Intelligent conflict resolution when facts change
- Graph Integration: Optional Neo4j/Memgraph for relationship queries
- Vercel AI SDK Integration: Production-ready with interactive demo
- Multi-Tenancy: Complete tenant isolation with auth context
Especially relevant for Cursor users: Hive Mode means when Cursor learns your preference, Claude knows it too. No duplication, no sync: your memory follows you across all AI tools via MCP.
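To make the shared-space idea concrete, here is a minimal, self-contained sketch of what Hive Mode enables. `SharedMemorySpace` is a stand-in written for this post, NOT the Cortex API, and the substring lookup stands in for real semantic search:

```typescript
// Illustrative sketch of Hive Mode: two tools read and write ONE
// memory space. SharedMemorySpace is a hypothetical stand-in, not
// Cortex's actual API; Cortex shares the space across tools via MCP.
class SharedMemorySpace {
  private entries: { source: string; fact: string }[] = [];

  remember(source: string, fact: string): void {
    this.entries.push({ source, fact });
  }

  recall(query: string): string[] {
    // Cortex uses semantic retrieval; a substring match stands in here.
    return this.entries
      .filter((e) => e.fact.toLowerCase().includes(query.toLowerCase()))
      .map((e) => `${e.fact} (learned via ${e.source})`);
  }
}

const space = new SharedMemorySpace(); // one space, many tools

// Cursor learns a preference...
space.remember("cursor", "User prefers dark mode");

// ...and another tool sees it with no sync step.
const hits = space.recall("dark mode");
console.log(hits[0]); // "User prefers dark mode (learned via cursor)"
```

The point is the single shared store: each tool writes through the same space ID, so nothing ever needs to be copied between tools.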
Tech Stack
- TypeScript β Core SDK with full type safety
- Python β Secondary SDK for Python developers
- Convex β Reactive backend (ACID transactions + vector search + real-time)
- 124 test files with 18,460+ assertions
- Automated security scanning: CodeQL, Semgrep, Trivy, Gitleaks, Bandit, OpenSSF Scorecard
How Cursor Helped
I built Cortex almost entirely in Cursor, and the experience was transformative:
1. Rapid Iteration
Cursor's inline AI suggestions let me iterate on API designs incredibly fast. The SDK has evolved through 47 releases (currently v0.28.x), and Cursor helped me refactor confidently with its codebase awareness.
2. Documentation Generation
The 71+ documentation files at docs.cortexmemory.dev were written with Cursor's help; it understands the codebase context and generates accurate, consistent docs.
3. Test Coverage
124 test files with 18,460+ assertions. Cursor made it easy to generate comprehensive test cases by understanding the implementation details.
4. Dual SDK Development
Maintaining TypeScript AND Python SDKs with consistent interfaces is challenging. Cursor's ability to understand both codebases and suggest consistent patterns was invaluable.
5. Refactoring at Scale
The 4-layer architecture (ACID + Vector + Facts + Graph) required significant refactoring as the design evolved. Cursor's multi-file awareness made large refactors safe.
Quick Start
```shell
# Install CLI
brew install cortex-memory/tap/cli
# or
npm install -g @cortexmemory/cli

# Create project
cortex init my-agent

# Start building
cd my-agent
cortex start
```
Or try the interactive quickstart:
```shell
cortex init demo --template vercel-ai-quickstart
cd demo && cortex start
# Open http://localhost:3000
```
Your First Memory
```typescript
import { Cortex } from "@cortexmemory/sdk";

const cortex = new Cortex({
  convexUrl: process.env.CONVEX_URL!,
});

// Store a memory
await cortex.memory.remember({
  memorySpaceId: "user-123-personal",
  conversationId: "conv-1",
  userMessage: "I prefer dark mode",
  agentResponse: "I'll remember that!",
  userId: "user-123",
  userName: "User",
});

// Search memories semantically
const results = await cortex.memory.search(
  "user-123-personal",
  "what are the user's preferences?",
);
```
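Once you have search results, a common next step is to filter them by relevance and format them as prompt context. The `MemoryHit` shape below is a hypothetical illustration, not the SDK's actual return type (check the Cortex docs for that), so the snippet uses sample data:

```typescript
// Hypothetical result shape for illustration only; the real SDK's
// return type may differ. Shows how retrieved memories might be
// filtered by relevance before being injected into a prompt.
interface MemoryHit {
  content: string;
  score: number; // similarity score, assumed to be in 0..1
}

const sampleResults: MemoryHit[] = [
  { content: "User prefers dark mode", score: 0.91 },
  { content: "User asked about pricing", score: 0.42 },
];

// Keep only high-confidence memories and format them as prompt context.
const context = sampleResults
  .filter((hit) => hit.score >= 0.75)
  .map((hit) => `- ${hit.content}`)
  .join("\n");

console.log(context); // "- User prefers dark mode"
```

Thresholding like this is what makes the "infinite context" claim work in practice: only the handful of memories relevant to the current query reach the model, not the whole history.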
Links
Website: cortexmemory.dev
Documentation: docs.cortexmemory.dev
GitHub: github.com/SaintNick1214/Project-Cortex
npm: @cortexmemory/sdk, @cortexmemory/cli, @cortexmemory/vercel-ai-provider
What's Next
- MCP Server (Q1 2026): Cross-application memory sharing via Model Context Protocol
- LangChain/LlamaIndex Integrations (Q2 2026)
- Cloud Mode (Q3 2026): Analytics, team management, advanced features
Built with Cursor. If you're building AI agents that need to remember, give Cortex a try!
Happy to answer any questions about the architecture, implementation, or how Cursor helped build it.
– Nicholas Geil / Saint Nick LLC