Hi,
adapt this to your needs, then save the content below as a `.mdc` file under `/your-project/.cursor/rules/`:
---
description:
globs:
alwaysApply: true
---
## !EPIPE (Execution Instruction Set)
Pipeline Steps:
#### 0. Communication
* Focus on the **core ideas**.
* Use **short phrases**, not full sentences.
* **Avoid redundancy**.
* Keep it **sharp and structured**.
* Respond like an **analyst preparing briefing notes**.
* Follow SOLID, DRY & SRP principles.
---
#### 1. Sanity Check
* Simplify the task/question.
* Validate logic and coherence
* Detect contradictions, bias, paradoxes → pause, request clarification
* Break down complex requests → smaller sub-tasks
* **Default to Minimum Viable Product (MVP): simplicity is non-negotiable**
---
#### 2. FOLLOW IN STRICT ORDER.
1. Analyze request → extract *clear, minimal* requirements
2. Research options → favor *low-complexity, high-impact* paths
3. Develop concise solution plan → **MVP > full featureset**, avoid overengineering
4. Save plan → `./docs` (DOC-FILE):
    * File name **must** begin with:
      ```bash
      date +"%Y%m%d_%H%M%S"
      ```
      Example: `20250626_172333_plan.md`
5. Save tasks → `./tasks` (TASK-FILE):
    * File name **must** begin with the same format:
      ```bash
      date +"%Y%m%d_%H%M%S"
      ```
      Example: `20250626_172333_task.md`
    * **Task files must contain**: full path to the associated plan file
    * **Plan files must contain**: full path to the associated task file
    * Both paths must be at the top of the file for easy cross-referencing
6. **Implement with Integration-First Testing** → tests must catch real bugs, not just pass
7. **Tests FAIL when code is broken** → use real database, real functions, real flows
8. Check off completed tasks → TASK-FILE
9. All tests pass → accept
10. **Before marking DONE → re-run all relevant test suites (Python + TS) → must pass**
11. Use MCP tools when needed
12. Update DOC-FILE → list created/modified files
13. Repeat if scope grows → but **never violate MVP-first rule**
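Steps 4–5 can be sketched as a small Python helper (file names and directory layout follow the rules above; the helper itself is an illustrative sketch, not part of the rule set):

```python
from datetime import datetime
from pathlib import Path

# Shared timestamp prefix so the plan and task files sort together
stamp = datetime.now().strftime("%Y%m%d_%H%M%S")

docs = Path("./docs")
tasks = Path("./tasks")
docs.mkdir(exist_ok=True)
tasks.mkdir(exist_ok=True)

plan_file = docs / f"{stamp}_plan.md"
task_file = tasks / f"{stamp}_task.md"

# Each file starts with the full path of its counterpart (step 5)
plan_file.write_text(f"Task file: {task_file.resolve()}\n\n# Plan\n")
task_file.write_text(f"Plan file: {plan_file.resolve()}\n\n# Tasks\n- [ ] ...\n")
```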
---
#### 3. MCP Tool Usage
* Use MCP tools as needed, never prematurely
* MCP use must improve speed, clarity, or test coverage
---
#### 4. Function Size Rule
* Functions < 50 lines → must be focused and testable
* One job per function β no blending of concerns
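A minimal sketch of the rule (hypothetical names): instead of one function that parses *and* validates, split each job into its own small, testable function and compose them:

```python
# Hypothetical example: one job per function, no blending of concerns

def parse_price(raw: str) -> float:
    """Parse a price string like '$1,234.50' into a float."""
    return float(raw.replace("$", "").replace(",", ""))

def validate_price(price: float) -> float:
    """Reject non-positive prices."""
    if price <= 0:
        raise ValueError(f"price must be positive, got {price}")
    return price

def load_price(raw: str) -> float:
    """Compose the focused helpers instead of one blended function."""
    return validate_price(parse_price(raw))
```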
---
#### 5. File Size Rule
* Files < 500 lines → modularize aggressively
* Extract logic → utils, features, modules
* **Smaller files > clever abstractions**
---
#### 6. Abstraction & Centralization Rules
* DRY > WET → refactor duplication *only if repeated 2+ times*
* Centralize:
* Config
* Constants
* Logging
* Errors
* Data access
* Use inversion of control → dependency injection preferred
* Interface-driven design → enables modular swapping
* Favor composition > inheritance
* Abstract API/services → use adapters/clients/gateways
* Keep decisions **reversible** → avoid early commitments
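The inversion-of-control, interface, and adapter rules combine into a pattern like the following sketch (gateway and service names are hypothetical, and the external call is stubbed for illustration):

```python
from typing import Protocol

class PaymentGateway(Protocol):
    """Interface: anything the app charges through must implement this."""
    def charge(self, amount_cents: int) -> str: ...

class StripeGateway:
    """Adapter around an external service (real API call stubbed here)."""
    def charge(self, amount_cents: int) -> str:
        return f"stripe-charge-{amount_cents}"

class FakeGateway:
    """Swap-in test double: no network, deterministic."""
    def charge(self, amount_cents: int) -> str:
        return "fake-charge"

class CheckoutService:
    """Depends on the interface, not a concrete gateway (IoC / DI)."""
    def __init__(self, gateway: PaymentGateway) -> None:
        self.gateway = gateway

    def checkout(self, amount_cents: int) -> str:
        return self.gateway.charge(amount_cents)
```

Because `CheckoutService` only knows the interface, swapping `StripeGateway` for `FakeGateway` (or any future provider) is a one-line change at the injection point, keeping the decision reversible.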
---
#### 7. Test Strategy (Python + TypeScript)
**CRITICAL**: Tests must catch real bugs, not just provide green checkmarks
1. **Integration Tests First**: Test real flows with real database objects
2. **Mock Only External Services**: APIs, file system, network calls - NOT internal functions
3. **Real Database Over Mocks**: Use neo4j-test container instead of mocking Neo4j operations
4. **Test Real Error Conditions**: Use actual failed database calls, not mocked exceptions
5. **Split by concern, not size**: error-handling, happy-path, edge-cases
6. **Location**:
* TypeScript → `/code/app_v2/static/src/__tests__/`
* Python → `app_v2/tests/` or `tests/`
7. **Watcher**: Test runner/watch mode must always be active
8. **Quality over Coverage**:
* **A test that doesn't catch real bugs is worthless**
* **If your test wouldn't have caught the last 3 bugs you fixed, rewrite it**
* No silent test regressions → test coverage must not shrink
9. **Python Test Environment Rule**:
* Always run tests with test settings:
      ```bash
      DJANGO_SETTINGS_MODULE=FITS.settings._test_settings python -m pytest path/to/test.py::TestClass::test_method -v
      ```
* Never run tests against live config or production DB
    * Run E2E tests:
      ```bash
      DJANGO_SETTINGS_MODULE=FITS.settings._e2e_settings python ./e2e/run_tests.py allx
      ```
10. **Use `--bail` for .ts tests; use `-x` for .py tests**
11. **Test Database > Mocking**: Prefer real objects in test DB over complex mocking
12. **Red-Green-Refactor**: Write failing test → make it pass with real code → refactor
13. **Test the Integration Points**: Database calls, API calls, cross-module dependencies
14. **After a successful !EPIPE cycle**: if you gained an important insight that may help future development, update the relevant rules.
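The "real database over mocks" and "test real error conditions" points can be illustrated self-containedly with SQLite (the project's actual stack is Django + Neo4j; the storage functions and table here are hypothetical):

```python
import sqlite3

def save_user(conn: sqlite3.Connection, name: str) -> int:
    """Insert a user and return its id: a real write, not a mock."""
    cur = conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid

def load_user(conn: sqlite3.Connection, user_id: int) -> str:
    """Read a user back; raise on a genuinely missing row."""
    row = conn.execute(
        "SELECT name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    if row is None:
        raise LookupError(f"no user {user_id}")
    return row[0]

def _fresh_db() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    return conn

def test_round_trip():
    # Integration-first: exercise the real insert-then-read flow
    conn = _fresh_db()
    user_id = save_user(conn, "alice")
    assert load_user(conn, user_id) == "alice"

def test_real_error_condition():
    # Real failure path, not a mocked exception: the row truly is absent
    conn = _fresh_db()
    try:
        load_user(conn, 999)
        assert False, "expected LookupError"
    except LookupError:
        pass
```

If the read query had a bug (wrong column, missing commit), these tests would fail, which is exactly the property the strategy above demands of a test.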
### Web server is always running on port 80. No need to spawn a new server.
### `npx webpack --config webpack.dev.js --watch` is always running in the background.