My prompt for refactoring and fixing issues (agent mode)

I have found quite a lot of success with the below prompt, and I just wanted to share it here in case anyone would find it useful.

(Updated prompt in the latest comment to this post)

I need help with fixing and refactoring code in my codebase. 

TASK FILE MANAGEMENT (Maintained throughout the entire process):
The AI MUST maintain a task file at `PROJECT_ROOT/.tasks/CURRENT_DATE.md` where:
- CURRENT_DATE is obtained by running: `date +%Y-%m-%d`
- Every timestamp in the content MUST be obtained by running: `date +[%H:%M:%S]`
- The file MUST begin with:

# Context
Created: `date +[%H:%M:%S]`

## Original Prompt
[Copy of the complete issue description provided by user]

## Project Overview
[Copy of the General Project Overview section]

## Issues and Progress Below
---

- The file serves as the source of truth for all progress tracking
- Every status update MUST use the `date` command for timestamps
- All updates must be logged chronologically under a "Progress" section
- The file should maintain checkmarks and status in real-time
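
(Illustration, not part of the protocol: a minimal sketch, assuming the agent runs a POSIX shell from `PROJECT_ROOT`, of how the `CURRENT_DATE` and timestamp placeholders above resolve when the task file is first created.)

```bash
# Sketch only: how the date placeholders resolve (POSIX shell assumed)
mkdir -p .tasks
CURRENT_DATE=$(date +%Y-%m-%d)         # e.g. 2025-01-15
TIMESTAMP=$(date "+[%H:%M:%S]")        # e.g. [14:32:07]
TASK_FILE=".tasks/${CURRENT_DATE}.md"
printf '# Context\nCreated: %s\n' "$TIMESTAMP" > "$TASK_FILE"
```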

STEP 1 - ISSUE IDENTIFICATION:
Before ANY implementation or analysis begins, you MUST:
1. Analyze the codebase
2. List ALL identified issues
3. Wait for my confirmation that these are the issues to address

STEP 2 - ISSUE TEMPLATE CREATION:
After I confirm the issues, you MUST:
1. Convert each confirmed issue into the full template format below
2. Show me ALL issues in template format
3. Wait for my confirmation before starting any implementation

# Issues
1. ...

[Add more issues as needed]

>>> STOP HERE AND WAIT FOR USER CONFIRMATION OF ISSUES <<<

# Template Format (STEP 2 - NEEDS USER CONFIRMATION)
[After issues are confirmed, format EACH issue using this complete template before starting any implementation. Each issue must be separated by a clear divider]

-----------------------------------
# ISSUE #1 TEMPLATE
-----------------------------------
Issue #1: [Title from confirmed issue above]
## Analysis
- Current Implementation
- Root Cause
- Impact on System

## Solution
- Proposed Changes
- Potential Risks
- Expected Outcome

## Implementation
[Add concise console.log messages to help track execution flow during verification]
- [ ] Step 1: Description of what needs to be done
- [ ] Step 2: Description of what needs to be done
- [ ] Step 3: Description of what needs to be done
- [ ] Clean up: Remove all debug console.log messages
- [ ] Commit: [Concise, human-readable commit message to be used after verification]

## Verification (for the user to verify)
[AI should analyze the codebase to understand full functionality and surrounding features before providing verification steps; the steps below are only examples to keep in mind for each issue]
- [ ] Step 1: Specific way to test in browser (using Console, Network tab, etc.)
- [ ] Step 2: Areas to inspect in Elements/Sources tab
- [ ] Step 3: User interactions to verify through browser
- [ ] Step 4: Areas of the system to check for impacts
- [ ] Step 5: User scenarios to verify end-to-end functionality

## Documentation
- Implementation Notes

## Status
- Current Status
- Next Action
- Blockers (if any)

-----------------------------------
# ISSUE #2 TEMPLATE
-----------------------------------
[Repeat the above template structure for Issue #2]

[Continue for any additional issues with the same clear separation]

>>> STOP HERE AND WAIT FOR USER CONFIRMATION OF TEMPLATED ISSUES <<<

[Only after user confirms the templated issues are correct, proceed with implementation using the rules below...]

# Processing Rules
1. Each numbered issue MUST be processed independently and sequentially.
2. Do NOT proceed to the next issue until the current one is fully resolved.
3. __NEVER SAY__ "I see the issue..." or "I found the problem" or similar.
4. Ask for clarification if ANY aspect is unclear.
5. For any missing context, search files using `tree` or request specific information.
6. Before starting each new section (Analysis/Solution/Implementation/etc.):
   - Recap what's been done so far
   - List what's about to be done
7. After completing each section:
   - Summarize what was completed
   - Show completed checkmarks
   - State what's coming next
8. After each implementation step:
   - Mark the step as complete [x]
   - Show the full implementation list with progress
   - Ask for confirmation before proceeding
9. Before proceeding to verification:
   - Show complete implementation checklist
   - Confirm all steps are done
   - Get user confirmation to proceed
10. After verification:
    - Show all completed verification steps
    - Get final confirmation before commit

# Progress Tracking Rules for AI
1. At the start of each issue:
   - Update task file with: "`date +[%H:%M:%S]` Starting Issue #[N]"
   - Show current task file status
   - List all sections to be completed

2. Before each section:
   - Update task file with: "`date +[%H:%M:%S]` Moving on to [Section Name]"
   - Show completed sections from task file
   - List upcoming steps

3. After each step completion:
   - Update task file: "`date +[%H:%M:%S]` Completed: [Step Description]"
   - Update checkmarks in task file
   - Show updated checklist from task file
   - Ask for confirmation

4. After each section completion:
   - Update task file with section completion and timestamp
   - Show all sections with completion status from task file
   - Update "Progress Updates" section
   
5. Before final commit:
   - Verify all checkmarks in task file
   - Show complete progress from task file
   - Get final confirmation

[All status updates and progress tracking should reference and update the task file]
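
(Illustration, not part of the protocol: one way such a progress line could be appended from the shell, assuming the same task file path as above; the step description is just the placeholder from the rules.)

```bash
# Sketch: append one timestamped progress line to today's task file
TASK_FILE=".tasks/$(date +%Y-%m-%d).md"
echo "$(date "+[%H:%M:%S]") Completed: [Step Description]" >> "$TASK_FILE"
```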

# Final Review
1. Summary of Changes
2. Overall System Impact
3. Documentation Update

Please process ONE issue at a time, following each step in order. Wait for confirmation before proceeding to the next issue.

---

# General Project Overview
- ...

The prompt makes it easy to follow everything that happens by looking at the task file that is created and maintained throughout the process. That file can itself be used as a prompt at a later date if you want to continue working on a particular issue from the past.

Try it out and let me know how it works for you.


Is it .md?


What is .md?

.md stands for Markdown, or Markdown format (Basic Syntax | Markdown Guide), which is the standard format most models use for their default output.

Hi Maxfahl,

First, I want to thank you for the amazing work you’ve shared—it’s truly inspiring and incredibly helpful for structuring refactoring and debugging processes. :clap:

I have a quick question: do you put your prompt directly in the “AI Rules”, or do you use it as a system prompt in the Composer?

Thanks in advance for your reply, and once again, great job on your methodology!

Thanks :slight_smile:

I’ve found it best to put everything in the prompt itself, as you can’t trust Cursor to always pass the rules etc. on to the model.

I’ve iterated on the prompt a bit since I posted it. It now formulates the task file itself a bit like a prompt, so that you can just point to it later and the model will know what we were working on. It seems to get lost after 4-6 prompts otherwise. Having this file, I can just pick up any issue days or weeks later.

I’ve also added things like creating a feature branch and then committing/merging at the end. Best used with YOLO mode :slight_smile:

Here is the latest version for anyone interested. What you actually need to edit is the task itself at the top and the “General Project Overview” at the bottom. For the project overview, I’d recommend letting the AI itself generate it.

# IMPLEMENTATION TASK [THE FOLLOWING TASK MUST BE PROCESSED ACCORDING TO THE PROTOCOL BELOW]

(Insert task description here)

# Implementation Protocol

TASK FILE MANAGEMENT:
The AI MUST maintain a task file at `PROJECT_ROOT/.tasks/[CURRENT_DATE]_[TASK_NUMBER]_[TASK_IDENTIFIER].md` where:
- CURRENT_DATE is obtained by running: `date +%Y-%m-%d`
- TASK_NUMBER is a number that is incremented for each new task; `ls` the contents of `PROJECT_ROOT/.tasks/` to get the correct number
- TASK_IDENTIFIER is a short identifier for the task, such as "docker-build-migration" 
- Every timestamp in the content MUST be obtained by running: `date +[%H:%M:%S]`
- The file MUST begin with:

# Context
Created: `date +[%H:%M:%S]`

## Original Prompt
[Copy of the complete task description provided above]

## Project Overview
[Copy of the General Project Overview section]

## Current Branch
[Name of the current feature branch being worked on]

## Task Progress Below
---

- The file serves as the source of truth for all progress tracking
- Every status update MUST use the `date` command for timestamps
- All updates must be logged chronologically under a "Progress" section
- The file should maintain checkmarks and status in real-time
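
(Illustration, not part of the protocol: a minimal sketch, assuming a POSIX shell run from `PROJECT_ROOT`, of deriving the task file name from the rules above. The task-number derivation is a naive file count, and the identifier is the example from the prompt.)

```bash
# Sketch: deriving the task file name from the naming rules above
mkdir -p .tasks
CURRENT_DATE=$(date +%Y-%m-%d)
TASK_NUMBER=$(( $(ls .tasks | wc -l) + 1 ))      # naive: next number = existing files + 1
TASK_IDENTIFIER="docker-build-migration"         # example identifier from the prompt
TASK_FILE=".tasks/${CURRENT_DATE}_${TASK_NUMBER}_${TASK_IDENTIFIER}.md"
echo "$TASK_FILE"    # e.g. .tasks/2025-01-15_3_docker-build-migration.md
```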

GIT BRANCH MANAGEMENT:
Before starting implementation:
1. Create a new feature branch from master using:
   `git checkout -b feature/[TASK_IDENTIFIER]`
2. Add the branch name to the task file under "Current Branch" section
3. Verify the branch was created and is active:
   `git branch --show-current`

IMPLEMENTATION STEPS:
STEP 1 - TASK TEMPLATE CREATION:
You MUST:
1. Convert the task into the full template format below
2. Show the complete templated task
3. Wait for confirmation before starting implementation

# Task Template Format
[Format the task using this complete template before starting implementation]

-----------------------------------
# TASK TEMPLATE
-----------------------------------
Task: [Title of the task, e.g. Docker Build System Migration]

## Analysis
- Current Implementation
- Root Cause
- Impact on System

## Solution
- Proposed Changes
- Potential Risks
- Expected Outcome

## Implementation
[Add concise console.log messages to help track execution flow during verification]
- [ ] Step 1: Description of what needs to be done
- [ ] Step 2: Description of what needs to be done
- [ ] Step 3: Description of what needs to be done
- [ ] Clean up: Remove all debug console.log messages
- [ ] Commit: [Concise, human-readable commit message to be used after verification]

## Verification (for the user to verify)
[AI should analyze codebase to understand full functionality and surrounding features before providing verification steps]
- [ ] Step 1: Specific way to test in browser (using Console, Network tab, etc.)
- [ ] Step 2: Areas to inspect in Elements/Sources tab
- [ ] Step 3: User interactions to verify through browser
- [ ] Step 4: Areas of the system to check for impacts
- [ ] Step 5: User scenarios to verify end-to-end functionality

## Documentation
- Implementation Notes

## Status
- Current Status
- Next Action
- Blockers (if any)

# Processing Rules
1. Process the task methodically according to the template sections.
2. __NEVER SAY__ "I see the issue..." or "I found the problem" or similar.
3. Ask for clarification if ANY aspect is unclear.
4. For any missing context, search files using `tree` or request specific information.
5. Before starting each new section (Analysis/Solution/Implementation/etc.):
   - Recap what's been done so far
   - List what's about to be done
6. After completing each section:
   - Summarize what was completed
   - Show completed checkmarks
   - State what's coming next
7. After each implementation step:
   - Mark the step as complete [x]
   - Show the full implementation list with progress
   - Ask for confirmation before proceeding
8. Before proceeding to verification:
   - Show complete implementation checklist
   - Confirm all steps are done
   - Get user confirmation to proceed
9. After verification:
    - Show all completed verification steps
    - Get final confirmation before commit
10. After successful verification, commit and merge:
    - Ensure all changes are committed on the feature branch
        - `git add -A`
        - `git commit -m "[COMMIT_MESSAGE]"`
    - Switch to master: `git checkout master`
    - Get latest changes: `git pull origin master`
    - Merge feature branch: `git merge feature/[TASK_IDENTIFIER]`
    - Push changes: `git push origin master`
    - Delete feature branch: `git branch -d feature/[TASK_IDENTIFIER]`
11. Mark the task file as completed by renaming the file to `[CURRENT_DATE]_[TASK_NUMBER]_[TASK_IDENTIFIER]_completed.md`

# Progress Tracking Rules
1. At the start:
   - Update task file with: "`date +[%H:%M:%S]` Starting Task Implementation"
   - Show current task file status
   - List all sections to be completed

2. Before each section:
   - Update task file with: "`date +[%H:%M:%S]` Moving on to [Section Name]"
   - Show completed sections from task file
   - List upcoming steps

3. After each step completion:
   - Update task file: "`date +[%H:%M:%S]` Completed: [Step Description]"
   - Update checkmarks in task file
   - Show updated checklist from task file
   - Ask for confirmation

4. After each section completion:
   - Update task file with section completion and timestamp
   - Show all sections with completion status from task file
   - Update "Progress Updates" section
   
5. Before final commit:
   - Verify all checkmarks in task file
   - Show complete progress from task file
   - Get final confirmation

[All status updates and progress tracking should reference and update the task file]

# Final Review
1. Summary of Changes
2. Overall System Impact
3. Documentation Update
4. Confirm successful merge to master

---

# General Project Overview
- See ".notes/project_overview.md" for details, __IMPORTANT!!!__

Hi maxfahl, thank you for sharing your prompt. I would like to understand it a bit more before I use it: 1) What problems were you running into that motivated this prompt? 2) Do you find it consistently solves them?

I am finding that my biggest challenge with the agent is its “forgetfulness”, and I am wondering if your prompt will help me with that.