Use @Prompts in Chat for Custom Prompts

Regarding the requested feature of enabling custom prompts within Cursor:

Several workarounds have been suggested, such as canned prompts, using Espanso, or leveraging notepads. While these may suffice in specific scenarios, none offers a streamlined experience. To ensure an efficient workflow, I believe this feature should include the following key capabilities:

  1. First-party integration within Cursor IDE with intuitive triggering via @ commands, e.g., @Prompt.
  2. Per-project customization using version control, allowing prompts to be stored and managed alongside project files.
  3. Support for referencing external files or other prompts via markdown links or @ commands.
  4. System variable interpolation, such as ${user_input}, to dynamically insert user input into prompts.
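The interpolation in point 4 needs no new syntax; `${user_input}` is exactly the placeholder style that Python's standard `string.Template` already handles, so a minimal sketch of the substitution step could look like this (the prompt text and user message are illustrative):

```python
from string import Template

# A prompt body using the proposed ${user_input} variable.
prompt = Template(
    "Refine the following user input for clarity.\n\n"
    "**User input:**\n${user_input}"
)

# Substitute the user's chat message into the prompt before sending it on.
expanded = prompt.substitute(user_input="How do I configure logging?")
print(expanded)
```

Using `${...}` keeps prompts valid markdown and makes the variables trivially parseable.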

Proposed Workflow

  1. Users define prompt files within the repository, possibly inside .cursor/prompts/, similar to how .cursor/rules/ allows per-project rules.

    Example prompt file:

    // .cursor/prompts/my-prompt.md  
    
    Refine the following user input for clarity.  
    Take into account the current tech stack: @docs/tech-stack.md  
    (Alternatively, using markdown links: [tech-stack](../docs/tech-stack.md))  
    
    **User input:**  
    ${user_input}
    
  2. Users reference the prompt in the chat box using the @ symbol:

    @my-prompt [the question to be refined...]
    
  3. When sent to the LLM, this expands into:

    Refine the following user input for clarity.  
    Take into account the current tech stack:  
    [... interpolated content from docs/tech-stack.md]  
    
    **User input:**  
    [... interpolated question]
    

Comparison with GitHub Copilot

GitHub Copilot already supports prompt files that:

  • Reference external files
  • Can be shared within VS Code
  • Are defined on a per-project basis

Bringing similar functionality to Cursor would enhance usability and streamline prompt-driven workflows.