Great idea! A similar notion of ‘canned prompts’ came up in this topic:
It even uses a similar syntax!
I’ve also been pondering a prompt that would let me know when the chat reaches a certain size and would generate a summary of the key points from the current chat, which I could then copy and paste into a new one.
There is always that tension, when you are working on something that’s taking longer than expected, between:

- I don’t want to lose all this context
- But I really should start a new chat, because I can tell the LLM is producing poorer results with less acuity
Most of the time, I think starting a new chat is going to lead to a better resolution in less time. And having an automated summary would help make the jump to the new chat.
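To make the idea concrete, here’s a rough sketch of what that size check could look like. Everything here is hypothetical: the token limit, the chars-per-token heuristic, and the summary prompt wording are all just placeholder assumptions, not anything Cursor actually exposes.

```python
# Hypothetical sketch: warn when a chat transcript grows past a size
# threshold, using the rough "~4 characters per token" heuristic.
# Real tokenizers vary, so treat the estimate as approximate.

TOKEN_LIMIT = 8000  # made-up threshold; tune to taste


def estimate_tokens(text: str) -> int:
    """Very rough token estimate: about 4 characters per token."""
    return len(text) // 4


def summary_prompt_if_too_big(transcript: str, limit: int = TOKEN_LIMIT):
    """Return a summary prompt to paste into the chat once it gets too big,
    or None while the chat is still under the limit."""
    if estimate_tokens(transcript) < limit:
        return None
    return (
        "This chat is getting long. Please summarise the key points so far: "
        "goals, decisions made, current state of the code, and open questions, "
        "so I can paste the summary into a fresh chat."
    )
```

You’d run something like this over the transcript periodically; when it returns a prompt, send it, then carry the resulting summary into the new chat.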
I like the idea of writing summaries to a file as well.
As an aside/reference, I’ve linked to this guy’s videos a few times in the past. There is something cool and efficient about the idea of ‘prepping the model’ with pre-prepared, well-organised context for the chat, and he seems to use this approach often in his videos:
Maybe it’s just good practice to help yourself get focused and clear about what you want to achieve beforehand.
Another idea that has popped into my head a few times recently: would there be any value in being able to execute actions from the ‘canned prompt’ files I mentioned earlier, or in exposing some sort of ‘event API’ within Cursor? That probably leads into a larger conversation about security, though.