Hello Cursor Development Team,
I am a Cursor editor user, primarily using it to edit C# code. I’ve noticed an issue: when working with very large single C# files (.cs) – for example, those exceeding 20,000 lines – the editor’s “Apply” (Apply Changes) feature often seems to malfunction. It can become extremely slow or laggy, or even fail completely.
To continue editing in such situations, I’ve found a workaround:
- First, I manually select the specific code snippet I want to modify.
- Then, I switch to Chat mode and describe my required changes to the AI, or ask it to modify the selected code directly.
- The AI generates the modified code, usually maintaining correct indentation.
- Finally, I copy the AI-generated code and manually paste it, replacing the original code snippet in the editor.
I’ve found this manual process works quite well and has several advantages:
- It seems to consume minimal resources: Even in very large files, this operation completes quickly without causing the editor to lag.
- It doesn’t rely on the full file context: This method only processes the selected snippet and doesn’t appear to need to load or analyze the context of the entire large file.
This makes me wonder: since this “select → chat → copy/paste” method can efficiently handle localized changes in large files with low resource demands, why couldn’t the “Apply” feature adopt a similar processing model?
If the “Apply” feature could adopt a similarly “localized” processing logic behind the scenes, focusing only on the changed code snippet and its necessary immediate context rather than attempting to analyze or manipulate the entire massive file, perhaps it could overcome the current limitations and support smooth AI editing and application of changes for files of any size.
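To illustrate the idea, here is a minimal sketch of what I mean by “localized” application (this is only my assumption about how it could work, not Cursor’s actual implementation): if the edit targets a known line range, replacing that slice touches only the edited region plus one read/write of the file, with no need to parse or analyze the other tens of thousands of lines.

```python
def apply_localized_edit(lines, start, end, replacement):
    """Replace lines[start:end] (0-based, end-exclusive) with the
    AI-generated replacement lines, leaving the rest of the file
    untouched. The work done here scales with the size of the edit,
    not with any whole-file analysis."""
    return lines[:start] + replacement + lines[end:]

# Example: swap out lines 2-3 of a small file.
original = ["line1\n", "line2\n", "line3\n", "line4\n", "line5\n"]
patched = apply_localized_edit(original, 1, 3, ["new2\n", "new3\n"])
```

This mirrors the manual workflow above: the selection defines the range, the AI produces the replacement, and everything outside the range is left exactly as it was.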
I hope this feedback is helpful for improving the Cursor editor, making it more robust and efficient when handling large projects and files.
Thank you!