When the models analyse logs for troubleshooting, they often treat recent dates as ‘future dates’ and flag them as an issue that needs to be resolved, presumably due to their training data cut-off.
If Cursor were aware of the current date/time, it would prevent a lot of energy being wasted convincing the model of the current date. (I’ve had Claude hardcode the date I gave it just to appease me, as it certainly did not believe me!)
We have some workarounds using MCP servers (e.g. a simple MCP server for the current time) or suggestions to make CLI calls in the rules, but these are inconsistent, and frankly mad, as the IDE should have no trouble picking up the current date/time from the user’s machine.
Please consider informing the AI of the current timestamp when making requests, so we can avoid this ongoing battle of trying to convince it that the data it’s analysing isn’t from the future!
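For anyone stuck on the MCP route in the meantime, this is roughly what that workaround looks like. A minimal sketch, assuming the reference mcp-server-time package run via uvx and Cursor’s mcp.json format; the server name here is just an example:

```json
{
  "mcpServers": {
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time"]
    }
  }
}
```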
Instead of a solution like MCP, I have a simpler one, and it works perfectly. All you need to do is add a line like this to the general User Rules in the settings.
I use this on my macOS device:
You use “date” terminal command to learn current date
I think the equivalent on Windows is the Get-Date PowerShell command, but I haven’t tested it.
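To be concrete, these are the commands that rule points the model at; the output shown in the comments is just an illustration:

```sh
# macOS / Linux: prints something like "Tue Jun 10 14:32:05 CEST 2025"
date

# ISO 8601-style form, if you prefer an unambiguous format in the rule
date +"%Y-%m-%d %H:%M:%S %Z"

# Windows PowerShell equivalent
Get-Date
```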
I’m aware of the time MCPs and terminal workarounds. It seems bonkers to me that we even require them, and how is using them any different from me specifying the current date in the chat manually?
Surely a timestamp is something that can be included in each request.
I agree with you: this should be a built-in feature, not something we have to work around, especially when those workarounds don’t work consistently.
In general, AI providers don’t add the date because it’s not always relevant to the prompt, and for API usage they don’t know where on earth the end user is located (different dates and times, a 26-hour total span across timezones!).
I’ve thought several times about what issues could occur if Cursor added this automatically to each prompt, as it would have to happen almost entirely on the user/UI side, since their API doesn’t know the user’s local time either.
There were a few cases in my personal usage, when handling historical info (files/data/specs…), where this could have been an issue.
As with all programming, anything that looks simple may not actually be simple once you consider a wide range of usage.
(I voted up this feature request, and about 5 similar ones about the current date in the last 7 days haha)
Great points, I can see how this is more complex than it appears on the surface.
Also, in my personal experience, the lack of current-datetime awareness has only ONCE been an issue, when I was analyzing some log files while troubleshooting a new bug. For all the rest of the work I do with Cursor, it just hasn’t mattered.
Grok inside x.com can always get the proper date and time based on my VPN settings. If it’s doable in a browser, then surely it’s also doable in an IDE.
And they just refreshed the settings UI with all the fancy cards and 5 different ways of handling related fields.
There’s enough space to add a preferred timezone field with an on/off toggle.
It should be trivial in Cursor. If you look at the contents of the state.vscdb files (on Linux), the Unix time is always included. Also, since newer versions allow command execution, a local execution of date to get a timestamp to add at the end of a response would do fine as well.
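To illustrate both points, a rough sketch; the state.vscdb path and table name are assumptions based on the usual VS Code layout, so adjust for your install:

```sh
# Assumed default Linux location of Cursor's state DB (VS Code-style layout)
sqlite3 ~/.config/Cursor/User/globalStorage/state.vscdb \
  "SELECT key FROM ItemTable LIMIT 5;"

# And regardless of the DB, the agent can always shell out for a timestamp
date -u +"%Y-%m-%dT%H:%M:%SZ"   # ISO 8601, UTC
```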
As for ChatGPT, it should be possible to have it do a time check by going to time.gov, but their web tool forces it to use whatever they have decided it should use. I have tried enforcing this in the user instructions (a sort of ChatGPT user system prompt), but it is ignored.
I really don’t understand what AI companies have against time.
Part of my workflow is to make the agent update TODOs, Docs, and various context documents via .cursorrules.
I’ve seen Gemini constantly spew out wrong dates and then use them as references in later steps, with no real consistency, causing “timeline comprehension” mistakes.
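You can patch over it with a .cursorrules line like the following (the wording is purely an example), but that’s just another workaround:

```
Before writing any date into TODOs, docs, or other context files, run `date +%Y-%m-%d` in the terminal and use its output; never guess the current date.
```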
It would be great for Cursor to have some dynamic variables in the system prompt (time, system environment, project environment, etc.).
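Something along these lines; the syntax is purely hypothetical, since no such variables exist in Cursor today:

```
Current local time: {{local_datetime}} ({{timezone}})
System environment: {{os_name}}, shell {{default_shell}}
Project environment: {{workspace_path}}, {{package_manager}}
```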