I signed up for a month to test it, and in one week, I’ve already hit 240 out of 500 requests.
I don’t know why, but the responses are worse than what I get from GPT-4 in OpenAI’s own chat.
ChatGPT-4: about 9,600 requests per month (40 every 3 hours)
vs. Cody: 500
vs. Claude Opus: 10
What I was looking for was embedding of my code context, but nowadays Copilot is already doing this with @workspace and has been catching up with Cursor.
And now there’s Cody, which has the bonus of being just $9 and integrating with my Neovim.
I don’t want to be a hater or belittle the company, but it seems to me that without a significant shift, Cursor will be overtaken.
Let’s unpack this.
First, after 500 requests, you can keep going. I personally don’t notice any difference in speed between fast and slow requests, but that might depend on the time of day I use it.
ChatGPT gives you 9,600 requests, yes, if you wake up every 3 hours, every day, to spend 40 requests each time. My usage pattern is more bursty: I’ll fire off a lot of requests in an hour and then nothing for the rest of the day.
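To make the quota comparison concrete, here is a back-of-the-envelope sketch using the numbers from this thread (40 requests per 3-hour window, ~30-day month); the one-heavy-hour-per-day burst pattern is just an illustrative assumption, not anyone’s measured usage:

```python
# Quota math from the thread: ChatGPT Plus allows 40 requests per 3-hour window.
REQUESTS_PER_WINDOW = 40
WINDOW_HOURS = 3
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30  # rough month

# Theoretical ceiling: you'd have to use every single window, day and night.
windows_per_day = HOURS_PER_DAY // WINDOW_HOURS  # 8 windows/day
theoretical_monthly = REQUESTS_PER_WINDOW * windows_per_day * DAYS_PER_MONTH
print(theoretical_monthly)  # 9600

# Hypothetical bursty pattern: one heavy hour per day, capped at 40 by the window.
bursty_monthly = REQUESTS_PER_WINDOW * DAYS_PER_MONTH
print(bursty_monthly)  # 1200
```

So the 9,600 figure is a ceiling you only reach by spreading usage around the clock; a bursty user’s effective monthly total lands in the same order of magnitude as Cursor’s 500 fast requests.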
GitHub Copilot is offering @workspace; have you tried it? For me it wasn’t working great, and their chat is still stuck on GPT-4.
Cody is great, but its context window is smaller than Cursor’s (7k vs Cursor’s 10k). In my experience it’s also not as polished (you can’t pin the chat to the sidebar, for example). Still, $9 for unlimited Opus usage isn’t bad. You can use both.
Cursor has the benefit of controlling the whole experience since it’s a fork, so it can do things an extension alone cannot. My guess is it will just keep getting better over time, and the gap with other solutions will widen.
I really liked the points you raised! I’ve never managed to exceed the 40 requests; I made so many requests in one week with Cursor just to test it out.
For some reason, the responses from GPT (chat) are better than Cursor’s on my codebase. I think Cursor pulls in too much irrelevant context, which ends up confusing the response.
Between copying and pasting into ChatGPT and throwing things into Cursor’s chat, I prefer to stick with OpenAI’s chat.
Regarding the 500 requests, fast and slow, I can’t comment; I haven’t hit 500 yet.
Well, these AI systems are all changing very quickly. If you find something that works for you, stick with it.
My guess is you posted this because you still have some doubts. It’s OK to try different solutions and revisit them as your workflow changes and these services evolve.
As of now, there is no clear winner, but Cursor is certainly well positioned to win the race.