It would be nice if you could offer Claude 3 Haiku with a larger context window. Since its cost is much lower (even lower than GPT-3.5), I think it should be feasible.
A larger context window would be useful when combining @codebase + @docs + a long chat.
I have done some testing of Haiku: not only is it really fast, but the output is much better than GPT-3.5's. Its training data is also more up to date.
Yes, I’ve even commented multiple times on that thread. What would you like me to read specifically?
Edit: Just to clarify, the request above is not just to add a specific model, but to offer a larger context window when a much cheaper model like Haiku is used. That would promote usage of lower-cost models while also giving the benefit of a larger context window in certain use cases.
You can use Claude 3 through OpenRouter.ai right now (see the sketch below).
As for the context window, you need to ask the Cursor developers about this. Let’s hope that they will increase it.
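For anyone who wants to try Haiku outside Cursor in the meantime, here is a minimal sketch of calling it through OpenRouter's OpenAI-compatible API. The base URL and the anthropic/claude-3-haiku model slug are my assumptions; double-check them against OpenRouter's docs before relying on them.

```python
# Minimal sketch: calling Claude 3 Haiku via OpenRouter's OpenAI-compatible API.
# The base_url and model slug below are assumptions; verify on https://openrouter.ai/docs.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="YOUR_OPENROUTER_API_KEY",        # placeholder: substitute your own key
)

response = client.chat.completions.create(
    model="anthropic/claude-3-haiku",  # assumed OpenRouter slug for Claude 3 Haiku
    messages=[
        {"role": "user", "content": "Summarize this repository's README in three bullets."},
    ],
)

print(response.choices[0].message.content)
```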
That’s one big feature request already; half the forum is filled with those requests, so I bet they’ll be working on this soon. It’ll be a life changer with those 200k-context models, and with Gemini 1.5 in the future (make sure to watch the three.js demo of Gemini 1.5 on YouTube if you haven’t seen it).
Now that Haiku is proving to be on par with GPT-4, I want to push again for that feature request. I think it could replace GPT-3.5 as well: you would lower your costs while increasing the context window for this particular model.