If we could use a 2M context window in long-context chat, possibly at a slightly higher price if necessary, it could fit whole large codebases. That would be really useful.
Great! It’s almost 10 times cheaper than Sonnet, it’s also really good, and it has 2M tokens. It should definitely be available in the long-context chat!
Hey @andrewh !
Can you change the “verification” step for custom OpenAI keys?
I just tried to use Cursor with Ollama / Groq (llama3), and they don’t have gpt-3.5-turbo for Cursor’s check, so it failed.
It seems like a pretty easy change, and I’d really appreciate it!
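To illustrate the request: instead of hardcoding gpt-3.5-turbo for the key check, the validation could fall back to whatever model the OpenAI-compatible provider actually reports (e.g. from its models list). This is a minimal sketch of that idea, not Cursor’s actual code; the function name and model names are illustrative assumptions.

```python
# Sketch of the suggested fix: pick a validation model the provider
# actually serves, rather than assuming gpt-3.5-turbo exists.
# This is illustrative only, not Cursor's real verification code.
from typing import Optional


def pick_validation_model(available: list[str],
                          preferred: str = "gpt-3.5-turbo") -> Optional[str]:
    """Use the preferred model if the provider offers it; otherwise
    fall back to the first model the provider reports."""
    if preferred in available:
        return preferred
    return available[0] if available else None


# An OpenAI-compatible backend like Ollama or Groq reports its own
# model list (e.g. llama3), so the hardcoded check would fail there.
print(pick_validation_model(["llama3"]))
print(pick_validation_model(["gpt-3.5-turbo", "gpt-4o"]))
```

With a fallback like this, the key check would succeed against any OpenAI-compatible endpoint, whatever models it serves.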