Gemini 1.5 Flash long-context mode

“Gemini 1.5 Flash starts at 35 cents per 1 million tokens.”

“Anyone can sign up to try Google’s new 2 million token context window”.

From Google Gemini 1.5 Pro vs. Flash vs. Nano, Explained - CNET

If we could use the 2M context window in long-context chat, possibly at a slightly higher price if necessary, it could fit whole large codebases. That would be really useful.



Great! It’s almost 10× cheaper than Sonnet, is really good, and has a 2M-token context window. It should definitely be available in long-context chat!


Yes, this would be a great alternative/addition to the long-context models.


Cursor dev here. Good idea! It’ll ship in the next update.


Amazing, thanks – coding really feels like riding a bike now :mountain_biking_man:


Hey @andrewh!
Can you change the “verification part” for custom OpenAI keys?
I just tried to use Cursor with Ollama / Groq (llama3), but they don’t have gpt-3.5-turbo for Cursor’s check, so it failed :slight_smile:

Seems like a pretty easy change, and I’d really appreciate it!
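For anyone curious what the fix might look like: the post above suggests Cursor verifies a custom key by requesting the hardcoded model `gpt-3.5-turbo`, which OpenAI-compatible servers like Ollama or Groq don’t serve. A minimal sketch of one possible fix, choosing the verification model from whatever the endpoint actually lists (the function name here is hypothetical, not Cursor’s real internals):

```python
# Hypothetical sketch: instead of hardcoding "gpt-3.5-turbo" for the
# key check, fall back to whatever model the custom OpenAI-compatible
# endpoint (e.g. Ollama or Groq) actually serves.

def pick_verification_model(available_models, preferred="gpt-3.5-turbo"):
    """Return a model id to use for the API-key verification ping."""
    if preferred in available_models:
        return preferred            # OpenAI proper: keep current behavior
    if available_models:
        return available_models[0]  # Ollama/Groq: use what the server has
    return None                     # nothing listed: verification must fail

# Example: an Ollama server that only serves llama3
print(pick_verification_model(["llama3"]))
```

With OpenAI-style servers, the model list could come from a `GET /v1/models` call before the verification request.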