What is the context size for gpt-4o-mini?

I would be interested to know:

01) The actual context size of the LLMs available, i.e.:

  • Normal Chat:

    • cursor-small
    • gpt-3.5
    • gpt-4o-mini
    • gpt-4o
    • claude-3.5-sonnet
  • Long Context Chat:

    • gpt-4o-128k
    • gemini-1.5-flash-500k
    • claude-3-haiku-200k
    • claude-3.5-sonnet-200k

02) The context size of the LLMs when actually used within Cursor (a rough sketch of what I mean by “context size” is just below)
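
For clarity, by “context size” I mean the number of tokens a model can attend to in a single request (prompt plus response). Here is a rough sketch of how I’d check whether some text fits, assuming the o200k_base tokenizer used by the GPT-4o family and the providers’ advertised window sizes; these numbers are my own assumptions for illustration, not what Cursor necessarily configures:

```python
# Rough sketch: count tokens and compare against an *advertised* context window.
# Requires a recent tiktoken (o200k_base is the GPT-4o family tokenizer).
import tiktoken

# Provider-advertised limits (tokens) - assumptions for illustration only,
# not necessarily the limits Cursor applies internally.
ADVERTISED_WINDOWS = {
    "gpt-4o-mini": 128_000,
    "gpt-4o": 128_000,
    "claude-3.5-sonnet": 200_000,
}

def fits_in_window(text: str, model: str) -> bool:
    """Return True if the text's token count is within the model's advertised window."""
    enc = tiktoken.get_encoding("o200k_base")  # approximation for non-OpenAI models
    n_tokens = len(enc.encode(text))
    print(f"{model}: {n_tokens} tokens of {ADVERTISED_WINDOWS[model]} advertised")
    return n_tokens <= ADVERTISED_WINDOWS[model]

if __name__ == "__main__":
    print(fits_in_window("Explain this repository's build system.", "gpt-4o-mini"))
```
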

There seems to be a bit of uncertainty about this:

And I saw you recently asked a question about the general topic as well:

I am a bit confused as to why long-context models would be available to select in Cursor if they weren’t actually treated as long-context models when used within Cursor.

Please excuse my ignorance if I am not understanding the idea of a ‘long-context’ model correctly.
