Add support for o3-mini-high

I would highly appreciate it if o3-mini-high could be added to Cursor.

1 Like

Hey, this model is already used in Cursor.

5 Likes

Hey @deanrie,
when is it expected that o3-mini will work as smoothly as Claude with Composer Agent?

Because it currently doesn't apply code to files.

1 Like

When I try to use it, I get "The model o3-mini-high does not work with your current plan…"

When I turn on my OpenAI API key:
{"error":{"message":"The model o3-mini-high does not exist or you do not have access to it.","type":"invalid_request_error","param":null,"code":"model_not_found"}}

Version: 0.45.9

VSCode Version: 1.96.2

Commit: cce0110ca40ceb61e76ecea08d24210123895320
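
For reference, that model_not_found error is expected when calling OpenAI directly: the API does not expose an o3-mini-high model ID. A minimal sketch (assuming the official openai Python SDK and an OPENAI_API_KEY in the environment) for checking which IDs your key can actually see:

from openai import OpenAI

client = OpenAI()

# List the model IDs available to this API key and show the o3 family.
for model_id in sorted(m.id for m in client.models.list()):
    if "o3" in model_id:
        print(model_id)

# Expect "o3-mini" to appear, but no "o3-mini-high" --
# "high" is a reasoning_effort setting, not part of the model name.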

I am on the Pro plan as well. I can use o3-mini, but I have no idea which variant of o3-mini it is. I tried manually setting o3-mini-<level> for each of high, medium, and low, and they all give me an error message.

1 Like

Qwen pls

1 Like

It's listed as "o3-mini" in Cursor. Are you saying that's the o3-mini-high model?

It appears o3-mini works in Cursor, but o3-mini-high, which is fine-tuned for coding, is not available yet. It is also not available on OpenRouter. Any idea when this will be fixed, @deanrie?

Thank you!!!

I don’t think so? o3-mini != o3-mini-high

1 Like

o3-mini-high isn’t a model name. Instead, o3-mini-high represents the o3-mini model with a reasoning_effort parameter. For example:
{
  "model": "o3-mini",
  "reasoning_effort": "high"
}
Please support the reasoning_effort parameter. Thanks!
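
To make that concrete, here is a minimal sketch of what such a request could look like against the Chat Completions API (assuming the official openai Python SDK; the prompt is just a placeholder):

from openai import OpenAI

client = OpenAI()

# "o3-mini-high" in ChatGPT corresponds to o3-mini called with
# reasoning_effort="high"; the effort level is a request parameter, not a model ID.
response = client.chat.completions.create(
    model="o3-mini",
    reasoning_effort="high",  # accepts "low", "medium", or "high"
    messages=[{"role": "user", "content": "Refactor this function for readability: ..."}],
)

print(response.choices[0].message.content)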

1 Like

Hi, so if I want to use o3-mini-high, is it enough to select o3-mini in the Cursor settings? Thank you.

1 Like

I get the same thing. Has anyone found a solution? o3 in Cursor doesn't feel like the best to me.

1 Like

Hi all, o3-mini is set to high reasoning behind the scenes, but you cannot manually specify the reasoning effort with model names like o3-mini-high right now!

4 Likes

I recommend indicating in the model settings that it is the high reasoning mode, because the 'o3-mini' moniker alone is underspecified.

2 Likes

Thanks danperks for the update. How can I make sure that o3-mini is truly set to high by default?

Hey, o3-mini is set to high behind the scenes by us on our back end and is not configurable by a user yet.
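
There is no way to inspect what Cursor sets on its back end, but if you have your own API key you can at least observe how the effort level changes behavior for the same prompt. A rough sketch (assuming the openai Python SDK, and that the response reports reasoning-token usage under usage.completion_tokens_details) comparing effort levels:

from openai import OpenAI

client = OpenAI()
prompt = "Write a function that merges two sorted linked lists."  # placeholder task

# Higher reasoning_effort generally spends more reasoning tokens on the same task,
# which is one observable difference between the low/medium/high settings.
for effort in ("low", "medium", "high"):
    response = client.chat.completions.create(
        model="o3-mini",
        reasoning_effort=effort,
        messages=[{"role": "user", "content": prompt}],
    )
    details = response.usage.completion_tokens_details
    print(effort, "reasoning tokens:", details.reasoning_tokens)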

2 Likes

+100, big difference between o3-mini-med and o3-mini-high. It’s fine to have just o3-mini in the IDE, but as Orpheus mentioned, having the details on the models page would be appreciated. Copy change = 1 story point :wink:

Guys, I still have a feeling that the o3-mini I'm getting is the medium or even low version. o1 is giving me significantly better results, and that shouldn't be the case if it's o3-mini-high. I have o3-mini-high in my menu but can't use it.

I agree, the output from the 'o3-mini' model in Cursor is much less intelligent than the responses from 'o3-mini-high' on ChatGPT or via API calls. (E.g., for migrating @nuxt/content from v2 to v3, the same model on ChatGPT/API calls gives different output from the calls within Cursor.)