Would highly appreciate it if o3-mini-high were added to Cursor.
Hey, this model is already used in Cursor.
Hey @deanrie ,
when is o3-mini expected to work as smoothly as Claude with Composer Agent?
because it doesn't apply code to files currently
When I try to use it, I get "The model o3-mini-high does not work with your current plan…"
When I turn on my OpenAI API key:
{"error":{"message":"The model o3-mini-high does not exist or you do not have access to it.","type":"invalid_request_error","param":null,"code":"model_not_found"}}
Version: 0.45.9
VSCode Version: 1.96.2
Commit: cce0110ca40ceb61e76ecea08d24210123895320
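For context, that model_not_found response comes from the OpenAI API itself: "o3-mini-high" is not a model ID the API recognizes, so pointing your own key at it will always fail. If you want to confirm which o3 model IDs your key can actually see, here is a minimal sketch using the official openai Node SDK (assumes Node 18+ and OPENAI_API_KEY in the environment; the o3 filter is just for readability):

import OpenAI from "openai";

// List the model IDs this API key can access and print the o3 family,
// to confirm that "o3-mini" exists while "o3-mini-high" does not.
const client = new OpenAI();

async function main() {
  const models = await client.models.list();
  for await (const model of models) {
    if (model.id.startsWith("o3")) console.log(model.id);
  }
}

main();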
I am on the Pro plan as well. I can use o3-mini, but I have no idea which variant of o3-mini it is. I tried manually setting o3-mini-<level> for each of high, medium, and low, and they all give an error message.
Qwen pls
It's listed as "o3-mini" in Cursor. Are you saying that's the o3-mini-high model?
It appears o3-mini works in Cursor, but o3-mini-high, which is fine-tuned for coding, is not available yet. It is also not available on OpenRouter. Any idea when this will be fixed, @deanrie?
Thank you!!!
I don't think so? o3-mini != o3-mini-high
o3-mini-high isn't a model name. Instead, o3-mini-high represents the o3-mini model with a reasoning_effort parameter. For example:
{
  model: "o3-mini",
  reasoning_effort: "high"
}
Please support the reasoning_effort parameter. Thanks!
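For reference, this is roughly what a request with that parameter looks like against the OpenAI Chat Completions API, sketched with the official openai Node SDK (OPENAI_API_KEY in the environment is assumed; the prompt is just a placeholder):

import OpenAI from "openai";

// Call o3-mini with high reasoning effort; there is no separate
// "o3-mini-high" model ID, only the reasoning_effort parameter.
const client = new OpenAI();

async function main() {
  const completion = await client.chat.completions.create({
    model: "o3-mini",
    reasoning_effort: "high", // accepted values: "low" | "medium" | "high"
    messages: [{ role: "user", content: "Summarize this diff in one sentence." }],
  });
  console.log(completion.choices[0].message.content);
}

main();

Exposing that same parameter in Cursor's model settings would remove the ambiguity around which effort level "o3-mini" maps to.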
Hi, so if I want to use o3-mini-high, is it enough to select o3-mini in the Cursor settings? Thank you.
I get the same thing. Has anyone found a solution? o3 in Cursor isn't the best, I feel.
Hi all, o3-mini is set to high reasoning behind the scenes, but you cannot manually specify the reasoning effort with model names like o3-mini-high right now!
I recommend indicating that it is the high reasoning mode in:
- Cursor – Models
- Update: o3-mini pricing
- and in the Cursor app
Because the "o3-mini" moniker alone is underspecified.
Thanks danperks for the update. How can I make sure that o3-mini is truly set to high by default?
Hey, o3-mini is set to high behind the scenes by us on our back end and is not configurable by a user yet.
+100, big difference between o3-mini-med and o3-mini-high. It's fine to have just o3-mini in the IDE, but as Orpheus mentioned, having the details on the models page would be appreciated. Copy change = 1 story point
Guys, I still have a feeling that o3-mini for me is medium or even low. o1 is giving me significantly better results, and that shouldn't be the case if it's o3-mini-high. I have o3-mini-high in my menu but can't use it.
I agree, the output from the "o3-mini" model in Cursor is much less intelligent than the response from "o3-mini-high" on ChatGPT or via API calls. (e.g., for migrating @nuxt/content from v2 to v3, the same model gives different output on ChatGPT/API calls than within Cursor.)