Increase support for gpt-5-mini and gpt-5-codex

Feature request for product/service

Cursor IDE

Describe the request

Even after Sonnet 4.5, gpt-5 remains the best model for my type of usage by a significant margin (Rust, Solidity, Node.js, some finance work). However, it’s a slow model, and gpt-5-codex doesn’t seem to achieve results as good as what I get in the codex-cli. I’m not fully sure whether that’s a system-prompt thing or related to the adaptive reasoning effort used in Cursor’s integration (the latter seems to be the case). So, I’d be extremely happy if:

  • gpt-5-mini-fast and gpt-5-mini-high-fast were supported. The current medium-reasoning version doesn’t seem to use OpenAI’s priority lane, so its latency looks similar to gpt-5-fast’s. If a “fast” variant and a high-reasoning one were added, I think both would be really strong daily drivers for those of us who love gpt-5 but sometimes need a faster model that’s still smarter than Sonnet for our use cases. From what I’ve tested, I still get really good, and arguably stronger, results with gpt-5-mini than with Sonnet.
  • Support for selecting the reasoning effort for gpt-5-codex were added, so its results are closer to what we get when using its CLI.

I love Cursor’s UX, especially the new Agent Window that lets me stay in flow while collaborating with gpt-5’s variants, but I still have to resort to the codex-cli when I have a bigger task and need to make sure the model stays at high reasoning effort. These additions would make me really happy =D

Operating System (if it applies)

macOS

Hi, can you try gpt-5-codex again starting tomorrow? We fixed a critical bug that was reducing its performance, and we believe you’ll be much happier with the results at that point. Thanks, and please let me know what you think!

It’s much better and I see you guys now support GPT-5 Codex High, thanks a lot!