Codex model from OpenAI

Trying to use the Codex model from OpenAI and getting this:

This model is only supported in v1/responses and not in v1/chat/completions.

It would be nice to have this fixed.
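For reference, the model does work when called directly through the Responses API endpoint the error message points to, rather than Chat Completions. A minimal sketch with the OpenAI Python SDK, assuming a placeholder model identifier (substitute whatever Codex model your account actually exposes):

```python
# Minimal sketch: calling a Responses-only model with the OpenAI Python SDK.
# "gpt-5-codex" is a placeholder model name, not necessarily the exact
# identifier from the error above.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This hits POST /v1/responses, the endpoint the error says the model
# requires; client.chat.completions.create() would return the same error.
response = client.responses.create(
    model="gpt-5-codex",
    input="Write a Python function that reverses a string.",
)

print(response.output_text)
```

So the request here is essentially for Cursor to route this model through /v1/responses instead of /v1/chat/completions.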


I second this. OpenAI's models have performed better for me than Claude or Gemini, so it would be great to get their best model in Cursor. It is readily available now via the API, but not in Cursor.