Custom API key breaks other models

Thanks for waiting. This is a known bug: the team is tracking the prompt_cache_retention issue with GPT-5 High/Fast (see related topic: Issue when using custom OpenAI Key with GPT 5 High Fast).

Unfortunately, there’s no ETA for a fix yet. For now, please use these workarounds:

  • Turn off your OpenAI API key when using Claude/Gemini/Grok. The shortcut Cmd+Shift+0 toggles the API key on or off quickly
  • Use GPT-5 Codex instead of GPT-5 High when you need your custom OpenAI key

I know this isn’t ideal for your workflow. I’ll flag this thread so the team sees the ongoing impact.
