Provide a clear description of the bug
If I enter an OpenAI API key and have deepseek-r1 enabled as a model, requests sent to the OpenAI API use deepseek-r1 as the model. I also cannot use deepseek-r1 at all while my OpenAI API key is enabled.
Output:
curl https://api.openai.com/v1/chat/completions -H "Content-Type: application/json" -H "Authorization: Bearer xxxxxxx" -d '{
  "messages": [
    {
      "role": "system",
      "content": "You are a test assistant."
    },
    {
      "role": "user",
      "content": "Testing. Just say hi and nothing else."
    }
  ],
  "model": "deepseek-r1"   <-- Why is it using deepseek-r1??
}'
On top of that, the deepseek-r1 model itself no longer works and returns an error. If you disable your OpenAI API key, deepseek-r1 works again with no issues.
Because of this, it is not possible to use deepseek-r1 while an OpenAI API key is enabled.
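For comparison, a request against the same endpoint succeeds when the model field names a model the OpenAI key actually serves. This is only a minimal sketch, assuming gpt-3.5-turbo is enabled and using a placeholder key:

# Same request, but with a model the OpenAI key can serve (placeholder key).
curl https://api.openai.com/v1/chat/completions -H "Content-Type: application/json" -H "Authorization: Bearer xxxxxxx" -d '{
  "messages": [
    {"role": "user", "content": "Testing. Just say hi and nothing else."}
  ],
  "model": "gpt-3.5-turbo"
}'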
Explain how to reproduce the bug (if known)
- Have the deepseek-r1 model enabled
- Perform a test query in Chat and confirm it works fine.
- Enable the OpenAI API key, enter a key, and press Verify
- You get an error saying the model is not valid for your API key
- Notice that at the bottom of the error message it is using deepseek-r1 as the model
- Enable the gpt-3.5-turbo model
- Enable and verify your OpenAI API key
- It will succeed now
- Try to run a query in Chat with deepseek-r1
- It will say "The model deepseek-r1 does not work with your current plan" (see the curl sketch below for checking which models the key actually serves)
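To confirm which models the OpenAI key really has access to, the models endpoint can be queried directly. A minimal sketch with the same placeholder key (deepseek-r1 is not expected to appear in the list):

# List the models available to this OpenAI API key (placeholder key).
curl https://api.openai.com/v1/models -H "Authorization: Bearer xxxxxxx"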
Tell us your operating system and your Cursor version (e.g., Windows, 0.x.x).
macOS 15.3.1 (24D70)
Version: 0.45.11
VSCode Version: 1.96.2
Commit: f5f18731406b73244e0558ee7716d77c8096d150
Date: 2025-02-07T09:43:58.555Z
Electron: 32.2.6
Chromium: 128.0.6613.186
Node.js: 20.18.1
V8: 12.8.374.38-electron.0
OS: Darwin arm64 24.3.0