Cannot use default model in latest cursor-cli version on grandfathered plan

@Colin let me know if I should create a separate thread for this, but on the latest version of cursor-cli we cannot use the "default" model for auto mode. I was using --model auto before and that worked, but now it doesn't. I'm on the grandfathered plan, so why is there a limit?

Please let me know if I need to downgrade as a workaround. This is affecting my entire workflow.

$ agent --version

2026.03.18-f6873f7

$ agent -p "what model is being used"                        
b: You've hit your usage limit for Opus You've saved $5185 on API model usage this month with Pro. Switch to Auto for more usage or set a Spend Limit to continue with Opus. Your usage limits will reset when your monthly cycle ends on 4/12/2026.

$ agent --model default -p "hello"

Cannot use this model: default. Available models: composer-2-fast, composer-2, composer-1.5, gpt-5.3-codex-low, gpt-5.3-codex-low-fast, gpt-5.3-codex, gpt-5.3-codex-fast, gpt-5.3-codex-high, gpt-5.3-codex-high-fast, gpt-5.3-codex-xhigh, gpt-5.3-codex-xhigh-fast, gpt-5.2, gpt-5.3-codex-spark-preview-low, gpt-5.3-codex-spark-preview, gpt-5.3-codex-spark-preview-high, gpt-5.3-codex-spark-preview-xhigh, gpt-5.2-codex-low, gpt-5.2-codex-low-fast, gpt-5.2-codex, gpt-5.2-codex-fast, gpt-5.2-codex-high, gpt-5.2-codex-high-fast, gpt-5.2-codex-xhigh, gpt-5.2-codex-xhigh-fast, gpt-5.1-codex-max-low, gpt-5.1-codex-max-low-fast, gpt-5.1-codex-max-medium, gpt-5.1-codex-max-medium-fast, gpt-5.1-codex-max-high, gpt-5.1-codex-max-high-fast, gpt-5.1-codex-max-xhigh, gpt-5.1-codex-max-xhigh-fast, gpt-5.4-high, gpt-5.4-high-fast, gpt-5.4-xhigh-fast, claude-4.6-opus-high-thinking, gpt-5.4-low, gpt-5.4-medium, gpt-5.4-medium-fast, gpt-5.4-xhigh, claude-4.6-sonnet-medium, claude-4.6-sonnet-medium-thinking, claude-4.6-opus-high, claude-4.6-opus-max, claude-4.6-opus-max-thinking, claude-4.5-opus-high, claude-4.5-opus-high-thinking, gpt-5.2-low, gpt-5.2-low-fast, gpt-5.2-fast, gpt-5.2-high, gpt-5.2-high-fast, gpt-5.2-xhigh, gpt-5.2-xhigh-fast, gemini-3.1-pro, gpt-5.4-mini-none, gpt-5.4-mini-low, gpt-5.4-mini-medium, gpt-5.4-mini-high, gpt-5.4-mini-xhigh, gpt-5.4-nano-none, gpt-5.4-nano-low, gpt-5.4-nano-medium, gpt-5.4-nano-high, gpt-5.4-nano-xhigh, grok-4-20, grok-4-20-thinking, claude-4.5-sonnet, claude-4.5-sonnet-thinking, gpt-5.1-low, gpt-5.1, gpt-5.1-high, gemini-3-pro, gemini-3-flash, gpt-5.1-codex-mini-low, gpt-5.1-codex-mini, gpt-5.1-codex-mini-high, claude-4-sonnet, claude-4-sonnet-1m, claude-4-sonnet-thinking, claude-4-sonnet-1m-thinking, gpt-5-mini, kimi-k2.5



$ agent --model auto -p "what model is being used"                                          
Cannot use this model: auto. Available models: composer-2-fast, composer-2, composer-1.5, gpt-5.3-codex-low, gpt-5.3-codex-low-fast, gpt-5.3-codex, gpt-5.3-codex-fast, gpt-5.3-codex-high, gpt-5.3-codex-high-fast, gpt-5.3-codex-xhigh, gpt-5.3-codex-xhigh-fast, gpt-5.2, gpt-5.3-codex-spark-preview-low, gpt-5.3-codex-spark-preview, gpt-5.3-codex-spark-preview-high, gpt-5.3-codex-spark-preview-xhigh, gpt-5.2-codex-low, gpt-5.2-codex-low-fast, gpt-5.2-codex, gpt-5.2-codex-fast, gpt-5.2-codex-high, gpt-5.2-codex-high-fast, gpt-5.2-codex-xhigh, gpt-5.2-codex-xhigh-fast, gpt-5.1-codex-max-low, gpt-5.1-codex-max-low-fast, gpt-5.1-codex-max-medium, gpt-5.1-codex-max-medium-fast, gpt-5.1-codex-max-high, gpt-5.1-codex-max-high-fast, gpt-5.1-codex-max-xhigh, gpt-5.1-codex-max-xhigh-fast, gpt-5.4-high, gpt-5.4-high-fast, gpt-5.4-xhigh-fast, claude-4.6-opus-high-thinking, gpt-5.4-low, gpt-5.4-medium, gpt-5.4-medium-fast, gpt-5.4-xhigh, claude-4.6-sonnet-medium, claude-4.6-sonnet-medium-thinking, claude-4.6-opus-high, claude-4.6-opus-max, claude-4.6-opus-max-thinking, claude-4.5-opus-high, claude-4.5-opus-high-thinking, gpt-5.2-low, gpt-5.2-low-fast, gpt-5.2-fast, gpt-5.2-high, gpt-5.2-high-fast, gpt-5.2-xhigh, gpt-5.2-xhigh-fast, gemini-3.1-pro, gpt-5.4-mini-none, gpt-5.4-mini-low, gpt-5.4-mini-medium, gpt-5.4-mini-high, gpt-5.4-mini-xhigh, gpt-5.4-nano-none, gpt-5.4-nano-low, gpt-5.4-nano-medium, gpt-5.4-nano-high, gpt-5.4-nano-xhigh, grok-4-20, grok-4-20-thinking, claude-4.5-sonnet, claude-4.5-sonnet-thinking, gpt-5.1-low, gpt-5.1, gpt-5.1-high, gemini-3-pro, gemini-3-flash, gpt-5.1-codex-mini-low, gpt-5.1-codex-mini, gpt-5.1-codex-mini-high, claude-4-sonnet, claude-4-sonnet-1m, claude-4-sonnet-thinking, claude-4-sonnet-1m-thinking, gpt-5-mini, kimi-k2.5

Update: something weird is going on. Now it's showing:

$ agent -p "what model is being used"                        
I'm powered by **Claude 4.6 Opus** (with extended thinking enabled).

$ agent -p "how much usage do i have left?"                                                 
b: Increase limits for faster responses claude-4.6-opus-high-thinking is not available in the slow pool. Please switch to Auto.

$ agent -p "what is 2*6"                                                                    
b: Increase limits for faster responses claude-4.6-opus-high-thinking is not available in the slow pool. Please switch to Auto.

Where does the bug appear (feature/product)?

Cursor CLI

Describe the Bug

Since today, I can't choose the Auto model anymore: Error: Unknown model: Auto.
The same applies to "auto".

agent --model "Auto" returns:
Cannot use this model: Auto. Available models: composer-2-fast, composer-2, composer-1.5, gpt-5.3-codex-low, gpt-5.3-codex-low-fast, gpt-5.3-codex, gpt-5.3-codex-fast, gpt-5.3-codex-high, gpt-5.3-codex-high-fast, gpt-5.3-codex-xhigh, gpt-5.3-codex-xhigh-fast, gpt-5.2, gpt-5.3-codex-spark-preview-low, gpt-5.3-codex-spark-preview, gpt-5.3-codex-spark-preview-high, gpt-5.3-codex-spark-preview-xhigh, gpt-5.2-codex-low, gpt-5.2-codex-low-fast, gpt-5.2-codex, gpt-5.2-codex-fast, gpt-5.2-codex-high, gpt-5.2-codex-high-fast, gpt-5.2-codex-xhigh, gpt-5.2-codex-xhigh-fast, gpt-5.1-codex-max-low, gpt-5.1-codex-max-low-fast, gpt-5.1-codex-max-medium, gpt-5.1-codex-max-medium-fast, gpt-5.1-codex-max-high, gpt-5.1-codex-max-high-fast, gpt-5.1-codex-max-xhigh, gpt-5.1-codex-max-xhigh-fast, gpt-5.4-high, gpt-5.4-high-fast, gpt-5.4-xhigh-fast, claude-4.6-opus-high-thinking, gpt-5.4-low, gpt-5.4-medium, gpt-5.4-medium-fast, gpt-5.4-xhigh, claude-4.6-sonnet-medium, claude-4.6-sonnet-medium-thinking, claude-4.6-opus-high, claude-4.6-opus-max, claude-4.6-opus-max-thinking, claude-4.5-opus-high, claude-4.5-opus-high-thinking, gpt-5.2-low, gpt-5.2-low-fast, gpt-5.2-fast, gpt-5.2-high, gpt-5.2-high-fast, gpt-5.2-xhigh, gpt-5.2-xhigh-fast, gemini-3.1-pro, gpt-5.4-mini-none, gpt-5.4-mini-low, gpt-5.4-mini-medium, gpt-5.4-mini-high, gpt-5.4-mini-xhigh, gpt-5.4-nano-none, gpt-5.4-nano-low, gpt-5.4-nano-medium, gpt-5.4-nano-high, gpt-5.4-nano-xhigh, grok-4-20, grok-4-20-thinking, claude-4.5-sonnet, claude-4.5-sonnet-thinking, gpt-5.1-low, gpt-5.1, gpt-5.1-high, gemini-3-pro, gemini-3-flash, gpt-5.1-codex-mini-low, gpt-5.1-codex-mini, gpt-5.1-codex-mini-high, claude-4-sonnet, claude-4-sonnet-1m, claude-4-sonnet-thinking, claude-4-sonnet-1m-thinking, gpt-5-mini, kimi-k2.5

Steps to Reproduce

agent
/model Auto
/model auto

agent --model "Auto"
agent --model "auto"

Expected Behavior

Be able to use the model Auto in the CLI

Screenshots / Screen Recordings

Operating System

macOS

Version Information

CLI Version 2026.03.18-f6873f7

Does this stop you from using Cursor

No - Cursor works, but with this issue


@deanrie

Hi!

I just posted an error message on the Help board, but I think the bot miscategorized it and my post was hidden. Looking at the merged posts here, I'm commenting because I don't think I need to file an additional bug report.

I can't use Auto mode in the Cursor CLI in Windows PowerShell now. I thought I was still in Auto mode, so I kept working; as time went on the responses got so slow that I went to check the model setting, and found it had been changed to Claude! I was using Auto mode, and the model was suddenly switched without my knowledge.

I used to save Claude and only use it at important moments, and I think this used up my tokens for the month.

Please check what happened.

Please check what happened.

Thank you

Did they remove the auto model in cursor-agent 2026.03.18-f6873f7??

No, it was working yesterday.

Hey everyone! This is a confirmed regression in the CLI version. The team is aware and is tracking the issue.

As a temporary workaround until a fix is released, you can explicitly set a model, for example:

agent --model claude-4.6-sonnet-medium -p "your prompt"

Or use any other model from the available list.
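Until the fix ships, one way to avoid retyping the flag on every call is a small shell wrapper that pins an explicit model. This is just a sketch, not an official feature: the function name `cagent` and the `CAGENT_MODEL` variable are hypothetical, and only the `agent --model ... -p ...` invocation itself comes from this thread.

```shell
# Hypothetical wrapper: pins an explicit model for every agent call
# until the Auto regression is fixed. Put it in ~/.bashrc or ~/.zshrc.
CAGENT_MODEL=${CAGENT_MODEL:-claude-4.6-sonnet-medium}

cagent() {
    # Forward all remaining arguments (e.g. -p "your prompt")
    # after the pinned --model flag.
    agent --model "$CAGENT_MODEL" "$@"
}
```

Then `cagent -p "your prompt"` behaves like the workaround above; set `CAGENT_MODEL` to another name from the available-models list to switch.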

@water1, regarding your concern that the model switched from Auto to Claude and started using your quota: this is most likely related to the same issue. Since Auto was unavailable, the CLI may have fallen back to a specific model. Unfortunately, I can't adjust the charges retroactively, but the fix for Auto should prevent this going forward.

I’ll update this thread when the fix is out. Sorry for the trouble.


hi @deanrie the workaround is not working for me

agent -p "test"                                          
b: Increase limits for faster responses claude-4.6-sonnet-medium is not available in the slow pool. Please switch to Auto.

agent --model claude-4.6-sonnet-medium -p "hi there"
b: Increase limits for faster responses claude-4.6-sonnet-medium is not available in the slow pool. Please switch to Auto.

Hey, I see the workaround doesn’t work in your case, sorry about that.

The issue is that you hit the usage limit. Usually the system would switch you to Auto, but Auto is currently unavailable due to a bug, so it turns into a loop.

You can try low-tier models that can run in the slow pool:

agent --model gpt-5.4-low -p "your prompt"
# or
agent --model gpt-5.2-low -p "your prompt"
# or
agent --model composer-2-fast -p "your prompt"

If that still doesn't work, the CLI will stay blocked when you hit your limits until Auto is fixed.
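The try-each-model-in-turn idea above can be scripted. A minimal sketch, assuming the CLI exits with a non-zero status when a model is rejected or rate-limited (an assumption worth verifying on your version); the function name `try_models` is made up for illustration:

```shell
# try_models PROMPT MODEL... : run the agent with each model in turn,
# stopping at the first one that succeeds (exit status 0).
# ASSUMPTION: a rejected/rate-limited model makes `agent` exit non-zero.
try_models() {
    prompt=$1
    shift
    for model in "$@"; do
        if agent --model "$model" -p "$prompt"; then
            return 0    # this model worked
        fi
        echo "model $model failed, trying the next one" >&2
    done
    echo "all models failed" >&2
    return 1
}
```

For example, `try_models "your prompt" gpt-5.4-low gpt-5.2-low composer-2-fast` walks the same fallback list as above.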

Hi!

Thank you for your explanation.

It would have been better if the next chat had been interrupted, or at least switched to a cheaper model, once the saved Auto mode setting became unavailable. Queued requests kept running on more expensive models, making it hard to notice what was happening.

I understand that auto mode is currently unavailable, so I’ll wait.

Current Status:

  • agent --model Auto -p "Hi"
  • "This model is not available: Auto."

Thank you


How funny! A bug that makes you money :sweat_smile:.

But seriously :folded_hands: please fix this :sad_but_relieved_face:


Auto is working for me!

$ agent --model gpt-5.2-low -p "your prompt"
b: Increase limits for faster responses You're out of usage. Switch to Auto, or ask your admin to increase your limit to continue.
$ agent --model gpt-5.4-low -p "your prompt"
b: Increase limits for faster responses You're out of usage. Switch to Auto, or ask your admin to increase your limit to continue.
$ agent --model composer-2-fast -p "your prompt"
b: Increase limits for faster responses You're out of usage. Switch to Auto, or ask your admin to increase your limit to continue.

Which version of the CLI are you on? What do you get if you run agent --version?

It’s working.
Thanks @deanrie and team! :clap:


This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.