I’m pretty sure the system prompt includes something like “never ask the user for information.” Even when I have a rule explicitly stating that it should stop and ask me when it’s unsure of something, it ignores the rule.
This usually takes the form of some type errors it doesn’t understand, followed by “let’s simplify and…”, at which point it goes entirely off script and implements whatever the ■■■■ it wants just to silence the type errors.
This is super frustrating. I have to watch it carefully for the “let’s simplify and…”, because that’s always when it goes off script and mangles things, when it could instead follow my rule: stop and ask. Even the expensive 3.7 Max does this!
I think it’s a bug in the system prompt, which the models absolutely refuse to reveal or ignore; you basically get “I’m sorry Dave, I’m afraid I can’t do that” if you ask the model what its system prompt is, or ask it to ignore the system prompt.