Claude 4 Sonnet doesn't understand what you're saying in non-English languages

As the title suggests, have you encountered this? It has happened to me several times. My language is Chinese. I told it that modifying the backend is forbidden and that the problem is in the frontend, and Claude 4 replied: OK, I'll modify the backend…

It seems it doesn't understand what I'm saying. This has only happened with this one model.

Sometimes I'll tell it to do A, and it does B. It won't do what I ask. It's as if the training data were contaminated.

It seems to be partly an issue with model training and post-training reinforcement, not a language issue.

It could be that Anthropic will adjust this, as they did with previous models.

Maybe. I just gave 3.7 the same prompt and it fixed it for me. :thinking:

Although the model supports multiple languages, I think it might sometimes struggle to understand you if you’re using complex languages like Chinese. Gemini models also have this issue.


Thank you for your response. This is a rare problem; it understands most of the vocabulary, and I just need to tweak my prompts. :grinning_face_with_smiling_eyes: