I asked the agent to do something quite reasonable and it instead generated this image
It would be funnier if it didn’t likely burn up a mountain of credits.
Haha, that’s funny. Maybe you meant ‘generated’, as in the typical term used for images. If you didn’t generate it, then it might be a bug.
What's your prompt and model?
Assuming it's using Gemini 3 Pro: Gemini 3 Pro often has a high hallucination rate, or it could be a bug in the Cursor Agent harness.
If you believe it's a bug, it's better to file a bug report with the request ID and Cursor version.
Heh, actually funny, cheers!
My prompt was something like, “Remove values from a list that are not contained in the following list…”. I didn't use the word ‘generated’. This was just a GPT-5.2 hallucination, 100%. It's only ever happened to me this once. It's not worth an actual bug report; it's just funny.
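For reference, the task I was asking for is trivial list filtering. A minimal sketch (the lists and variable names here are hypothetical, not from my actual session):

```python
# Keep only the values that also appear in an allowed list (hypothetical data).
values = [1, 2, 3, 4, 5]
allowed = [2, 4, 5, 9]

allowed_set = set(allowed)  # set membership keeps this O(n) instead of O(n*m)
filtered = [v for v in values if v in allowed_set]
print(filtered)  # [2, 4, 5]
```

Nothing about that should produce an image.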