I am trying to set up LM Studio with Cursor AI to use a local AI for my code instead of their suggested models. I have followed a few tutorials online, but none of the methods work for me. I added the link to my LM Studio server in the OpenAI override box, but every time I try it, it says “The model *name of the model* does not work with your current plan or api key”. I have the Pro plan, so I don’t think that is the problem.
Hey, thanks for the report. Cursor doesn’t support local models, so what you’re doing is a workaround and stable operation isn’t guaranteed. You can try setting it up through OpenRouter instead, and it should work.
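For context, OpenRouter exposes an OpenAI-compatible API, so the same kind of override points at its endpoint with an OpenRouter key. Below is a minimal sketch of the equivalent request made outside Cursor, assuming the standard `https://openrouter.ai/api/v1` base URL, an `OPENROUTER_API_KEY` environment variable, and a placeholder model id (pick a real one from OpenRouter’s catalog):

```python
# Minimal sketch: send one chat request through OpenRouter's OpenAI-compatible API.
# Assumptions: OPENROUTER_API_KEY is set in the environment and the model id
# below is a placeholder, not a recommendation.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",          # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],          # your OpenRouter key, not an OpenAI key
)

response = client.chat.completions.create(
    model="some-provider/some-model",                  # placeholder model id
    messages=[{"role": "user", "content": "Reply with one word."}],
)
print(response.choices[0].message.content)
```

If a request like this works with the same key and model id you put into Cursor, the “plan or api key” error is more likely coming from Cursor’s side than from your endpoint settings.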