I’m building a Cursor/VSCode extension with a custom interface, and I’m trying to hook into Cursor’s LLM support.
I’m using code like this in the extension host and expecting to see some of Cursor’s models. It works in VSCode, but I can’t see any models in Cursor (I also tried calling selectChatModels with no parameters).
async function checkForLM() {
  // Ask for models in the gpt-4o family; returns [] if none are available
  const models = await vscode.lm.selectChatModels({ family: "gpt-4o" });
  console.log("Found models", models);
  if (models.length > 0) {
    const [first] = models;
    // Build the message with the LanguageModelChatMessage factory
    const response = await first.sendRequest([
      vscode.LanguageModelChatMessage.User("Hello, how are you?"),
    ]);
    // Consume the streamed reply
    for await (const fragment of response.text) {
      console.log(fragment);
    }
  } else {
    // NO chat models available
  }
}