Ollama local LLM support request

Feature request for product/service

AI Models

Describe the request

Hi dev team — many of us want to run local Llama models instead of relying on remote APIs. Please consider adding support for local LLM access via Ollama, so Cursor can connect to models hosted on the user's own machine. It would be a great fit for Cursor ;->
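For context, Ollama exposes an HTTP API on `localhost:11434` by default, so Cursor could talk to it like any other completion backend. Below is a minimal sketch of what such a request looks like; the `build_request` helper and the `llama3` model name are illustrative assumptions, not part of any existing Cursor code.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server.

    Hypothetical helper for illustration; model name is an assumption.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Example usage (requires a running Ollama instance with the model pulled):
# with urllib.request.urlopen(build_request("llama3", "Hello")) as resp:
#     print(json.load(resp)["response"])
```

The actual network call is left commented out since it needs a local Ollama server; the point is that the integration surface is just plain JSON over HTTP.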