Support for Ollama-provided local models

Feature request for product/service

AI Models

Describe the request

I would love for Cursor to officially support local Ollama endpoints.
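For context, Ollama serves an OpenAI-compatible HTTP API on localhost (port 11434 by default), so an integration could reuse an existing OpenAI-style client pointed at a local base URL. A minimal sketch of the request shape, assuming the default port and a locally pulled model name like `llama3` (the model name is illustrative):

```python
import json

# Ollama's OpenAI-compatible API lives under /v1 on port 11434 by default.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model, messages, base_url=OLLAMA_BASE_URL):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({"model": model, "messages": messages})
    return url, body

url, body = build_chat_request("llama3", [{"role": "user", "content": "Hello"}])
print(url)  # http://localhost:11434/v1/chat/completions
```

Because the wire format matches OpenAI's chat completions API, supporting a configurable base URL would be enough to cover this use case.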