Hi Dan, thanks for replying.
These are the parameters I replaced in the OpenAI API Key configuration:
• Model: ollama
• URL: https://-----.ngrok-free.app/v1
However, I get the following error:
(status code 0)
TypeError: Failed to fetch
curl https://--------.ngrok-free.app/v1/chat/completions -H "Content-Type: application/json" -H "Authorization: Bearer ollama" -d '{
  "messages": [
    {
      "role": "system",
      "content": "You are a test assistant."
    },
    {
      "role": "user",
      "content": "Testing. Just say hi and nothing else."
    }
  ],
  "model": "deepseek-r1:1.5b"
}'
When I check the ngrok logs, I see this response:
HTTP Requests
16:45:20.677 -04 OPTIONS /v1/chat/completions 403 Forbidden
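To narrow down where that 403 comes from, I was thinking of reproducing the browser's preflight by hand. This is just a sketch; the Origin value is a placeholder for whatever origin the app actually runs on:

curl -i -X OPTIONS https://--------.ngrok-free.app/v1/chat/completions \
  -H "Origin: https://app.example.com" \
  -H "Access-Control-Request-Method: POST" \
  -H "Access-Control-Request-Headers: authorization, content-type"

If this also comes back 403 with no Access-Control-Allow-* headers, the preflight is being rejected before the actual POST is ever sent.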
However, when I run the request via cURL in the CLI, it works perfectly:
{"id":"chatcmpl-91","object":"chat.completion","created":1738266602,"model":"deepseek-r1:1.5b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"\u003cthink\u003e\nAlright, the user wants me to test how they're interacting with my chat interface.\n\nI should wait for their input before I respond.\n\nMaybe after a few sentences, they can just confirm whether I'm ready or not.\n\nThis way, it keeps the conversation flowing smoothly without any abrupt stops.\n\u003c/think\u003e\n\nGreat! Just tell me anything. I'll be happy to assist you or talk about something else with you."},"finish_reason":"stop"}],"usage":{"prompt_tokens":18,"completion_tokens":85,"total_tokens":103}}
What fails in the ngrok logs is the browser's OPTIONS preflight, which cURL never sends, so the issue seems to be with how that preflight (CORS) is handled somewhere between the app, ngrok, and Ollama rather than with the request itself.
Any ideas on what could be causing this?
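In case the 403 is coming from Ollama's own CORS check rather than from ngrok, one thing I was considering is restarting Ollama with browser origins explicitly allowed via its OLLAMA_ORIGINS environment variable (the wildcard here is only for testing):

OLLAMA_ORIGINS="*" ollama serve

But I'm not sure whether that's the right place to look, or whether ngrok's free tier is interfering with the OPTIONS request before it even reaches Ollama.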