OpenAI New o1 models

I found this:

2 Likes

I knew about it, but this is my first time using o1-mini. It should include 10 requests per day, but it doesn't. :sweat_smile:

1 Like

Both o1-mini and the larger o1-preview are asking me to enable usage-based pricing. I'm already a Pro subscriber, so why the additional expense?

1 Like

According to https://livebench.ai/, o1-mini currently seems to be the best at code generation:

Also this shows interesting results:

1 Like

Anyone have a good “system prompt” or .cursorrules setup? I am getting a lot more “Apply” errors with o1, even though it is making better suggestions. I think it is because o1 makes code suggestions for the same file multiple times, showing the tiny block changes, and then also doing a complete rewrite at the end.

2 Likes

Oh, I’ve wondered how one sets the system prompt with Cursor. How are you doing that?

I know about ‘Rules for AI’ and .cursorrules, but I don’t know how to edit the system prompt.
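As far as I know, Cursor doesn't expose the raw system prompt for editing; the ‘Rules for AI’ setting and a .cursorrules file in the project root are the closest equivalents, since their contents are prepended to the model's context. Below is a minimal sketch of a .cursorrules file aimed at the duplicate-edit problem mentioned above. The wording is my own invention, not an official schema — .cursorrules is just free-form plain text:

```
# .cursorrules — free-form plain-text instructions picked up by Cursor's AI features
When editing a file, show each change exactly once.
Do not repeat the full file after showing partial edit blocks.
Keep edit blocks minimal and anchored to the surrounding unchanged code.
```

Whether this actually reduces “Apply” errors with o1 is untested; treat it as a starting point, not a known fix.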

When I searched “o1 cursor”, I found a tutorial about using o1 in Cursor. I had never heard of this, but I just tried it and everything seems to work: How to use GPT O1 Preview (o1-preview) on Cursor | ModelBox Blog

1 Like

Is long context available with the API? I am using o1-preview-128k, but it doesn’t seem to work.

1 Like

It seems that when I turn on my OpenAI key and try to use o1-preview-128k or o1-mini-128k in the Long Context Chat, I get the error “Failed to reach Anthropic”. After waiting a bit the issue resolves and the models start working in the Long Context Chat. I also tried adding openai/o1-preview-128k instead of o1-preview-128k, and perhaps that was related to the resolution.

I’m using Openrouter btw:
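For anyone hitting the same naming confusion: OpenRouter expects provider-prefixed model IDs (e.g. openai/o1-preview), while the -128k suffix appears to be Cursor's own convention for making a model eligible for Long Context Chat, as noted elsewhere in this thread. A minimal sketch of what an OpenRouter chat request body would look like — the payload is only built here, not sent, and the exact endpoint/headers should be checked against OpenRouter's own docs:

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint (per their docs).
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

# OpenRouter model IDs carry a provider prefix; a bare "o1-preview-128k"
# may not resolve on their side.
payload = {
    "model": "openai/o1-preview",  # provider prefix required by OpenRouter
    "messages": [{"role": "user", "content": "Hello"}],
}

body = json.dumps(payload)
print(body)
```

Sending this would additionally need an `Authorization: Bearer <key>` header with your OpenRouter API key.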

1 Like

This is what I am getting. And there is no way to add a model to long context unless you append -128k or -200k at the end for OpenAI and Anthropic; Gemini seems different. I wish we had other ways to add models to long context.
{"error":{"message":"The model o1-preview-128k does not exist or you do not have access to it.","type":"invalid_request_error","param":null,"code":"model_not_found"}}

1 Like

I have had a few API errors where it failed to connect to OpenAI, and I was charged for the runs even though they did not run.

Today I tried again and it worked fine, included in Pro.

@truell20 Would it be possible to add support for o1 long-context models for API users?

1 Like

Good news :+1:

I can see that the o1-mini and o1-preview options disappear when we try to use our own OpenAI API key. It seems we can only use the API provided by Cursor for now, which means only 5–10 chats per day? I'm waiting for a feature to add my own API key, so I can use it as much as I want.

1 Like

Hi, you can use openrouter.ai, or if you have Tier 5 OpenAI API access, you can use it directly. You get 10 fast requests per day, after which you will receive slow requests.

Hey, my first o1-preview request went flawlessly after Claude had failed multiple times. However, my second and third o1-preview requests did nothing but duplicate part of my code and place it at the top. Is this a known issue? Should I stop using it? Don't want to waste more $ :')

I used the composer window btw.

Without any extra charge for using o1-mini after 10 requests?

I think so, but it switches from fast requests to slow requests.