When will GPT-4 Turbo be added for pro users?

In fact, GPT-4 Turbo's input tokens are 3 times cheaper than the GPT-4 model currently used in the Pro plan, although its context window is 15 times larger. Even assuming the average user fills 15 times more context per request simply because they "can" (which is unlikely), the same cost ceiling as the current GPT-4 limit could be kept by capping Turbo at ~100 requests (500 divided by 15, multiplied by 3).
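A minimal sketch of that arithmetic, assuming a 500-request Pro limit, a 1/3 input-token price, and a worst-case 15x context multiplier (all figures taken from the reasoning above, not from any official pricing page):

```python
# Back-of-the-envelope sketch of the request-limit math above.
# Assumptions: the Pro plan budgets for 500 GPT-4 requests, Turbo input
# tokens cost 1/3 as much, and every Turbo request uses 15x more context.

GPT4_REQUEST_LIMIT = 500      # current Pro plan limit (assumed)
PRICE_RATIO = 1 / 3           # Turbo input price relative to GPT-4
CONTEXT_MULTIPLIER = 15       # how much more context each request uses

# Cost of one worst-case Turbo request relative to one GPT-4 request
relative_cost_per_request = CONTEXT_MULTIPLIER * PRICE_RATIO  # = 5x

# Requests allowed at the same total cost as the current limit
turbo_request_limit = GPT4_REQUEST_LIMIT / relative_cost_per_request

print(f"Equivalent Turbo request limit: ~{turbo_request_limit:.0f}")  # ~100
```

In practice most users would not max out the larger window on every request, so the real break-even limit would likely sit well above 100.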


Do we know it has a bigger context window? They said they are looking into it, but for now it's still 8k: GPT-4 or GPT-4 Turbo? - #10 by arvid220u

Maybe I missed a thread where they announced expanded context?