What are the token context lengths of different models?

Hello there, can anybody tell me the context window lengths of all the major models, if possible?

Or at least for Gemini Pro exp xxx, GPT-4o, and Claude 3.5 Sonnet?

Thanks in advance!
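In case it helps future readers, the commonly published figures for those models can be captured in a small lookup table. This is only a sketch: the model names are illustrative (I'm using Gemini 1.5 Pro as a stand-in for the experimental Gemini Pro builds, whose limits varied), and providers change these limits over time, so verify against the official docs before relying on them.

```python
# Published input context window sizes, in tokens, as of roughly mid-2024.
# NOTE: these values are assumptions based on provider announcements and
# may be outdated -- always check the official documentation.
CONTEXT_WINDOWS = {
    "gpt-4o": 128_000,             # OpenAI GPT-4o
    "claude-3-5-sonnet": 200_000,  # Anthropic Claude 3.5 Sonnet
    "gemini-1.5-pro": 2_097_152,   # Google Gemini 1.5 Pro (up to ~2M tokens)
}

def context_window(model: str) -> int:
    """Look up the published context window for a known model name."""
    return CONTEXT_WINDOWS[model]

print(context_window("gpt-4o"))  # -> 128000
```

The Gemini API also exposes these limits programmatically (the model metadata includes input and output token limits), which is a more reliable source than a hardcoded table like this one.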