17 Billion Tokens

I just received a 1-year anniversary badge notification from the forum. I hadn’t shared my usage for 2025 here. How many tokens have you used?


That’s a lot of tokens! What did you mainly build with them?

Here is mine!

You do realise you were a major contributor to these things, don’t you?

  1. Rising energy costs
  2. Rising memory prices
  3. Water crisis due to high consumption by AI data centres

Using a high number of tokens is nothing to boast about; it’s just a flex of how much money you have spent on LLM services and pumped into the “AI” trend.

ESG was tossed aside by these companies a while back.

No offence, dude; you are just a user, but we should be aware of the consequences of our actions beyond personal gain, hence this message.


Seriously? The total COMBINED usage represented by all three people in this thread so far is a TEENSY DROP IN THE BUCKET compared to the grand total usage of AI by the world today. This guy isn’t even moving the needle here!!

OpenRouter ALONE, in just HALF the year last year, reported they had processed 100 TRILLION tokens. So if we take 18 billion out of 100 trillion? That is a paltry 0.018%. Not one percent, not a tenth of a percent, not even two hundredths of a percent here. And that is ONLY relative to OpenRouter’s publicized HALF-YEAR token rate! That isn’t even relative to the ENTIRE AI usage by users, let alone the grand total energy cost of AI for everything related to AI…
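For anyone who wants to check the percentage, here is a quick sanity check using only the figures quoted in the post (which I have not independently verified):

```python
# Back-of-the-envelope check: 18 billion personal tokens vs. the
# 100 trillion tokens OpenRouter reportedly processed in half a year.
# All figures come from this thread, not an independent source.
personal_tokens = 18e9          # ~18 billion tokens (the poster's usage)
openrouter_half_year = 100e12   # 100 trillion tokens (reported half-year total)

share_percent = personal_tokens / openrouter_half_year * 100
print(f"{share_percent:.3f}%")  # → 0.018%
```

Two hundredths of a percent, exactly as stated, and only against one provider’s half-year throughput.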

Now, if you take the ENTIRE COMMUNITY of agentic programmers together, WHICH WOULD INCLUDE YOU, then sure, we’re twitching the needle a bit. Again, though, even if you took ALL the actual AI USAGE combined, it is still not the majority of the COST of AI. The major costs involve the constant training efforts to enhance, refine, and create new models, which require significant computational resources to build the models in the first place.

So, to be quite frank: What a monumentally uninformed post. The total combined usage in this thread doesn’t even register as NOISE in the overall signal of AI energy cost!

Actually, OpenRouter’s study covered barely even a third of the year. They stated that their total daily burn was 8.6 trillion tokens PER DAY. They were still experiencing growth, so by the end of the year the token burn was undoubtedly even higher. Assuming a constant 8.6 trillion tokens a day, though, over 365 days, that means a grand total of 3139 TRILLION TOKENS were used, through OpenRouter ALONE! Just in 2025!
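The annual extrapolation above is easy to reproduce; this sketch assumes the post’s 8.6 trillion tokens/day figure held constant, which the post itself notes is an underestimate given continued growth:

```python
# Extrapolate a constant daily token rate to a full year.
# The 8.6 trillion/day figure is taken from the post, not verified here.
daily_tokens = 8.6e12           # 8.6 trillion tokens per day
annual_tokens = daily_tokens * 365

print(f"{annual_tokens / 1e12:.0f} trillion tokens")  # → 3139 trillion tokens
```

Against that 3,139-trillion-token baseline, an individual’s 18 billion tokens shrinks to well under a thousandth of a percent.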

This was the most cringe thing I have ever read.

Get off your high horse.