I don’t think these data centers really are for LLMs. Right now I can go to a dozen websites and use some LLM, without sitting on a wait list, for exactly $0.00 out of my pocket. So there’s obviously enough processing power to meet demand as-is, but… what? Demand will skyrocket when they crank up the fees? OpenAI ran a deficit of around $18 billion last year; is everyone really going to pay $200–$600 per month for this? Plus, LLMs are reaching a plateau: more data doesn’t produce a more coherent model, and they’re running into a dead end.
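Just to put rough numbers on that (the ~$18B deficit figure taken at face value; the rest is my own back-of-envelope arithmetic, not anything from a filing):

```python
# Back-of-envelope: how many paying subscribers would it take to cover
# a ~$18B annual deficit from subscription revenue alone?
# Assumption (mine): subscriptions are the only revenue offsetting it.
deficit = 18e9  # reported annual deficit, USD

for monthly_fee in (200, 600):
    annual_revenue_per_sub = monthly_fee * 12
    subscribers_needed = deficit / annual_revenue_per_sub
    print(f"${monthly_fee}/mo -> {subscribers_needed / 1e6:.1f}M subscribers to break even")
# -> 7.5M subscribers at $200/mo, 2.5M at $600/mo
```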
My local data center is steamrolling over public opinion. We’re not allowed to ask who will own it, how much power it will consume, nothing. “Officially,” the installation has stalled, but they’re still bulldozing the trees to clear the lot where it’s supposed to go.
My personal conspiracy theory is that this is coming from Palantir, laundering resources through the tech companies, using DoD money. The data centers aren’t for LLMs, but to build out a massive dragnet to track civilian travel, who goes where and when, to be used by DHS. That would explain why they need to be distributed geographically in proportion to population, the extreme secrecy around them, and the way utility companies and local politicians keep bending over despite public outcry.
It’s not consumers they charge.
They were talking about businesses budgeting about 50% of a developer’s salary, per developer, on tokens.
A consumer $20 Claude Code account gets about $1800 worth of tokens, IIRC.
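Taking both of those numbers at face value, the scale is easy to sketch. (The $150k salary below is my own round hypothetical for illustration; only the 50% budget, the $20 plan, and the $1800 token value come from the thread.)

```python
# Sketch of the two claims above. Salary is a hypothetical round figure.
dev_salary = 150_000            # assumed annual developer salary, USD
token_budget = dev_salary / 2   # "50% of a developer's salary on tokens"
print(f"Token budget per dev: ${token_budget:,.0f}/yr (${token_budget / 12:,.0f}/mo)")
# -> $75,000/yr ($6,250/mo)

plan_price = 20     # consumer Claude Code plan, USD/month (from the thread)
token_value = 1800  # claimed metered value of included tokens, USD (from the thread)
print(f"Implied subsidy on the $20 plan: {token_value / plan_price:.0f}x")
# -> 90x
```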
What determines the value of a token? Is it a Stanley Nickels calculation, or did they actually attempt to tie it out to infrastructure and operating costs? If the latter, then anyone using these systems needs to be prepared for a serious rug pull, as that’s squarely in “the first hit is free” territory.
They wanted companies to normalise a dollar spend on tokens equivalent to half a dev’s salary, per dev.
They want to pay you in tokens, too.
That doesn’t answer the question. What determines “$1800 worth of tokens”? Is that value calculated from compute time plus infrastructure cost? Is it what they think the equivalent human work would cost for the time the query takes to run? Or is it an entirely arbitrary number?
If it’s the last one, they’re most likely running at a loss, and it’s going to bite them hard when the infrastructure bill comes due.
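For what a cost-based version of that calculation could even look like: every number here is an illustrative assumption of mine (GPU rental price, throughput), not anything disclosed by a vendor.

```python
# Sketch of cost-based token pricing under stated assumptions:
# raw compute cost per token = GPU rental cost / token throughput.
gpu_hour_cost = 3.00       # assumed cloud GPU rental, USD/hour
tokens_per_second = 1000   # assumed aggregate serving throughput on that GPU

tokens_per_hour = tokens_per_second * 3600
cost_per_million = gpu_hour_cost / tokens_per_hour * 1e6
print(f"~${cost_per_million:.2f} per million tokens of raw compute")
# -> ~$0.83 per million tokens
```

Ignores power, cooling, networking, staff, and model training amortization, which is exactly why "compute time plus infrastructure" and "arbitrary number" can diverge so far.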
If I recall, it was something Ed Zitron said, but I can’t find the quote, so it might not be him. The implication was that the cost of delivering that $20 user account was approximately $1800 worth of compute.