“I’ve been saving for months to get the Corsair Dominator 64GB CL30 kit,” one beleaguered PC builder wrote on Reddit. “It was about $280 when I looked,” said u/RaidriarT, “Fast forward today on PCPartPicker, they want $547 for the same kit? A nearly 100% increase in a couple months?”
I just got a 2x64GB 6000 kit before its price skyrocketed by like $130. I saw other kits going up, but had no clue I timed it so well.
…Also, why does “AI” need so much CPU RAM?
In actual server deployments, pretty much all inference work is done in VRAM (read: HBM/GDDR); they could get by with almost no system RAM. And honestly most businesses are too dumb to train anything that extensively. ASICs that would use, say, LPDDR are super rare, and stuff like Hybrid/IGP inference is the realm of a few random folks with homelabs… Like me.
I think ‘AI’ might be an overly broad term for general server buildout.
There was a recent-ish model, Qwen Next, that was advertised as something that can be run entirely on RAM.
They can ALL be run on RAM, theoretically. I bought 128GB so I can run GLM 4.5 with the experts offloaded to CPU, with a custom trellis/K quant mix; but this is a ‘personal use’ tinkerer setup basically no one but hobbyists will touch.
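Rough napkin math for why a split like that works (every number below is an illustrative assumption: a GLM-4.5-class MoE at roughly 355B total / 32B active parameters, ~3 bits per weight for a low-bit quant mix, and the active share used as a crude stand-in for the non-expert weights that stay on the GPU):

```python
# Napkin math for offloading MoE experts to system RAM.
# All figures are illustrative assumptions, not measured numbers.

def weight_gib(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of the quantized weights in GiB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 2**30

total_b  = 355   # assumed total parameters for a GLM-4.5-class MoE
active_b = 32    # assumed active parameters per token
bpw      = 3.0   # assumed average bits/weight for a low-bit trellis/K mix

total_gib  = weight_gib(total_b, bpw)
gpu_share  = active_b / total_b        # crude proxy for attention/shared layers kept in VRAM
vram_gib   = total_gib * gpu_share
sysram_gib = total_gib - vram_gib      # expert tensors routed to the CPU

print(f"total ~{total_gib:.0f} GiB: ~{vram_gib:.0f} GiB VRAM + ~{sysram_gib:.0f} GiB system RAM")
# With these assumptions: ~124 GiB total, ~11 GiB on the GPU and ~113 GiB in RAM,
# which is why a 128 GB kit plus a single consumer GPU is enough for a tinkerer setup.
```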
Qwen Next is good at that because it has a very low active parameter count.
…But they aren’t actually deployed that way. They’re basically always deployed on cloud GPU boxes that serve dozens/hundreds of people at once, in parallel.
AFAIK the only major model actually developed for CPU inference is one of the esoteric Gemma releases, aimed at mobile.
I bought 16GB of DDR4 for €110 back in 2018. Welcome back to the DDR wars, with NAND soon to follow.
Welp, it’s probably about time for my PC to break its RAM again.
Old computer blew up, had to buy a new one. Nice 128GB DDR5. Just mobo, mem, CPU. The CPU is a Ryzen 9 9700.
The memory was well over 40% of the cost, wtf?
Not sure if it’s exactly the same kit but you can get that for £300 in the UK.
Wonder if the guy in the article is in the US. Tariffs would probably explain the extra $200 price difference.
What are the odds SAltman and the circular-investment crew are pivoting to a pump-and-dump scheme on server hardware (as well as just GPGPUs), after finally admitting to themselves that LLMs have hit a wall not even trillions of dollars will fix (architecture failure, no AGI for you, no getting away from those pesky workers like promised)? If everything pans out nicely, the world ends up with a bunch of new chip fabs (a real bottleneck) and nothing to use them for: cheap computers for all. It probably won’t, and instead we’ll get a geopolitical shitshow as China (and hopefully Europe) end up with functional economies that actually produce valuable stuff. What happens to the US is left as an exercise for the reader.
OpenAI’s “Stargate” project has recently signed an agreement with Samsung and SK hynix for up to 900,000 wafers of DRAM per month. That figure alone would account for close to 40% of global DRAM output.
High-density NAND products are effectively sold out months in advance. Samsung’s next-generation V9 NAND is already nearly booked before it’s even launched. Micron has presold almost all of its High Bandwidth Memory (HBM) output through 2026. Contracts that once covered a quarter now span years, with hyperscalers buying directly at the source.
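Quick sanity check on that wafer figure (this just back-solves the baseline the “close to 40%” claim implies; it is not an independent supply estimate):

```python
# Back-of-envelope: if 900,000 wafers/month is ~40% of global DRAM output,
# what total output does that imply?
stargate_wafers_per_month = 900_000   # figure cited for the Samsung / SK hynix agreement
claimed_share = 0.40                  # "close to 40%" of global DRAM output

implied_global_output = stargate_wafers_per_month / claimed_share
print(f"implied global DRAM output: ~{implied_global_output / 1e6:.2f}M wafers/month")
# -> ~2.25M wafers/month is the baseline those two numbers together assume.
```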
If China’s going to compete on AI, it’s going to be doing so with a limited supply of memory, I expect.
They’re going all in on domestic chip manufacturing, and they are catching up much faster than armchair generals, and even actual generals, predicted.
AI increases my power utility bill
AI takes my water
AI increases the price of GPUs
AI increases the price of RAM
AI makes my search results worse and slower
AI is inserted into every website, app, program, and service, making them all worse.

All so businesses and companies can increase productivity, reduce staff, and then turn around and increase prices to customers.
Yeah I am beyond ready for the next new fad.
Isn’t capitalism a blast?
Right, what you said, but I’d remove the part where it increases productivity.
People buying RAM: oh no, what do we do?
People buying GPUs: first time?
What?
You think this is the first RAM crunch?
It’s not even the first one in this decade…
Well, I was aware RAM prices fluctuate.
I’ve just never been unlucky enough to get caught by one when buying more RAM or building a new system on a new DDR generation.
They don’t really fluctuate; it’s more of an oscillation.
Sometimes it’s “normal,” but a wide range of issues can change that quickly, and prices stay elevated for months because everyone waits till they go down. So as they go down, people stop waiting and buy, and prices swing right back up.
Those are the same people.
Even ignoring the price hike, if you have to save for months to buy RAM, you should probably keep saving instead. Doesn’t sound like a very smart financial decision to spend every last cent on your PC.
It could be just a kid doing odd jobs to upgrade their computer.
For some that’s their main hobby. If they’ve got an emergency fund, retirement account, and can cover expenses, why shouldn’t they save for their hobby? Seems like they’re doing it the smart way.
Exactly. I “save for months” to buy my contacts, but that means setting aside $85 a month so I can drop $1000 every year on a new order of contacts without thinking about it. Saving for months for something is just budgeting.
I just successfully bought from contact lens king for cheaper than I could find anywhere else.
Sort of a sketchy domain name but I got my contacts.
Saving for months ≠ using money earmarked for necessary expenses. I saved for months to buy the parts for my PC, and all of it was discretionary income.
But I otherwise agree that you should not spend your necessary funds on unnecessary expenses.
This is like the fifteenth time I’ve seen this and it seems unique to Lemmy. You contradict their main point, but concede there is a reasonable undercurrent. I like this a lot. It makes the internet feel a lot less toxic.
Thanks to you and everyone else doing the same!
Reddit and much of the internet was like that a long time ago.
I like to think it came from the higher barrier to entry, kinda like Lemmy has now. One doesn’t just accidentally use Lemmy because it was recommended on their App Store feed. You have to seek it out.
It’s just really refreshing instead of some knee-jerk “You don’t know their situation, shit for brains!” It’s measured in highlighting a flaw in the core logic but civil in acknowledging the truth in the nuance. I hope we keep it! Lol.