This is yet another thing I blame on American Business sacrificing itself on the altar of Shareholder Value. It’s no longer acceptable for a public business to simply make a profit. It has to grow that profit, every quarter, without fail.
So, simply having a good consumer product division that makes money won’t be enough. At some point some executive will decide he can’t possibly get his bonus if that’s all they do, and that they need to blow it all up to chase larger profits elsewhere.
Maybe we need a small, private company to come along and start making good consumer hardware. They still need components, though, so they’ll have to navigate getting those from public companies that won’t return their calls. And even once they’re successful, the first thing they’ll do is cash out and go public, and the cycle starts again.
Eh, maybe a chance for an architectural redesign too, one that caters more to desktop than to server. We could build computers that don’t burn cycles while waiting for input, displays that don’t need to redraw at full fps for a still image, and that don’t need GHz while doing so, allowing for modular, easy-to-repair designs.
Imma remember what Crucial and others are doing, so when the AI bubble pops I’ll pass on all their products.
I’m looking forward to cheap Chinese video cards that outperform Nvidia’s shit at 1/4 the price.
Hopefully Linux-supported. That’s the main selling point of AMD GPUs for me right now, since there are fewer problems getting stuff like HDR running on them than on NVIDIA.
I wonder why China is still, for the most part, ignoring Linux in favor of Windows. Like, to update 8BitDo controllers they only provide a Windows program and no Linux version.
You’d think they’d be rushing towards pushing Linux adoption.
That’s capitalism for you. But also Linux, where it’s typical to upstream hardware support and rely on existing ecosystems rather than release add-on drivers or niche supporting apps.
China has made some strategic investments in Linux over the years though – often domestically targeted, like Red Flag Linux, drivers for Chinese hardware, etc.
But also Linux, where it’s typical to upstream hardware support and rely on existing ecosystems rather than release add-on drivers or niche supporting apps.
Still possible though, right?
It does after all support out-of-tree device drivers now.

Sure… but why would el cheapo hardware want/need to support proprietary drivers? Now, for premium hardware and software, they might still want vendor lock-in mechanisms… So unless you absolutely have to, you should avoid hardware that needs proprietary drivers on Linux.
So either way, that makes supporting Linux a better bet than MS Windows.
Linux seems to be common for running things like servers, but is that the case for consumer hardware?
When I’ve looked at peripherals like keyboards and controllers, Linux support has been lacking. Of course, for keyboards I went out of my way to get QMK-compatible ones to use with VIA and Vial instead, so I don’t need to run an exe of unknown origin to remap or update the firmware.
And how is it for games? Is there more of a push to support Linux? Genshin Impact, for example, only officially supports Windows. There’s a workaround with the anime launcher, which disables the DRM, but I wouldn’t consider that Linux support when it risks a ban. They have their own version of Finals now and I’ve wondered if that has Linux support, or at least DRM that works with Wine-type methods instead of the approach Valorant took.
There is plenty of consumer hardware that is supported on Linux, or will be as soon as a kernel developer gets their hands on it, reverse engineers the protocol if necessary, and adds support. For things like keyboards, there are often proprietary extensions (e.g. for built-in displays, macros, etc.). It pays to check for Linux support before buying hardware, though. Sometimes it’s not the kernel drivers but the supporting software (e.g. Steam Input) that might not support it.
First-class vendor support for Linux is more common for niche/premium hardware designed in the West than for the cheap Chinese knockoffs that follow it. Long-term customer support is not their strong suit.
What do you mean lacking support for keyboards and controllers? Maybe for doing weird custom stuff like RGB, but for anything else they’re standard HIDs and will work with anything, no “support” needed. You can plug a USB keyboard and mouse into your phone and it’ll work if you want.
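If you want to see that for yourself, here’s a rough sketch with the python-evdev package (assuming it’s installed and you can read /dev/input, e.g. by being in the input group) that just lists whatever input devices the kernel already exposes – a no-name keyboard or controller shows up here with zero vendor software:

```python
# List input devices the kernel already exposes as standard HIDs.
# Sketch only: assumes python-evdev is installed and that you have
# permission to read /dev/input (root, or membership in 'input').
import evdev

for path in evdev.list_devices():
    dev = evdev.InputDevice(path)
    print(f"{dev.path}: {dev.name} "
          f"(vendor {dev.info.vendor:04x}, product {dev.info.product:04x})")
```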
I’m currently playing Clair Obscur on Linux through Steam with a cheap fake Xbox controller I got off eBay, and it works perfectly. I’m using an Nvidia card too, and I haven’t had to do any customisation or anything.
Easy Anti-Cheat won’t work, so Valorant, Fortnite, etc. are out of the question for now, but any games that don’t use that kind of malware are probably fine.
China has no need for open source because they steal everything anyway.
My hope for open source is that if something sketchy is pushed, there might be a chance to catch it, as opposed to a proprietary approach where nobody has a chance of knowing what is going on.
Likewise. Don’t expect it from China though.
Great. Now they’re building infrastructure and industry atop a stolen Trojan Horse, which may still bite them the moment the little orange man tells Nadella to flip the switch.
I hope you’re right because Intel and AMD still can’t compete with high end Nvidia cards, and that’s how we ended up with a $5000 5090.
FIVE THOUSAND?!
Jesus nun-fucking Christ, what an absolute scam. I bought a 1070 for $220 in the first few months after release. Guess I’ll just have to hope it can run for another 10 years…
AMD can already beat Nvidia at the price tiers most people actually buy at, and Intel is gaining ground way faster than anyone expected.
But outside of the GPU shakeup, I could give a shit about Intel. Let China kill us. We earned this.
We also partly ended up with the $5k 5090 because it’s just the TITAN RTX of the 50xx generation - the absolute top-of-the-line card, where you pay 200% extra for that last +10% of performance.
Nvidia just realized a few generations back that naming those cards the xx90 gets a bunch more people to buy them, because they always desperately need to have the shiniest, newest xx90 card, no matter the cost.
AMERICAN manufacturers; just wait until the Chinese industries swoop in to fill the gap. I seriously feel America just wants to kneecap itself.
“banned for security concerns”
America doesn’t. The Russian asset in the White House and its brainwashed minions do.
You guys voted him in, twice, with the popular vote the second time. Don’t pretend you don’t own him.
Wants to kneecap itself?
Dude, the US is going full seppuku and we’re going to gut ourselves on the floor.
Hard to swoop in with massive tariffs. The few players that remain will just charge a lot more… it’ll be just the lucky rich few who can afford their own hardware.
No such tariffs in the EU 🥹
yet
The US is not the only place to sell to
Part of this has been a long-standing move by every industry to prioritize business-to-business sales over consumer sales, simply because businesses have money and consumers don’t: businesses are pocketing all the profits and refusing to pay their employees (the consumers) a living wage, let alone a thriving wage.
It’s been a long time coming for the PC industry; business sales have been trending more profitable than consumer sales ever since the late 90s.
It’s just more evidence that the bottom 90% of earners are being priced out of anything but bare subsistence, and the wealthy do not give a single fuck about it; in fact, they’re embracing it.
This is an important point in general. The old story of “voting with your wallet” is now more and more obviously mathematically absurd.
You can only vote with your wallet if there’s something in it.
Eat the rich!
The silver lining is that hardware performance gains have been so minor from generation to generation that upgrading isn’t really that important anymore. Like, if I upgrade to the next generation’s equivalent GPU, it would give like 8% more fps… and it costs like 1.5k… No thanks.
You used to get a fairly significant upgrade every few years for about the same cost as the old hardware. Transistors aren’t really getting much smaller anymore, so more performance needs a bigger die and costs more money.
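Rough illustration of the die-size point, using the standard exponential defect-yield approximation – the wafer cost, wafer area and defect density here are made-up numbers chosen only to show the shape of the curve, not real foundry figures:

```python
# Back-of-envelope sketch of why a bigger die costs disproportionately more.
# Simple Poisson yield model; all constants below are illustrative assumptions.
import math

WAFER_COST = 15000.0      # $ per wafer (assumed)
WAFER_AREA = 70000.0      # mm^2, roughly a 300 mm wafer
DEFECT_DENSITY = 0.001    # defects per mm^2 (assumed)

def cost_per_good_die(die_area_mm2: float) -> float:
    dies_per_wafer = WAFER_AREA / die_area_mm2           # ignores edge losses
    yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
    return WAFER_COST / (dies_per_wafer * yield_fraction)

for area in (300, 600):
    print(f"{area} mm^2 die: ~${cost_per_good_die(area):.0f} per good die")
```

Doubling the die area more than doubles the cost per good chip, because you get fewer candidates per wafer and the yield drops at the same time.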
Is Moore’s Law being resurrected?
Transistor downscaling is pretty much done, and MOSFETs can’t improve much in this race anymore. We would need a new computing paradigm to see manufacturing cost reductions or major performance leaps, and for consumers that’s still years away.
off to sell it cheaper to companies, so they can rent it back to us.
For some workloads, yes. I don’t think that the personal computer is going to go away.
But it also makes a lot of economic and technical sense for some of those workloads.
Historically — like, think up to about the late 1970s — useful computing hardware was very expensive. And most people didn’t have a requirement to keep computing hardware constantly loaded. In that kind of environment, we built datacenters and it was typical to time-share them. You’d use something like a teletype or some other kind of thin client to access a “real” computer to do your work.
What happened at the end of the 1970s was that prices came down enough and there was enough capability to do useful work to start putting personal computers in front of everyone. You had enough useful capability to do real computing work locally. They were still quite expensive compared to the great majority of today’s personal computers:
https://en.wikipedia.org/wiki/Apple_II
The original retail price of the computer was US$1,298 (equivalent to $6,700 in 2024)[18][19] with 4 KB of RAM and US$2,638 (equivalent to $13,700 in 2024) with the maximum 48 KB of RAM.
But they were getting down to the point where they weren’t an unreasonable expense for people who had a use for them.
At the time, telecommunications infrastructure was much more limited than it is today, so using a “real” computer remotely from many locations was a pain, which also made the PC make sense.
From about the late 1970s to today, the workloads that have dominated most software packages have been more-or-less serial computation. While “big iron” computers could do faster serial compute than personal computers, it wasn’t radically faster. Video games with dedicated 3D hardware were a notable exception, but those were latency sensitive and bandwidth intensive, especially relative to the available telecommunication infrastructure, so time-sharing remote “big iron” hardware just didn’t make a lot of sense.
And while we could — and to some extent, did — ramp up serial computational capacity by using more power, there were limits on the returns we could get.
However, what AI stuff represents has notable differences in workload characteristics. AI requires parallel processing. AI uses expensive hardware. We can throw a lot of power at things to get meaningful, useful increases in compute capability.
-
Just like in the 1970s, the hardware to do competitive AI stuff for many of the things we want to do is expensive. Some of that is just short term, like the fact that we don’t have the memory manufacturing capacity in 2026 to meet demand, so prices will rise until enough people are priced out that the available chips go to the highest bidders. That’ll resolve itself one way or another, like via a buildout of memory manufacturing capacity. But some of it is also that the quantities of memory involved are just plain expensive. Even at pre-AI-boom prices, if you want the kind of memory that it’s useful to have available — hundreds of gigabytes — you’re going to be significantly increasing the price of a PC, and that’s before whatever the computation hardware costs.
-
Power. Currently, we can usefully scale out parallel compute by using a lot more power. Under current regulations, a laptop that can go on an airline in the US can have a 100 Wh battery and a 100 Wh spare, separate battery. If you pull 100 W on a sustained basis, you blow through a battery like that in an hour. A desktop can go further, but is limited by heat and cooling, is going to start running into a limit for US household circuits at something like 1800 W, and will be dumping a very considerable amount of heat into the house at that point. Current Nvidia hardware pulls over 1 kW. A phone can’t do anything like any of the above. The power and cooling demands range from totally unreasonable to at least somewhat problematic. So even if we work out the cost issues, I think it’s very likely that power and cooling will be a fundamental bound.
In those conditions, it makes sense for many users to stick the hardware in a datacenter with strong cooling capability and time-share it.
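The arithmetic behind the power point is short enough to write down (round numbers only, just restating the figures above):

```python
# Round-number sketch of the power constraints mentioned above.
AIRLINE_BATTERY_WH = 100   # max laptop battery allowed on a US flight, roughly
SUSTAINED_DRAW_W = 100     # a laptop pulling 100 W flat out
print(AIRLINE_BATTERY_WH / SUSTAINED_DRAW_W, "hour(s) per battery")   # 1.0

US_CIRCUIT_W = 120 * 15    # ~1800 W: a standard 120 V / 15 A household circuit
ACCELERATOR_DRAW_W = 1000  # current top-end accelerators pull 1 kW or more
print(US_CIRCUIT_W // ACCELERATOR_DRAW_W,
      "such card(s) per circuit, before the rest of the PC")
```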
Now, I personally really favor having local compute capability. I have a dedicated computer, a Framework Desktop, to do AI compute, and also have a 24GB GPU that I bought in significant part to do that. I’m not at all opposed to doing local compute. But at current prices, unless that kind of hardware can provide a lot more benefit than it currently does to most, most people are probably not going to buy local hardware.
If your workload keeps hardware active 1% of the time — and use as a chatbot might look like that — then it is something like a hundred times cheaper, in terms of hardware cost, to have the hardware timeshared. If the hardware is expensive — and current Nvidia hardware runs tens of thousands of dollars, too rich for most people’s taste unless they’re getting Real Work done with the stuff — it looks a lot more appealing to time-share it.
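A quick sketch of that utilization math, with made-up but plausible numbers (the hardware price and lifetime are assumptions, not quotes):

```python
# Effective hardware cost per hour of actual compute, owned vs. time-shared.
# All numbers are illustrative assumptions.
HARDWARE_COST = 30000.0         # $, a datacenter-class accelerator (assumed)
LIFETIME_HOURS = 3 * 365 * 24   # amortize over roughly three years

def cost_per_busy_hour(utilization: float) -> float:
    busy_hours = LIFETIME_HOURS * utilization
    return HARDWARE_COST / busy_hours

print(f"owned, busy 1% of the time:   ${cost_per_busy_hour(0.01):.2f}/busy hour")
print(f"shared, busy 90% of the time: ${cost_per_busy_hour(0.90):.2f}/busy hour")
```

That’s roughly the factor-of-a-hundred gap: the heavily shared box comes out around 90x cheaper per hour of actual work.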
There are some workloads for which there might be constant load, like maybe constantly analyzing speech, doing speech recognition. For those, then yeah, local hardware might make sense. But…if weaker hardware can sufficiently solve that problem, then we’re still back to the “expensive hardware in the datacenter” thing.
Now, a lot of Nvidia’s costs are going to be fixed, not variable. And assuming that AMD and so forth catch up, prices in a competitive market will come down — with scale, one can spread fixed costs out, and only the variable costs will place a floor on hardware costs. So I can maybe buy that, if we hit limits that mean buying a ton of memory isn’t very interesting, prices will come down. But I am not at all sure that the “more electrical power provides more capability” aspect will change. And as long as that holds, it’s likely going to make a lot of sense to use “big iron” hardware remotely.
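The fixed-vs-variable point is really just this one-liner; the numbers below are placeholders, only the shape matters:

```python
# Unit cost falls toward the variable cost as volume grows, because
# fixed costs (R&D, masks, software) get spread across every unit sold.
# All numbers are placeholders.
def unit_cost(fixed: float, variable: float, volume: int) -> float:
    return variable + fixed / volume

for volume in (100_000, 10_000_000):
    print(f"{volume:>10,} units: "
          f"${unit_cost(fixed=5e9, variable=300.0, volume=volume):,.0f} each")
```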
What you might see is a computer on the order of, say, a 2022 computer on everyone’s desk… but with a lot of parallel compute workloads farmed out to datacenters, which have hardware much more capable of parallel compute.
Cloud gaming is a thing. I’m not at all sure that the cloud will dominate there, even though it can leverage parallel compute. There, latency and bandwidth are real issues. You’d have to put enough datacenters close enough to people to make that viable and run enough fiber. And I’m not sure that we’ll ever reach the point where it makes sense to do remote compute for cloud gaming for everyone. Maybe.
But for AI-type parallel compute workloads, where the bandwidth and latency requirements are a lot less severe, and the useful returns from throwing a lot of electricity at the thing are significant… then it might make a lot more sense.
I’d also point out that my guess is that AI probably will not be the only major parallel-compute application moving forward. Unless we can find some new properties in physics or something like that, we just aren’t advancing serial compute very rapidly any more; things have slowed down for over 20 years now. If you want more performance, as a software developer, there will be ever-greater relative returns from parallelizing problems and running them on parallel hardware.
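One standard way to put numbers on those returns is Amdahl’s law: on n cores, speedup = 1 / ((1 − p) + p/n), where p is the fraction of the program you managed to parallelize. A quick sketch (core count and fractions are just examples):

```python
# Amdahl's law sketch: if serial speed is flat, the only lever left is the
# parallel fraction p of your program. Speedup on n cores = 1 / ((1-p) + p/n).
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

N_CORES = 64
for p in (0.50, 0.90, 0.99):
    print(f"parallel fraction {p:.0%}: "
          f"{amdahl_speedup(p, N_CORES):.1f}x on {N_CORES} cores")
```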
I don’t think that, a few years down the road, building a computer comparable to the one you might have built in 2024 is going to cost more than it did in 2024. I think that people will have PCs.
But those PCs might be running software that does an increasing amount of its parallel compute in the cloud, as the years go by.
-
Even as consumer revenue remains sizable and maintains steady year-on-year growth, it finds itself competing against segments that grow exponentially faster and earn more per unit.
So it has nothing to do with people having less money. It honestly gives me hope; things could change with a bubble burst.
The point is to make the bubble bigger; then they’ll pretend that’s why they have to exit the consumer market.
The goal is the so-called “thin client” - i.e. absolutely everything in the cloud.
Edit: my phone can’t spell
Bingo! And they’re doing it to enterprises too. Why do you think copilot is shoved into everything? Why do you think Recall is creeping towards being mandatory? Why do you think OneDrive isn’t optional anymore?
They don’t just want your data for advertising. They want to watch the entire capital machine in real time to make sure there aren’t any gaps, any dissent, and, most of all, that nobody gets the jump on something new and big. OneDrive to rule them all.
I hear ya.
It pisses me off the way they market this shit as AI; it’s just a goddamn surveillance tool hiding behind a fucking chatbot.
And, anybody reading who’s tempted to defend the technology, fuck off - it’s a prediction algorithm. A fancy one, but that’s it. There are no steps or logic or reasoning. There’s nothing hiding in the “black box”. No steps taken to construct the response - just statistical analysis on steroids spewing out the most likely matches.
At what point can AI companies play the “too big to fail” card though, like the banks?
Bubble bursts, and the government uses our taxes to bail out the companies. Again.
The nazis will bail out everyone who bends the knee and submits a bribe. It’s not like they’re spending their own money…
Probably because the hardware is going into systems that eliminate jobs, and we end up broke. All that gear is gonna sit on the shelf if we can’t afford it.
You can make a lot of money selling imaginary products to nonfunctional industries.