Regardless of the OS, if you’re using the computer for anything productive, the application software, not the OS, will eat the majority of the RAM anyway. If you’re looking at the minimum requirements, chances are you’re not looking to do anything besides browsing the web with 5 tabs open.
It sucks though, I agree - software should get more efficient over time, just like hardware does. Out of curiosity, do we have anything more specific, e.g. how they tested that, what apps were running, and so on? Or maybe they now deem that more things should be running?
This doesn’t seem so bad, though. 2 GB more in about 10 years is pretty reasonable in terms of an increase.
It’s not like they doubled it.
2GB is a lot
It seems to imply that software has gotten way worse in the last 10 years.
Worse is relative. A portion of the requirement increase will be due to worse code, but much more of it pays for features that make the software accessible to more people, and for adding new features without having to remove old ones. Neither of those is a bad thing; otherwise everything would be a command-line tool that removes options every few months and only has one way to use it.
No, it is not reasonable. What the hell do they need an extra 2 GB for? What the hell is the operating system doing with that many resources?
My first pc needed 4MiB of ram for the os. Why does this need 1536x as much to provide… not much else tbh?
According to the article linked in the article, it’s not that the operating system itself is more demanding, but that the DE and browsers/websites are more demanding now.
It feels like Canonical basically needs to do what games do: a set of minimum specs for Ubuntu to run at all, and recommended specs for Ubuntu to run well. Canonical basically bumped up the latter, but it’s being taken as the former.
Ubuntu is the Windows of Linux.
It’s getting more and more bloated with unnecessary and unwanted things because of Canonical’s bad management decisions. They seemingly care more about “business” than about users.
Use Debian if you want a system like Ubuntu that isn’t full of Canonical’s corporate shit. Ubuntu is based on Debian.
Would lmde be as good?
Yeah, LMDE is pretty good. I used it for a couple of years during my rage-against-Ubuntu phase.
My Tandy Sensation required 256MB and everything worked fine.
Shocked i got this far without someone blaming snaps
i blame snaps
Just use Debian
Or arch
I use Arch myself, but I don’t think it’s a good alternative to Ubuntu.
or if you can’t get that to go (i had some hardware support issues) look to MX Linux or Linux Mint XFCE
Mint is based on Ubuntu.
yes, but Linux Mint XFCE doesn’t use all the heavyweight systems that make Ubuntu desktop need 6 GB of RAM. for that matter, Xubuntu will work too, but fuck canonical.
however, as much as i loathe canonical, i’m not gonna attract people to solutions that are more secure from bullshit if i’m not willing to meet them where they are. i’d rather we all be on debian, but since even i can’t get my laptop to work properly with it, i’m gonna direct some people to mint just to get them a little farther afield.
LMDE is based on Debian.
Everything is a framework under a framework running on a pseudo virtual machine. 6 GB is just for the notepad and the mouse driver.
I wonder how much of this is just modern web apps… even running without a containerized distro and with a leaner DE, I still have 90%+ of my RAM taken up by websites.
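You can roughly check that yourself. Here’s a sketch (Linux + Node assumed; “firefox” is a guess at the process name, swap in your browser’s) that sums the resident set size of every matching process from /proc:

```typescript
// Back-of-envelope memory check: walk /proc, find processes whose command
// name matches, and sum their VmRSS (resident memory) in MiB.
import { readdirSync, readFileSync } from "node:fs";

function browserRssMiB(name = "firefox"): number {
  let totalKiB = 0;
  for (const pid of readdirSync("/proc").filter((d) => /^\d+$/.test(d))) {
    try {
      const comm = readFileSync(`/proc/${pid}/comm`, "utf8").trim();
      if (!comm.includes(name)) continue;
      const status = readFileSync(`/proc/${pid}/status`, "utf8");
      const m = status.match(/^VmRSS:\s+(\d+)\s+kB/m);
      if (m) totalKiB += Number(m[1]);
    } catch {
      continue; // process exited mid-scan, or status not readable
    }
  }
  return totalKiB / 1024;
}

console.log(`${browserRssMiB().toFixed(0)} MiB resident`);
```

Caveat: RSS double-counts memory shared between processes, so for a multi-process browser treat the number as an upper bound.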
Modern UI development is such fucking shit. I have no idea why they went with all of these heavyweight shit frameworks.
Seriously, why do so many simple websites take more resources than, and run worse than, so many video games?
I thought part of the point was to have a website work more as an application: one update to a piece of information results in that information being near instantly updated across the site.
Then I looked into the angular stuff the UI people were working on and yeah… something like 10 (costly) requests for the same exact fucking JSON. They were talking about doing caching on the frontend to optimize it. What are we even doing?!
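For what it’s worth, deduplicating those requests doesn’t take much machinery. A minimal sketch (hypothetical names, not the actual app’s code): cache the Promise per URL, so however many callers ask for the same JSON, only one request goes out:

```typescript
// Deduplicate identical requests by caching the Promise itself.
// `fetchJson` is a stand-in for whatever HTTP client the app actually uses.
type Fetcher = (url: string) => Promise<unknown>;

function makeCachedFetcher(fetchJson: Fetcher): Fetcher {
  const cache = new Map<string, Promise<unknown>>();
  return (url: string) => {
    let pending = cache.get(url);
    if (!pending) {
      pending = fetchJson(url); // first caller triggers the real request
      cache.set(url, pending);  // everyone else reuses the same Promise
    }
    return pending;
  };
}

// Demo with a counting fake fetcher: ten callers, one underlying request.
let calls = 0;
const cached = makeCachedFetcher(async (url) => {
  calls++;
  return { url };
});

Promise.all(Array.from({ length: 10 }, () => cached("/api/user.json")))
  .then(() => console.log(`underlying requests: ${calls}`)); // prints "underlying requests: 1"
```

Caching the Promise rather than the resolved value also collapses concurrent in-flight requests, which is exactly the “10 requests for the same JSON” case. A real app would want invalidation/TTL on top of this, of course.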
Me with 16 GB on my computer
(╯°□°)╯︵ ┻━┻
That’s not an argument for me, because less RAM usage by the operating system leaves you with more resources for your applications and programs.
What a bloat
Yeah, 6gb RAM is CRAZY! It’s almost like you’d have to buy a computer that’s at least 15 years old to get that!
What the hell is wrong with these people giving us free software for free and then having the audacity to expect us to pay more than $32 for a computer to run it!
THE NERVE!
Many people have 8 GB, and 8 is going to be more popular again because of AI.
Only having 2 GB left over to run everything isn’t great.
Fun thing, I just booted up an old computer. Started right up. It had Ubuntu 11.10 on it.
Now, I obviously didn’t connect the thing to the Internet. Updates would have probably failed hard. Not just because it’s missing over a decade of updates (there might be some complications on that front), but because it’s a Pentium III with Definitely Not Even a Gigabyte of memory. (Oh, and an Nvidia GeForce 2 MX. I’m pretty sure that’s not supported by… any driver anymore.)
Clone the hard drive and see what happens!
I’m putting money on what we in the business technically refer to as “complete pants shitting”
That’s pretty much what a browser needs these days.
Reject modernity. Embrace lynx browser from terminal.
Gemini protocol is pretty good and minimal. But very little usage unfortunately.
“HERE’S A NICKEL, KID. GET YOURSELF A BETTER COMPUTER.”

20 years ago when Scott Adams was still a moderately sane human.
8 GB was barely enough 10 years ago. That’s when I switched to Arch+KDE. Then KDE started using more memory.
More memory meaning what? From 800 MB to 1 GB? I’d say, for what Plasma is, its RAM usage is low.
agreed. KDE is pretty much the gold standard of the usability versus resource usage tradeoff, IMHO. From what I’ve seen: Websites/Web Browsers = worst offenders.
The difference was hundreds of MB, but when you’re working with 8GB every bit counts. At that time KDE had an edge over Gnome. At some point the difference wasn’t there anymore.
I was rationing what software I had open so as to avoid hitting swap because that’s when there’s a noticeable lag. Gnome was worse at recovering from that.
If you want lightweight, KDE is not your best choice. LXQt and Xfce are actually intended for low resource usage.
I have a POS HP Stream kicking around, Celeron and 2 GB of RAM. It runs well with antiX; I have used LXQt and Xfce on it and been able to Google things, browse articles on the web, and write Word docs smoothly. I put KDE Plasma on it and it struggles now. Really it’s time to retire that thing, but I like to occasionally distro hop on it and see what will run on that bunk hardware. So far antiX has been by far the snappiest on that laptop, but it’s not as pretty as KDE :c
In the post-PC-ownership cyberpunk dystopia, we’ll all be running Slack.
I would be but I misplaced the “a1” disk.