The year of Linux on the desktop is whatever year you personally switched over.
FLOSS virtualization hacker, occasional brewer


Now that I’ve read the article, it’s unnamed industry analysts and it’s written by an AI. For all I know the AI has hallucinated the number.


I assume microcontrollers. Most of those are invisible to consumers.
I would not want anything that requires a cloud connection to be responsible for securing my house. The security record of these smart locks also isn’t great.
The final question you need to ask yourself is: how do they fail safe? There have been Tesla owners trapped in burning cars. If, god forbid, your house caught fire, could you get out of your door secured with a smart lock?
Once we summit the peak of inflated expectations and the bubble bursts, hopefully we’ll get back to evaluating the technology on its merits.
LLMs definitely have some interesting properties, but they are not universal problem solvers. They are great at parsing and summarising language. Their ability to vibe code is entirely based on how closely your needs match the (vast) training data. They can synthesise tutorials and Stack Overflow answers much faster than you can. But if you are writing something new or specialised, the limits of their “reasoning” soon show up in dead ends and sycophantic “you are absolutely right, I missed that” responses.
More than the technology, the social context is a challenge. We are already seeing humans form dangerous parasocial relationships with token predictors, with some tragic results. If you abdicate your learning to an LLM you are not really learning, and that could have profound impacts on the current cohort of learners, who may assume they no longer need to learn because the computer can do it for them.
We are certainly experiencing a very fast technological disruption event and it’s hard to predict where the next few years will take us.


Fundamentally, the reason they want to use kernel modules is to observe the system for other executables interfering with the game. This is a hacky solution at best.
TPM hardware can support attested boot, so you can verify with the hardware that nothing but the verified kernel and userspace is running. That gives you the same guarantees without letting third parties mess with your kernel.
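For the curious, here is a minimal Python sketch of the measurement side of that idea: it just dumps the PCR values a measured boot chain extends. The sysfs paths are an assumption (newer kernels expose per-PCR files for TPM 2.0 devices), and a real attestation flow would verify a quote signed by the TPM rather than trusting whatever local userspace reads back.

```python
#!/usr/bin/env python3
"""Dump TPM 2.0 PCR values exposed by the Linux kernel via sysfs.

Assumptions: a TPM 2.0 device at tpm0 and a kernel new enough to
export per-PCR files under pcr-sha256/. Adjust the path for your
device node or hash bank. This only *reads* measurements; attested
boot additionally has the TPM sign a quote over them so a remote
party can trust the values.
"""
from pathlib import Path

PCR_DIR = Path("/sys/class/tpm/tpm0/pcr-sha256")  # assumed sysfs layout


def read_pcrs() -> dict[int, str]:
    """Return {pcr_index: hex_digest} for every PCR the kernel exposes."""
    return {
        int(entry.name): entry.read_text().strip()
        for entry in sorted(PCR_DIR.iterdir(), key=lambda p: int(p.name))
    }


if __name__ == "__main__":
    for index, digest in read_pcrs().items():
        print(f"PCR[{index:2d}] = {digest}")
```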


It’s nice to see Valve and Igalia recognising the benefit of open GPU drivers for Proton and FEX to utilise.
I would have thought unified memory would pay off; otherwise you spend your time shuffling data between system memory and VRAM. Isn’t the Deck unified memory?


mu4e inside my Emacs session.
I ran into something similar when, in haste, I went from Raspbian Stretch to plain Bookworm and discovered the Debian version of Kodi didn’t have all the userspace drivers needed to drive the hardware decoding. In the end I worked around it by running Kodi from a Stretch container until the official Raspbian Bookworm got released. Maybe you could build a Stretch-based container for your VLC setup?
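If it helps, here’s a rough sketch of how that could be wired up with podman. The image name is hypothetical (something you’d build yourself with Stretch plus VLC and the legacy Pi userspace libraries), and the device and volume paths are illustrative only.

```python
#!/usr/bin/env python3
"""Launch VLC from an old-userspace container as a workaround.

Assumptions: podman is installed and "localhost/stretch-vlc" is a
locally built image containing Debian Stretch, VLC and the legacy
Raspberry Pi userspace libraries. Device nodes and mounts below are
illustrative and will need adjusting for your setup.
"""
import subprocess

IMAGE = "localhost/stretch-vlc"  # hypothetical, build it yourself

CMD = [
    "podman", "run", "--rm", "--interactive", "--tty",
    "--device", "/dev/vchiq",                  # legacy VideoCore interface
    "--device", "/dev/dri",                    # DRM/KMS nodes for display
    "--volume", "/home/pi/Videos:/media:ro",   # media library, read-only
    IMAGE,
    "vlc", "/media",
]

if __name__ == "__main__":
    subprocess.run(CMD, check=True)
```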


Did you ever play with the audio visualiser? I believe it came built in with the CD-ROM drive. What about Tempest 2000?


I never got a Jaguar despite being a signed-up Atari fanboy at the time. The hardware was ridiculously complex, which made ports to it a hard sell, and Atari just didn’t have the first-party exclusive clout needed to sustain a console at launch.
I do wish I’d had a chance to play with some of Jeff Minter’s creations on it though. Apparently there was a nice audio visualiser that built on Trip-A-Tron from the ST days, as well as some reboots of classic arcade games like Tempest 2000.


I’ve generally been up front when starting new jobs that nothing impinges on my ability to work on FLOSS software in my own time. Only one company put in a restriction on working on FLOSS software in the same technical space as my $DAYJOB.
Nice to see QEMU was leading on LLM policies. I suspect more open source projects are going to have to come up with some sort of policy on these contributions going forward.
Not totally unexpected, I mean look at what brain rot does to humans.


The article mentioned there is a long history of forks in the open source Doom world. It seems the majority of the active developers just moved to the new repository.


Cost, the reason is cost.
Whatever happened to the classic “reticulating splines”?
He does?
I read the first link in the thread that examines his blog post about London. While I don’t agree with his politics, he wouldn’t be unusual amongst a significant minority of the population who vote for the likes of Reform. That seems to be enough for some to draw the conclusion that he’s a Nazi who wants to arbitrarily murder people.
This automatic jump to accusing anyone you disagree with of being a Nazi just devalues the term.
I think the OP’s analysis might have made a bit of a jump from overall levels of hobbyist maintainers to what percentage of shipping code is maintained by people in their spare time.
While the experiences of OpenSSL and xz should certainly drive us to find better ways of funding underlying infrastructure, you do see higher participation rates from paid maintainers where the returns are more obvious. The silicon vendors get involved in the kernel because it’s in their interests to do so, and the kernel benefits as a result.
I maintain a couple of hobbyist packages in my spare time, but it will never be a funded gig because comparatively few people use them compared to $DAYJOB’s project, which can make a difference to companies’ bottom lines.