Okay, but that’s still partially on Nvidia for refusing to participate. They could have argued for explicit sync early in Wayland’s development but they weren’t at the table at all, so they got stuck with the technology that was decided on without them and had to argue for changes much later.
And they started off arguing for EGLStreams, but it didn’t work well either. Explicit sync came later.
Wayland has a bunch of features that are so new they aren’t in the stable distros yet.
Nvidia went from declaring they were never going to support Wayland to trying to force their own EGLStreams stuff on everybody to reluctantly accepting the standard that was developed without them and trying to make it work for their driver. They’re playing catchup and it’s entirely their own fault for refusing to cooperate with anybody.
They’re moving more towards open source drivers now, probably because the people buying billions of dollars worth of GPUs to use on Linux servers for AI training have had words with Nvidia on the subject.
Voting third party under the US system doesn’t improve society so, like you, the meme kind of misses the point.
I mean, it’s bits of configuration all over the place that I’ve built up over time. It isn’t a single script on one machine, and you’d need to change a lot of things if you weren’t running Slackware. I can’t really copy and paste it all.
Network namespaces and policy based routing are black magic, IMO.
I’ve got a VPN set up on my router and separate VLANs set up for ordinary traffic and VPN traffic. A device doesn’t need to support VPNs at all, I just connect it to the VPN VLAN and all its traffic goes over the VPN whether it likes it or not. I’ve got separate wifi SSIDs for each VLAN.
My desktop is connected to both VLANs with a network namespace set up for the VPN VLAN, so sudo vpn rtorrent runs rtorrent in the namespace that's connected to the VPN VLAN.
My setup is nice, but I wouldn’t recommend it to anyone who doesn’t want to learn quite a bit about networking.
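For the curious, the per-app namespace trick can be sketched roughly like this. All the specifics here (namespace name, eth0, VLAN ID 20, the addresses) are made up for illustration; my real config is bits and pieces spread across a Slackware box, not one script.

```shell
#!/bin/sh
# Rough sketch: run a command inside a network namespace whose only
# interface sits on the VPN VLAN, so all its traffic exits via the VPN.
# Interface names, VLAN ID, and addresses are illustrative, not my real ones.

set -e

NS=vpn

# Create the namespace and its VLAN interface once, skipping if it exists.
if ! ip netns list | grep -q "^$NS"; then
    ip netns add "$NS"

    # Tagged VLAN subinterface on the physical NIC, moved into the namespace.
    ip link add link eth0 name eth0.20 type vlan id 20
    ip link set eth0.20 netns "$NS"

    # Bring it up inside the namespace and point the default route at the
    # router, which forwards everything on this VLAN over the VPN.
    ip -n "$NS" addr add 192.168.20.2/24 dev eth0.20
    ip -n "$NS" link set lo up
    ip -n "$NS" link set eth0.20 up
    ip -n "$NS" route add default via 192.168.20.1
fi

# Run the requested command inside the namespace, e.g. "vpn rtorrent".
exec ip netns exec "$NS" "$@"
```

The nice part of this approach is that a process in the namespace can't leak traffic onto the normal VLAN even if the VPN drops, because the namespace simply has no route there.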
It’s been 30 years and jokes about Windows expanding to fill available space still work.
Yeah, but because pricing jumped like someone set a firecracker off under its chair, people are actually still using vintage GPUs.
Whatever compromise anyone comes up with will be ignored or exploited as hard as advertisers can possibly manage.
A compromise that actually works would depend on advertisers actually complying. The advertisers that do will be vastly outnumbered by the advertisers that don’t.
So we’re getting the arms race either way.
QC 2.0 is proprietary but it would probably still be identified as a device on a standard USB port. For $2 it’s probably worth giving one a try, anyway.
Looking at the PD spec I got the impression devices are supposed to pull the D+ pin up to a certain voltage, but I got lost partway through.
OP asked for the easiest way and deciphering the spec docs probably isn’t it.
I’m not interested in my computer striking a balance between my needs and the needs of people seeking to manipulate me into buying things.
I paid for my computer, it serves my needs. Yes I do run Linux, how did you guess?
at least, not with my 1080p monitors, which I prefer over higher-res ones
Blasphemy!
4K monitors are beautiful for normal desktop usage: text is crisp and clean, with smooth curves and none of the blockiness that comes from low resolution. With modern scaling settings you can even have 4K text and 1080p graphics at the same time, with the same performance as native 1080p.
Mint 22 would be straightforward, at least: https://linuxmint-user-guide.readthedocs.io/en/latest/upgrade-to-mint-22.html
Mint 21.3 might be a bit too ‘stable’ for your new GPU.
Linux graphics move fast. You generally won’t have a good experience with an older distro and a brand new GPU.
The devs have been working hard to hammer out those troublesome edge cases. There are a lot fewer of them than there were a year or two ago.
IIRC Nvidia needs explicit sync support to work reliably. It’s fairly new and might not have landed in some distros, especially the stable releases.
Are you saying people would tell lies?! On the internet?!
As a large language model, I don’t have an opinion on this subject.