

What advances?


I guess the idea just didn’t have legs!


LLMs don’t have anything to do with abstract ideas; they quite literally produce derivative content based on their training data & prompt.


The same can be said of the approach described in the article: the “GPLv4” would be useless unless the resulting weights are considered a derivative product.
A paint manufacturer can’t claim copyright on paintings made using that paint.


Seems like the easiest fix is to consider the output of LLMs to be a derivative product of the training data.
No need for a new license: if you’re training on GPL code, the code produced by LLMs is GPL.


Hey I’m a human, but have you considered letting the Lord our Savior Tux into your boot partition?


Yeah, the article is pretty trash.
“15% of the top subs contain corporate propaganda” becomes “15% of the subs are compromised”; “compromised” means something stronger than “contains propaganda” to me.


Wasn’t Digg 2.0 what “created” Reddit, or at least gave it critical mass (along with the child porn, obviously)?


Lemmy has VC funding?
I actually think the lack of funding means Lemmy doesn’t have the same incentives to enshittify itself.
flatpaks are all updated at once, just like distro packages, so yeah, you might need two commands, but that’s still very different to having each application update itself (and the security hell implied by that)
Also I think pkcon can manage your updates across various backends (unless you are on Arch, where I think there are both technical & ideological objections to having a simple tool that just works)
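For what it’s worth, the “two commands” in question amount to something like this (a sketch, assuming the distro ships PackageKit’s `pkcon` and Flatpak):

```shell
#!/bin/sh
# Refresh and apply updates for native packages via PackageKit,
# which abstracts over apt/dnf/zypper backends.
pkcon refresh
pkcon update

# Update every installed Flatpak in one go (no per-app updaters).
flatpak update
```

Each tool batches its own updates, so nothing on the system is phoning home on its own schedule.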
Do you have examples?
Because most of what you are listing is stuff that has been using ML for years (possibly decades in the case of meteorology) and has just had “AI” slapped on as a buzzword.