My instance went down, so I’m way too late to make this joke, but anyways:
We’re not cantankerous, just a little …crabby. 🙃
I hear it helps with saving up for treatment by not paying for nudes. 🥴

Gonna be interesting if they spontaneously decide they wanted to open-source it all along, like how LLaMA did back then…

The shape kind of reminds me of those hearse cars:



Yeah, I might block a contributor on sight if they post something like that.


In case you’re not aware, you can also email the dev. You can code up your commits as normal and then use e.g. git format-patch -3 to turn the last 3 commits into patch files. You can then attach those files to an email and the dev can apply those patches with git am.
It takes a bit of playing around, but it’s actually really easy.
The Linux kernel, one of the most complex projects on the planet, develops like this.
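To make the round trip concrete, here’s a minimal sketch using throwaway temp directories, with one repo playing both the contributor’s and the maintainer’s side (the commit messages and file names are just placeholders):

```shell
# Sketch of the git format-patch / git am round trip in a throwaway repo.
repo=$(mktemp -d); patches=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email you@example.com
git config user.name "You"
for i in 1 2 3 4; do
  echo "change $i" > file.txt
  git add file.txt
  git commit -qm "commit $i"
done
# Contributor side: write the last 3 commits out as patch files.
git format-patch -3 -o "$patches" >/dev/null
# Maintainer side (simulated): rewind to the base commit, then apply the
# emailed patches on top of it.
git reset -q --hard HEAD~3
git am -q "$patches"/*.patch
git log --oneline | wc -l   # back to 4 commits
```

The patch files are plain text, which is why attaching them to an email (or even pasting them inline, as kernel folks do) works fine.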


I think you could open the same file multiple times and then just skip ahead by some number of bytes before you start reading.
But yeah, no idea if this would actually be efficient. The bottleneck is likely still the hard drive and trying to fit multiple sections of the file into RAM might end up being worse than reading linearly…
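As a small illustration of the idea (with dd standing in for two independent readers of the same file, and a temp file as the input):

```shell
# Two independent reads of the same file: the second one opens it again
# and skips past the first half before reading.
f=$(mktemp)
printf 'abcdefghij' > "$f"
size=$(wc -c < "$f")
half=$((size / 2))
part1=$(dd if="$f" bs=1 count="$half" 2>/dev/null)  # reader 1: first half
part2=$(dd if="$f" bs=1 skip="$half" 2>/dev/null)   # reader 2: second half
echo "$part1"   # abcde
echo "$part2"   # fghij
```

In a real program you’d do the same thing with seek()/pread() on separate file handles, one per thread.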


Yeah, and the worst part is that submitting the PR is trivial. You just offload the reviewing work onto the maintainer and then feed the review comments back into the AI. Effectively, you’re making the maintainer talk to the AI by going through you as a middleman, a.k.a. completely wasting their time.


Yeah, I always argue for as much as possible to be automated offline. Ideally, I’d like the CI/CD job to trigger just one command, which is what you’d trigger offline as well.
In practice, that doesn’t always work out. Because the runners aren’t insanely beefy, you need to split up your tasks into multiple jobs, so that they can be put onto multiple runners.
And that means you need to trigger multiple partial commands and need additional logic in the CI/CD to download any previous artifacts and upload the results.
It also means you can restart intermediate jobs.
But yeah, I do often wonder whether that’s really worth the added complexity…
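The one-entry-point idea can be sketched like this (a hypothetical script with made-up task names; locally you run it bare, while each CI runner passes it one sub-task):

```shell
# Hypothetical single entry point: the same script runs the full pipeline
# locally and one partial task per CI runner.
script=$(mktemp)
cat > "$script" <<'EOF'
#!/bin/sh
set -e
case "${1:-all}" in
  lint) echo "linting..." ;;
  test) echo "testing..." ;;
  all)  "$0" lint && "$0" test ;;
  *)    echo "unknown task: $1" >&2; exit 1 ;;
esac
EOF
chmod +x "$script"
"$script"        # locally: the full pipeline
"$script" lint   # in CI: each runner triggers one partial command
```

The artifact download/upload glue then lives only in the CI config, while all the actual logic stays runnable offline.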
You can get some brands which have a pinch of salt added, but in my experience, most brands don’t…
I always thought openSUSE’s package manager zypper has quite a few neat ideas:
- Abbreviated subcommands: zypper install → zypper in, update → up, remove → rm.
- You can pass several packages at once, e.g. zypper in fish git texlive.
- zypper repos gives you a list of your repositories, numbered 1, 2, 3 etc., and then if you want to remove a repo, you can run zypper removerepo 3.
- zypper search prints the results in a nicely formatted table.
- Documentation: https://doc.opensuse.org/documentation/tumbleweed/zypper/
Hmm, I don’t know about Pacman, but for example openSUSE’s zypper remove has a --clean-deps flag, which doesn’t exist on the other subcommands. So, it wouldn’t make sense to have it be zypper --remove --clean-deps…


Food packaging also really irks me. Technically, it has a use beyond being thrown away, but there’s just so much of it. You can readily find products in the shops that should be advertised as “plastic trash” and they just stuck a bit of food inside to keep it in shape…
Yeah, this is one of those issues that I feel separates the seniors from the, uh, less experienced seniors. (Let’s be real, as a junior, you know jackshit about this.)
Knowing when to use an ORM, when to use SQL vs. NoSQL, all of that is stuff you basically only learn through experience. And experience means building multiple larger applications with different database technologies, bringing them into production and seeing them evolve over time.
It takes multiple years to do that for one application, so you need a decade or more of experience to be able to have somewhat of an opinion.
And of course, it is all too easy to never explore outside of your pond, to always have similar problems to solve, where an SQL database does the job well enough, so a decade of experience is not a guarantee of anything either…


Yeah, I really wonder what their thought process was. Are you supposed to bid on multiple foods, so that if you get outbid, you can fall back to the next one?


When you ring the doorbell to pick it up, they quickly chuck it into the microwave. 🙃
It’s key-based client authentication. Just open your SSH key’s .pub file in Microsoft Publisher, then export to PDF.


I do agree, yeah, although I can certainly also understand LISP fans being annoyed that someone created a custom DSL for something that is adequately solved by the LISPs. I’m also certainly not enamored with the Nix syntax myself, but do find it easier to parse than a million parentheses.
But yeah, ultimately the complexity of Nix and Guix isn’t in the particular symbols you type out. The complexity comes from them being expression-based (which does make sense for the use-case, but isn’t as familiar as e.g. imperative languages), as well as just having to learn tons of modules for the different things you want to configure…


Wikipedia seems to do a decent enough job defining it:
Authoritarianism is a political system characterized by the rejection of political plurality, the use of strong central power to preserve the political status quo, and reductions in democracy, separation of powers, civil liberties, and the rule of law.
But basically, my point is that politics is a constant work in progress, no matter the political system.
It’s Apple’s programming language, kind of intended as a successor to Objective-C.
From what I hear, it’s actually decently designed and has quite a few similarities to Rust. Still not sure how great it is outside of the Apple ecosystem…