Please stop saying this.
There’s jobs you can do on a day of training, and there’s jobs that need six years of higher education to not kill people.

For perspective, this is 4% of the revenue they pretend they’ll make. The trillion-dollar fantasy is obviously not happening - but if even an embarrassing sliver of that happens for real, then this was just an investment.
Well, that’s the soundtrack of my brain for the next week.


Frustrating part A is that we have a universal binary format… and it’s HTML5. Frustrating part B is that nobody with a purchasing department wants to admit it. Slack ships with its own browser like you don’t have one. Modern web games can run on a sufficiently fancy Amiga, yet there have been Electron apps without a Linux version. That Amiga’s gonna suffer overhead and unstable performance, but I mean, so do native Unreal 5 games.
The good ending from here would be a period of buck-wild development. RISC-V, MIPS, finally doing that guy’s Mill CPU. I was gonna say that neural networks might finally get high parallelism taken seriously, but no, optimized matrix algebra will stay relegated to specialty hardware. Somewhere between a GPU and an FPU. There’s server chips with a hundred cores and it still hasn’t revived Tilera. They’re just running more stuff, at normal speed.
The few things that need to happen quickly instead of a lot will probably push FPGAs toward the mainstream. The finance-bro firehose of money barely splashed them, back when high-frequency trading was the hot new thing. Oh yeah: I guess some exchanges put in a few hundred entire microseconds of coiled fiber optics, to keep the market comprehensible. Anyway, big FPGAs at sane prices would be great for experimentation, as the hardware market splinters into anything with an LLVM back-end. Also nice for anything you need to happen a zillion times a second on one AA battery, but neural networks will probably cover that as well, anywhere accuracy is negotiable.
Sheer quantity of memory will be a deciding factor for a while. Phones and laptops put us in a weird place where 6 GB was considered plenty, for over a decade. DRAM sucks battery and SRAM is priced like it’s hand-etched by artisanal craftsmen. Now this AI summer has produced guides like ‘If you only have 96 GB of VRAM, set it to FP8. Peasant.’ Then again - with SSDs, maybe anything that’s not state is just cache. Occasionally your program hitches for an entire millisecond. Even a spinning disk makes a terabyte of swap dirrrt cheap. That and patience will run any damn thing.


Fortunately a lot of early Windows shit runs in Wine, since the most stable Linux API is Win32. Anything older than that either works in 86box or was broken to begin with. Okay, that’s not fair - WineVDM is necessary to bridge the gap for the dozen Windows 3.1 programs that matter. I am never allowed to write those off when one of them is Castle Of The Winds.
What Intel learned with Itanium is that compatibility is god. They thought their big thing was good chip design and modern foundries. They were stupid. AMD understood that what kept Intel relevant was last year’s software running better this year. This was evident back in the 486 days, when AMD was kicking their ass in terms of cycles per operation - their chips finished some network benchmarks in under one millisecond, and the throughput math threw division-by-zero errors.
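A minimal sketch of that failure mode, assuming nothing about the actual benchmark beyond what’s described above: time the work with a millisecond-granularity clock, divide the work by the elapsed ticks, and the math falls over as soon as the work finishes inside a single tick.

    /* Illustrative only - not the real benchmark. This is the arithmetic that
     * old benchmark suites hit when a faster CPU finished the whole run
     * inside one timer tick. */
    #define _POSIX_C_SOURCE 199309L
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        const long packets = 1000;          /* pretend network workload */
        struct timespec t0, t1;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (volatile long i = 0; i < packets; i++)
            ;                               /* "send" packets; finishes almost instantly */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        /* Truncate to whole milliseconds, the way period-appropriate timers did. */
        long elapsed_ms = (t1.tv_sec - t0.tv_sec) * 1000
                        + (t1.tv_nsec - t0.tv_nsec) / 1000000;

        /* On any fast-enough CPU, elapsed_ms is 0 here, so this line dies
         * with a divide-by-zero. */
        printf("%ld packets per millisecond\n", packets / elapsed_ms);
        return 0;
    }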
But software has won.
The open architecture of RISC-V is feasible mostly because architecture doesn’t fucking matter. People are running Steam on their goddamn phones. It’s not because ARM is amazing; it’s because machine code is irrelevant. Intermediate formats can be forced upon even proprietary native programs. Macs get one last gasp of custom bullshit, with Metal just barely predating Vulkan, and if they try anything unique after that then it’s a deliberate waste of everyone’s time. We are entering an era where all software for major platforms should Just Work.


“Oh come on, Samson, it’s nothing you haven’t seen before.”
“He IS something I’ve never seen before!”


“Just” read documentation, says someone assuming past documentation is accurate, comprehensible, and relevant.
I taught myself QBASIC from the help files. I still found Open Watcom’s documentation frankly terrible, bordering on useless. There’s comments in the original Doom source code lamenting how shite the dead-tree books were.


“Due to some disagreements—some recent; some tolerated for close to 2 decades—with how collaboration should work, we’ve decided that the best course of action was to fork the project.”
Okay, that was always allowed!
Programming is the weirdest place for kneejerk opposition to anything labeled AI, because we’ve been trying to automate our jobs for most of a century. Artists will juke from ‘the quality is bad!’ to ‘the quality doesn’t matter!’ the moment their field becomes legitimately vulnerable. Most programmers would love it if the robot did the thing we wanted. That’s like 90% of what we’re looking for in the first place. If writing ‘is Linux in dark mode?’ counted as code, we’d gladly use that, instead of doing some arcane low-level bullshit. I say this as someone who has recently read through IBM’s CGA documentation to puzzle out low-level bullshit.
You have to check if it works. But if it works… what is anyone bitching about?
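For the curious, here’s roughly what ‘is Linux in dark mode?’ costs when you ask by hand - a sketch that assumes a GNOME desktop with gsettings on the PATH, since KDE and friends each want a different incantation, which is rather the point.

    /* Hypothetical helper: reads GNOME's color-scheme preference via gsettings.
     * Only covers GNOME 42+; other desktops need entirely different plumbing. */
    #define _POSIX_C_SOURCE 200809L
    #include <stdio.h>
    #include <string.h>

    static int linux_prefers_dark(void)
    {
        char buf[128] = {0};
        FILE *p = popen("gsettings get org.gnome.desktop.interface color-scheme", "r");
        if (!p)
            return 0;                        /* can't even ask: assume light */
        if (!fgets(buf, sizeof buf, p))
            buf[0] = '\0';
        pclose(p);
        return strstr(buf, "prefer-dark") != NULL;   /* value looks like 'prefer-dark' */
    }

    int main(void)
    {
        puts(linux_prefers_dark() ? "dark mode" : "light mode");
        return 0;
    }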


Windows ME broke DOS support just to pretend it wasn’t 9x in a new hat. At the time - this was kind of a big deal. 98 was objectively better for any typical suburban setup.
XP was also NT in a new hat, but they had it fake all the bugs that popular games expected. Plus the hat was nice. That Fisher-Price UI era was genuinely great, especially compared to modern ultra-flat nonsense. Windows 95 had instantly visible hierarchy in sixteen colors. Nowadays you can’t even tell what’s clickable without guesswork and memorization. Or at best you get a dozen indistinct monochrome icons.
Windows 7 was the only time Microsoft nailed everything. Each XP service pack broke and fixed a random assortment of features. Everything from 8 onward is broken on purpose, first for the stupid tablet interface (when WinCE’s 9x UI worked just fucking fine on 3" screens with Super Nintendo resolutions), then to openly betray all trust and control. I would still be using Windows 7 today if modern malware wasn’t so scary. It’s not even about vulnerability - I must have reinstalled XP once a month, thanks to the sketchiest codec packs ever published. But since I can’t back up my whole hard drive on five dollars’ worth of DVD-Rs, the existence of ransomware pushed me back to Linux Mint.


Oh my god, they Windows ME’d it.
And Henry Blake paddled a life raft onto the Tracey Ullman show.


Humans will always outsmart the chatbot. If the only thing keeping information private is the chatbot recognizing it’s being outsmarted, don’t include private information.
As for ‘how do I…?’ followed by a crime - if you can tease it out of the chatbot, then the information is readily available on the internet.
Dreamcast: the first AI console.
It really was ahead on everything!


Couldn’t spell it.


Yeah, buried at the end. Any style guide (or common sense) will tell you to expand an acronym the first time it comes up.


DRM has a much more commonplace meaning that has also been a contentious topic for FOSS development.
I level the same complaint about online newspapers that cover local politics without bothering to specify where the hell they are. ‘If you already knew then you’d plainly know.’ Okay, what if I fucking don’t? How does one divine this information? If I search for Greenville or Jackson County, is your podunk locale the first thing that comes up?
Fairly relevant video from Your Dinosaurs Are Wrong: Air Hulks.