

The enshittification of Duolingo has already been going on for quite a while. It has really gone downhill in the last few years.
Depends on viewing conditions. As of yet there isn’t an objectively superior display technology.
OLEDs have the best contrast in a dark room as black pixels can be fully turned off, but they are generally less bright and use more power than comparable LCD TVs or monitors (especially when you compare models of a similar price range).
LCD-based monitors and TVs can get brighter, and can actually achieve higher contrast in a well-lit room: black pixels on an LCD are less reflective than black pixels on an OLED, and in daylight the ambient light is more than enough to drown out the backlight bleed.
There are also other smaller pros and cons. OLED for example has a better pixel response time, while IPS LCDs are more colour accurate. Text rendering and other fine graphics also generally look slightly sharper on an LCD than on an OLED display (when comparing displays of equal resolution / pixel density) due to the subpixel layout.
Any guesses how long it will take for someone to use this jailbreak to get Doom to run on just the CPU?
In theory, at least some of the affected processors should have more than enough cache to run it directly from there, right?
Though I have to admit that I don’t understand CPU internals well enough to know if the microcode even has enough control over the chip to make that physically possible.
It was successful for a while, up until 10 years or so ago, when it was the main free option for video calling. But nowadays there are plenty of alternatives, pretty much all of which do a better job than Skype ever did.
Skype has been pretty much obsolete for years now, so I don’t think it’s too bad that it’s ending.
The Google approach would have been to kill it off already in 2004, before it ever even had a chance to be successful.
x86 has bit manipulation instructions for any bit. If you have a bool stored in bit 5, it doesn’t need to do any masking; it can just directly check the state of bit 5. If you do masking in a low-level programming language to access individual bits, compiler optimizations will almost always turn it into the corresponding bit manipulation instructions.
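Rough sketch in C of what I mean (the flag name and bit position are made up for illustration):

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical flags byte; bit 5 is just an arbitrary example position. */
#define FLAG_BIT 5

bool is_flag_set(uint8_t flags)
{
    /* Written as a shift-and-mask, but on x86 compilers typically
       lower this to a single bit-test style check rather than
       actually doing the shift and mask at runtime. */
    return (flags >> FLAG_BIT) & 1;
}
```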
So there’s not even a performance impact if you’re cycle limited. If you have to operate on a large number of bools, packing 8 of them into each byte can sometimes actually improve performance, as you can then use the cache more efficiently. Though unless you’re working with thousands of bools in a fast running loop, you’re likely not going to notice the difference.
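A minimal sketch of that kind of packing in C (the helper names are made up); a 64-byte cache line then holds 512 flags instead of 64 byte-sized bools:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Set or clear flag i in a packed bit array. */
static inline void bitset_set(uint8_t *bits, size_t i, bool value)
{
    if (value)
        bits[i / 8] |= (uint8_t)(1u << (i % 8));
    else
        bits[i / 8] &= (uint8_t)~(1u << (i % 8));
}

/* Read flag i from a packed bit array. */
static inline bool bitset_get(const uint8_t *bits, size_t i)
{
    return (bits[i / 8] >> (i % 8)) & 1;
}
```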
But most bool implementations still end up wasting 7 out of 8 bits (or even 15 out of 16 or 31 out of 32 to align to the word size of the device), simply because that generally produces the most readable code. Programming languages are not only designed for computers, but also for humans to work on and maintain, and wasting bits in a bool is the better trade-off for keeping code readable and maintainable.