• 0 Posts
  • 12 Comments
Joined 3 years ago
Cake day: June 12th, 2023

  • I am typing this on a 5-year-old Android phone. It has 128GB of storage and 8GB of RAM, very decent cameras, a beautiful OLED screen and a processor that is more than fast enough for everything I do with it. And even now the battery still lasts two days with normal use. It cost me about €300 at the time.

    Unfortunately the Android version is getting so far behind that some apps are starting to have a few issues, so I have been checking out some Black Friday deals for new phones, but they look very disappointing.

    In the current market it seems like I’d have to pay about €500 to effectively just get a side-grade. Everything in the €300 range looks like a straight-up downgrade in every way apart from the more recent Android version.

    So I think I’ll hold on to this one a while longer. Hardware-wise it’s still in perfect condition, and if software support really becomes an issue then perhaps I’ll try out a custom ROM.


  • The main reason is tech debt and proprietary software. Most companies have decades of software infrastructure all built on Microsoft-based systems. Transitioning all that stuff to Linux is a massive investment, especially taking into account the downtime it’ll cause, combined with the temporary drop in productivity while everyone gets trained and builds up experience with the new platform.

    And then you have to deal with proprietary software. A lot of niche corporate or industrial hardware only supports Windows. And you probably have to regularly interact with customers who use Windows and share files with you that can only be opened in Windows-only proprietary software.

    Linux also frequently struggles with driver issues and other odd quirks, which puts an increased burden on the IT department.

    Basically you’re looking at a massive investment in the short term and significantly reduced productivity in the long run, and all that mostly to save a bit on hardware costs, which are only a fraction of the operating costs for most companies. Just sticking with Windows ends up being the more economical choice for most of them.



  • In this case it’s somewhat different.

    We have seen almost exactly these formations on Earth, where they are created by microbial lifeforms that could survive in the conditions we expect ancient Mars to have had when this sediment was formed.

    We have been able to reproduce similar patterns in the lab, but only in conditions with much higher temperatures or much higher acidity than we’d expect Mars to have had back then.

    So the possible options are:

    1. Ancient Mars was as we expect it to have been, and these patterns were formed by ancient Martian microbial life.

    2. These patterns were formed by a known chemical process, and ancient Mars was much hotter or more acidic (or both) than we expected based on all other research.

    3. These patterns were formed by a currently unknown chemical process that does not require the high temperature or acidity that the known processes require.

    So in this case it’s not just wishful thinking. Formation by microbial life is the hypothesis that best fits what we currently know about the conditions in which the sediment was formed (which doesn’t fully prove that it’s true, but does give it credibility). And even if option 2 or 3 ends up being the right explanation, we’ll still learn something interesting from this.





  • x86 has bit manipulation instructions for any bit. If you have a bool stored in bit 5 it doesn’t need to do any masking, it can just directly check the state of bit 5. And if you do use masking in a low-level programming language to access individual bits, the compiler’s optimizer will almost always turn it into the corresponding bit manipulation instructions.

    So there’s no performance impact even if you’re cycle limited. If you have to operate on a large number of bools, packing eight of them per byte can sometimes actually improve performance, since you use the cache more efficiently. Though unless you’re working with thousands of bools in a tight loop, you’re unlikely to really notice the difference.

    But most bool implementations still end up wasting 7 out of 8 bits (or sometimes even 15 out of 16 or 31 out of 32, to align to the word size of the device) simply because that generally produces the most readable code. Programming languages are not only designed for computers, but also for the humans who work on and maintain the code, and wasting bits in a bool happens to be the better trade-off for keeping code readable and maintainable. A small sketch of the packed alternative is below.
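
    As a rough illustration (plain C, the names are my own), this is what packing eight flags into one byte looks like; on x86, a reasonably recent compiler with optimizations enabled will typically lower these shifts and masks to single bit-test or masked-test instructions:

    ```c
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Eight flags packed into a single byte instead of eight separate bools. */
    static uint8_t flags = 0;

    static bool get_flag(unsigned bit)   { return (flags >> bit) & 1u; }
    static void set_flag(unsigned bit)   { flags |= (uint8_t)(1u << bit); }
    static void clear_flag(unsigned bit) { flags &= (uint8_t)~(1u << bit); }

    int main(void)
    {
        set_flag(5);                     /* touches only bit 5 */
        printf("%d\n", get_flag(5));     /* prints 1 */
        clear_flag(5);
        printf("%d\n", get_flag(5));     /* prints 0 */
        return 0;
    }
    ```

    On most implementations a plain bool still occupies at least a whole byte, so an array of N bools takes N bytes while the packed version takes roughly N/8, which is exactly the readability-for-space trade-off described above.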



  • Depends on viewing conditions. As of yet there isn’t an objectively superior display technology.

    OLEDs have the best contrast in a dark room as black pixels can be fully turned off, but they are generally less bright and use more power than comparable LCD TVs or monitors (especially when you compare models of a similar price range).

    LCD-based monitors and TVs can get brighter and can actually achieve a higher contrast in a well-lit room, as the black pixels on an LCD are less reflective than black pixels on an OLED, and when viewing in daylight the ambient light is more than enough to drown out the backlight bleed.

    There are also other smaller pros and cons. OLED for example has a better pixel response time, while IPS LCDs are more colour accurate. Text rendering and other fine graphics also generally look slightly sharper on an LCD than on an OLED display (when comparing displays of equal resolution / pixel density) due to the subpixel layout.



  • It was successful for a while, up until 10 years or so ago, when it was the main free option for video calling. But nowadays there are plenty of alternatives, pretty much all of which do a better job than Skype ever did.

    Skype has now been pretty much obsolete for years, so I don’t think it’s too bad that it’s ending.

    The Google approach would have been to kill it off back in 2004, before it ever even had a chance to be successful.