
  • 0 Posts
  • 21 Comments
Joined 3 years ago
Cake day: April 11th, 2022


  • It doesn’t help that they keep deprecating and changing standard functionality every other version. It’s as if they can’t make up their minds and everything is subject to change. Updating to the most recent release can suddenly produce tens or hundreds of compiler warnings/errors, and things may no longer behave the same. Then you look up the new documentation and realize you have to refactor a large part of the codebase because the “new way” is, for whatever reason, vastly different.


  • Westmere Xeon processors are still quite OK imo. I have an old enterprise machine with one; 12 threads at 2.6 GHz is still quite usable for many things. I mostly use it to compile larger software. But personally I’d argue that Loongson is already far better than Intel/AMD, since Loongson is based on MIPS, which is a RISC architecture, while Intel/AMD still cling to their bloated and way too power-hungry CISC crap. Plus, today most performance comes from parallelism and cache size rather than core frequency, and Loongson already has 128- and 256-bit vector instructions (LSX/LASX) in its ISA, which is pretty decent. Maybe they can figure out a 512-bit vector extension that doesn’t severely throttle the CPU when using it before Intel can, lol.
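
A rough idea of what 128/256-bit vector code looks like, using GCC/Clang’s portable vector extension instead of Loongson-specific intrinsics (the compiler lowers this to LASX on LoongArch or AVX2 on x86; the 256-bit width here is just my pick for illustration):

```c
#include <stdio.h>

/* Eight 32-bit floats = one 256-bit vector register (e.g. LASX or AVX2). */
typedef float v8f32 __attribute__((vector_size(32)));

int main(void)
{
    v8f32 a = {1, 2, 3, 4, 5, 6, 7, 8};
    v8f32 b = {8, 7, 6, 5, 4, 3, 2, 1};

    /* A single vector add processes all eight lanes at once. */
    v8f32 c = a + b;

    for (int i = 0; i < 8; i++)
        printf("%.0f ", c[i]);   /* prints: 9 9 9 9 9 9 9 9 */
    printf("\n");
    return 0;
}
```

Compile with -O2 plus the target’s vector flag (-mavx2 on x86; GCC’s LoongArch port has equivalent LASX options) and that add becomes one instruction instead of eight.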


  • I have the same experience. I wrote a simple program with SDL2 to test a software renderer. All it does is create a window, go into an event loop, and after each iteration stream a framebuffer to a texture that gets displayed in the window. In the default mode (X11) my frame timings fluctuated a lot, and for a while I tried to massage the code to get them stable because I was convinced the problem was my draw code. Then I eventually forced SDL2 to use Wayland, and not only did the draw time per frame go down by 2 ms, the fluctuations went away completely.
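
For reference, a minimal version of that kind of test program (the window size, pixel format, and printed timing are my choices here, not anything from the original code); forcing the Wayland backend is just a matter of setting SDL_VIDEODRIVER=wayland in the environment before launch:

```c
#include <SDL2/SDL.h>
#include <stdio.h>
#include <stdint.h>

#define W 640
#define H 480

int main(void)
{
    /* Run as: SDL_VIDEODRIVER=wayland ./demo  to force the Wayland backend. */
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Window *win = SDL_CreateWindow("swrender test",
            SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, W, H, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

    /* Streaming texture: the CPU-side framebuffer is uploaded every frame. */
    SDL_Texture *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
            SDL_TEXTUREACCESS_STREAMING, W, H);

    static uint32_t fb[W * H];  /* the software renderer draws into this */

    int running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = 0;

        uint64_t t0 = SDL_GetPerformanceCounter();

        /* ... software rendering into fb would go here ... */

        SDL_UpdateTexture(tex, NULL, fb, W * sizeof(uint32_t));
        SDL_RenderCopy(ren, tex, NULL, NULL);
        SDL_RenderPresent(ren);

        uint64_t t1 = SDL_GetPerformanceCounter();
        printf("frame: %.2f ms\n",
               (t1 - t0) * 1000.0 / SDL_GetPerformanceFrequency());
    }

    SDL_DestroyTexture(tex);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

The interesting part is comparing the printed frame times between a default run and one launched with SDL_VIDEODRIVER=wayland.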


  • It’s depressing to see so many AI-powered FOSS projects that are basically just a chatbot/autocomplete or something that can spit out some images, when there are so many cool things you could use neural nets for. For instance, as a FLOSS enthusiast, a tool to help with reverse engineering proprietary binaries, specifically firmware and driver blobs, would be awesome and could permanently change computing for the better. But everyone in the West seems more concerned with how they can use neural nets to reduce production costs and increase profits.


  • Exactly. Also, someone can release only parts of their software’s source code, license that part under a permissive license like MIT, BSD, Apache, CDDL, etc., and still claim that their software is “open source”. And usually in that case the released source code just so happens to be mostly wrapper/glue code that calls out into closed-source binaries, which is where the actual magic happens.
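
To make the pattern concrete, here’s a sketch of what such a permissively licensed shim can look like (libfrob_core.so and frob_process are made-up names; only the wrapper below would ever be published as source):

```c
/* frob.c -- the "open source" part, MIT licensed.
   All it does is forward calls into a proprietary blob. */
#include <dlfcn.h>
#include <stdio.h>
#include <stdlib.h>

/* libfrob_core.so and frob_process() are hypothetical names;
   the closed-source core ships only as a binary. */
static int (*core_process)(const unsigned char *buf, int len);

static void frob_init(void)
{
    void *blob = dlopen("libfrob_core.so", RTLD_NOW);
    if (!blob) {
        fprintf(stderr, "cannot load proprietary core: %s\n", dlerror());
        exit(1);
    }
    core_process = (int (*)(const unsigned char *, int))
        dlsym(blob, "frob_process");
}

/* The documented "open" API: a one-line wrapper. */
int frob(const unsigned char *buf, int len)
{
    if (!core_process)
        frob_init();
    return core_process(buf, len);  /* the actual magic happens in the blob */
}
```

Everything of value sits behind dlsym(), so the MIT-licensed wrapper tells you nothing about how the core actually works.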