also I just realized that Brazil did NOT make a programming language entirely in Spanish and call it “Si” and that my professor was making a joke about C… god damn it

this post is probably too niche but I feel like Lemmy is nerdy enough that enough people will get it lol

  • DarkAri@lemmy.blahaj.zone · 3 hours ago

    I’m on mobile, so it’s hard to keep track of everything you wrote; I’ll just clarify a few points that bothered me. You are obviously very knowledgeable about these things, even more than me in many areas. I’m a hobbyist, not a professional programmer, but I’ve been programming since I was 12 or so and I’m in my 30s now, and I’ve also always been a white hat hacker.

    I don’t mean you can literally understand everything about a computer, just that you can understand everything you need to in order to do 99% of things, and that isn’t some crazy feat. You would obviously use OpenGL or Vulkan or DirectX to access the GPU instead of writing raw binaries.
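
    For example, here’s roughly what “accessing the GPU through an API” looks like in practice: a minimal clear-screen loop, assuming GLFW for window and context creation (GLFW, the window size, and the build line are just example choices, not anything from the thread):

    ```cpp
    // Minimal OpenGL loop: clear the screen via the driver each frame.
    // Build (Linux): g++ demo.cpp -lglfw -lGL
    #include <GLFW/glfw3.h>

    int main() {
        if (!glfwInit()) return 1;

        GLFWwindow* window = glfwCreateWindow(640, 480, "GPU via OpenGL", nullptr, nullptr);
        if (!window) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(window);

        while (!glfwWindowShouldClose(window)) {
            glClearColor(0.1f, 0.2f, 0.3f, 1.0f);   // state handed to the driver
            glClear(GL_COLOR_BUFFER_BIT);           // the GPU does the actual work
            glfwSwapBuffers(window);
            glfwPollEvents();
        }

        glfwTerminate();
        return 0;
    }
    ```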

    Modern machines do use several hundred watts just doing regular things. Not at idle, sure, unless you have tons of junk running in the background, but even basic tasks on modern machines, running code written in Python, Java, Electron, and web stacks, will absolutely use much of your system’s hardware for simple tasks.

    Managing memory in C++ is easy, but you have to not be stupid; C++ isn’t stupid-proof. It’s also not a good fit for some things, because people make mistakes, or attackers take advantage of the fact that C is low level and has direct memory access to exploit things. The real issue is that below a certain level of programming skill, C++ can be really unsafe. You need to understand concepts like building your own node graph with inheritance so that ownership is explicit; managing memory is easy once you understand these things. Garbage collectors are not a good solution for many things, I would argue most things. They’re easy, sure, but they also cause unpredictable pauses and break the idea of smooth-running software. You should be freeing your memory just as deliberately as you allocated it, in an organized and thoughtful way.
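
    For example, a minimal sketch of that ownership-tree idea using standard smart pointers (the Node/Sprite names are invented for the example): every node owns its children, so freeing the root frees the whole graph in one organized step, no garbage collector involved.

    ```cpp
    // Ownership tree: each node exclusively owns its children, so the
    // whole graph is freed deterministically when the root goes away.
    #include <memory>
    #include <string>
    #include <utility>
    #include <vector>

    struct Node {
        std::string name;
        std::vector<std::unique_ptr<Node>> children;  // exclusive ownership

        explicit Node(std::string n) : name(std::move(n)) {}
        virtual ~Node() = default;  // children destruct automatically

        Node& add(std::unique_ptr<Node> child) {
            children.push_back(std::move(child));
            return *children.back();
        }
    };

    struct Sprite : Node {  // inheritance hangs richer types off the same tree
        using Node::Node;
    };

    int main() {
        auto scene = std::make_unique<Node>("scene");
        auto& layer = scene->add(std::make_unique<Node>("layer"));
        layer.add(std::make_unique<Sprite>("player"));
    }   // scene is destroyed here, and the entire tree is freed with it
    ```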

    By memory bus I mean the front side bus, which suffers when you have programs running at uncapped speeds, or just programs running with 100x the overhead they would have if written in C. Again, this is basic knowledge that any programmer should pick up almost without being taught. There is no reason to let one program bottleneck your machine in an era of multitasking.
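
    As one way to avoid running uncapped, a main loop can sleep to a fixed tick rate instead of spinning flat out; a minimal sketch (the 60 Hz rate and loop bound are arbitrary example values):

    ```cpp
    // Fixed-tick main loop: do one unit of work, then sleep until the
    // next tick instead of hammering the CPU and memory bus nonstop.
    #include <chrono>
    #include <thread>

    void update() { /* one tick of real work goes here */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto tick = std::chrono::milliseconds(16);  // ~60 Hz

        auto next = clock::now();
        for (int i = 0; i < 600; ++i) {           // run ~10 seconds, then exit
            update();
            next += tick;
            std::this_thread::sleep_until(next);  // yield the rest of the tick
        }
    }
    ```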

    Writing C++ also doesn’t take longer once you’re past the first few minutes; it’s actually much quicker, because you can just write it without the language complaining about indentation or anything. It is a bit verbose with brackets and such, but those are there to support a powerful language that can do pretty much anything. There are also string libraries and the like that handle strings without the classic security issues.
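
    For instance, with std::string the fixed-buffer bookkeeping that causes classic C string bugs just isn’t there (the values are only for the example):

    ```cpp
    // std::string carries its own length and grows to fit, so there is
    // no fixed buffer to overrun the way strcpy into a char[8] would.
    #include <iostream>
    #include <string>

    int main() {
        std::string s = "hello";
        s += ", world";                 // reallocates as needed
        s.append(3, '!');               // still safe, still bounds-aware
        std::cout << s << " (" << s.size() << " chars)\n";
    }
    ```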

    Linux is also tiny without being devoid of software, because it’s written in C and things are only as large as they need to be. My entire Linux OS for my phone, with all the files I’m working on, is less than 10 GB, and it has emulators, many libraries, many applications, several web browsers, Wine, virtual machines, servers, several different development environments, different window managers, and all kinds of other stuff. On Android, installing a web browser can take hundreds of MB for essentially zero benefit.

    No benefits to WebAssembly? I guess to you that may be true, because you don’t care about optimization, download size, energy use, and things like that. It does have benefits: for one, not everyone has thousands of dollars to upgrade their computer every two years, and where I live, in a Republican state in America, the internet maxes out at 200 KB/s, maybe 500 KB/s on a good day.

    The first step in fixing a problem is admitting you have a problem. Software is only going to get worse if devs are in denial about it.

    • squaresinger@lemmy.world · 2 hours ago

      There’s one big difference between hobby work and professional work: if you do hobby stuff, you can spend as much time on it as you want, doing what you want. So you will likely do the best you can at your skill level, and you are done when you are done, or when you give up and stop caring.

      If you do professional work, there’s a budget and a deadline. There are a dozen things you need to do RIGHT NOW that needed to be finished yesterday. There’s not one person working on things but dozens, and your code doesn’t live for weeks or months but for years or decades, and will be worked on by someone else when you are long gone. It’s not rare to stumble upon 10- or 20-year-old code in most bigger applications. There are parts of the Linux kernel that are 30 years old.

      Also, in professional work you have non-technical managers dictating what to do, and often even the technical implementation. You have rapidly shifting design goals; stuff gets implemented in crunch time and then cancelled a day before release. Systems are way more complex than they need to be.

      I am currently working on the backend of the website and app for a large retail business. The project is simple, really: get content from the content managers, display a website with a webshop, handle user logins and loyalty program data. Not a ton of stuff.

      Until you realize:

      • The loyalty program is handled by what used to be a separate company but got folded into our company.
      • The webshop used to be run by the same team, but that team got spun out into its own organization within the company.
      • The user data comes from a separate system, managed by a team in a completely different organizational unit of the company.
      • That team doesn’t actually manage the user data, but only aggregates the user data and provides it in a somewhat standardized form for the backends of user-facing services. The data itself lives in an entirely separate environment managed by a different sub-company in a different country.
      • They actually don’t own the data either. They are just an interface that was made to provide customer data to the physical stores. They get their customer data from another service, managed by another sub-company, that was originally made to just handle physical newsletter subscriptions, 20 years ago.

      We are trying to overhaul this right now, and we had a meeting last week where we got someone from each of these teams around a table to figure out how different calls to the customer database actually work. It took 15 people 6 hours just to reverse-engineer the flow of two separate REST calls.

      If you see bugs and issues in software, that’s hardly ever due to bad programmers, but due to bad organizations and bad management.

      > I don’t mean you can literally understand everything about a computer, just that you can understand everything you need to in order to do 99% of things, and that isn’t some crazy feat. You would obviously use OpenGL or Vulkan or DirectX to access the GPU instead of writing raw binaries.

      This is exactly what the software crisis is, btw. With infinite time and infinite brain capacity, one could program optimally. But we don’t have infinite time, we don’t have infinite budget, and while processors get faster each year, developers just disappointingly stay human.

      So we abstract stuff away. Java is slower than C (though not by a ton), mostly because it has managed memory. Managed memory means no manual memory management issues: a whole load of potential bugs, vulnerabilities, and problems removed just by changing the language. Multitasking is also much, much easier in Java than in C.

      Now you choose a framework like Spring Boot. Yes, it’s big, and yes, you won’t use most of it, but it also means you don’t need to reimplement REST and request handling. Another chunk of work and potential bugs removed just by installing a dependency. And so on.

      To put it differently: how much does a, say, 20% slowdown due to managed memory cost a corporation?

      How much does a critical security vulnerability due to a buffer overflow cost a corporation?
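
      For concreteness, this is the class of bug being traded against that slowdown; a deliberately minimal sketch (names and sizes invented for the example) that compiles fine and is undefined behavior the moment the input outgrows the buffer:

      ```cpp
      // Classic stack buffer overflow: strcpy does no bounds check, so any
      // argument longer than 15 characters writes past 'buf'. A managed,
      // bounds-checked language would throw instead of corrupting the stack.
      #include <cstdio>
      #include <cstring>

      void greet(const char* name) {
          char buf[16];
          strcpy(buf, name);            // no bounds check: the vulnerability
          printf("hello, %s\n", buf);
      }

      int main(int argc, char** argv) {
          greet(argc > 1 ? argv[1] : "world");  // pass a long arg to trigger it
      }
      ```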

      Hobby development and professional development aren’t that similar when it comes to all that stuff.

      • DarkAri@lemmy.blahaj.zone · 2 hours ago

        Maybe it’s sort of a tragedy of the commons thing. Maybe new standards should support limiting the resource use of software, with defaults low enough that companies would be forced to allow time to write good code or their product would be unusable on 90% of machines. That might actually fix the issue: companies could still push programmers to churn out code as fast as possible, but they would have to allow time to optimize and clean up (one existing mechanism along these lines is sketched below).

        Idk. I just deal with it by avoiding all that stuff, because I actually have enough willpower to stop using something out of principle even when it’s more convenient, which I realize is rare. Most people just want their TikTok OS, and they don’t care if they have to pay $1000 for a device that’s a glorified streaming media player.

        I’m glad Linux exists and that it’s still written in C. I’m going to release a game some day, and I’m going to target Linux as the native client, and I don’t care if I lose 80% of my customers. I want to be part of the solution and not the problem, but I understand that survival and keeping a job are important to someone like you. Anyways, good talk.

        Windows XP and 7 were much better than any modern operating system ever will be. Linux is catching up fast, and we will probably all be on Linux running C code before long, given the state of the industry. I can’t even use Windows anymore. Much of the web is going the same way: purely profit-driven, run by publicly traded companies that hate humans, always brownnosing the state and their corporate sponsors so the gestapo doesn’t come for their profits next.
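
        For what it’s worth, one mechanism in that direction already exists on Linux/POSIX: a process (or a wrapper around one) can cap its resource use with setrlimit. A minimal sketch, where the 256 MiB cap is an arbitrary example value:

        ```cpp
        // Cap this process's address space with POSIX setrlimit: once the
        // cap is hit, allocations fail instead of dragging the machine down.
        #include <sys/resource.h>
        #include <cstdio>
        #include <cstdlib>

        int main() {
            rlimit lim{};
            lim.rlim_cur = 256ull * 1024 * 1024;  // soft cap: 256 MiB
            lim.rlim_max = 256ull * 1024 * 1024;  // hard cap
            if (setrlimit(RLIMIT_AS, &lim) != 0) {
                std::perror("setrlimit");
                return 1;
            }

            // malloc now returns NULL at the cap instead of growing forever.
            std::size_t total = 0;
            while (std::malloc(16u * 1024 * 1024) != nullptr) {
                total += 16;   // MiB handed out so far (leaked on purpose for the demo)
            }
            std::printf("allocation stopped at ~%zu MiB\n", total);
        }
        ```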