• msage@programming.dev
    3 days ago

    LOL at that.

    LLMs need to disappear before that happens.

    In order to not have any bugs, and for anything to produce perfect software, you need to define perfect business rules, and if managers could do that, they wouldn’t have needed developers for decades.

    If we have AI that can produce perfect code, you won’t have access to it. Why give everyone something so powerful when you can run circles around everyone by keeping it to yourself?

    • Bazoogle@lemmy.world
      17 hours ago

      If we have AI that can produce the perfect code, you won’t have access to it.

      If one company can make it, then others will make it too. Someone will be first, but the rest will follow. It is too critical to each country’s national security not to research it themselves, let alone the profit the companies stand to make. It will definitely take longer before someone like me gets access, and even longer before it is cost-effective, but it will eventually happen.

      In order to not have any bugs

      I should have been clearer: I meant exploitable vulnerabilities in the software. “Bugs” and “features” can overlap, but that’s not what I meant. The only attack surface left would be the human one, which would remain a massive vulnerability, just as it is today.

      • msage@programming.dev
        5 hours ago

        That’s not how anything works.

        You are assuming a god-like coder entity that can consider everything, and that’s a whole new problem we can’t solve right now.

        And if it’s a matter of national security, it won’t be shared with others, so if one country stumbles upon it, the others won’t know how.