• TropicalDingdong@lemmy.world
    9 months ago

    Bro if you could get there just by prompting, it would be.

    There are no models good enough to just ask for something to be done and it gets done.

    There will be someday though.

    • Scrubbles@poptalk.scrubbles.tech
      9 months ago

      Build an entire ecosystem, with multiple frontends, apps, databases, admin portals. It needs to work with my industry. Make it run cheap on the cloud. Also make sure it’s pretty.

      The prompts are getting so large we may need to make some sort of… Structured language to pipe into… a device that would… compile it all…

        • Scrubbles@poptalk.scrubbles.tech
          9 months ago

          Perfect! We’ll just write out the definition of the product completely in Jira, in a specific way, so the application can understand it - tweak until it’s perfect, write unit tests around our Jira to make sure those all work - maybe we write a structured way to describe each item aaand we’ve reinvented programming.

          I see where you’re going, but I’ve worked with AI models in depth for the last year, and there’s some really cool stuff they can do. However, truly learning about them means learning their hard pitfalls, and LLMs as they exist today would not be able to build an entire application. They can help speed up parts of it, but more context means much more VRAM (and eventually larger models), and that’s just to get code spit out. Not to mention there is nuance in English that’s hard to express, that requirements are never perfect, that LLMs can’t iterate for very long before they run out of VRAM, that they can’t do devops or hook into running apps - the list goes on.
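          The VRAM point can be made concrete with a quick back-of-the-envelope sketch. All the model dimensions below are assumptions for a generic 7B-class transformer, not the specs of any particular model:

```python
# Back-of-the-envelope KV-cache growth for a hypothetical 7B-class
# transformer; layer/head counts are illustrative assumptions, not
# the specs of any particular model.
LAYERS, HEADS, HEAD_DIM = 32, 32, 128
BYTES_PER_VALUE = 2  # fp16

def kv_cache_gib(context_tokens: int) -> float:
    """VRAM needed just to cache keys and values for one sequence."""
    per_token = 2 * LAYERS * HEADS * HEAD_DIM * BYTES_PER_VALUE  # K and V
    return context_tokens * per_token / 2**30

for ctx in (4_096, 32_768, 131_072):
    print(f"{ctx:>7} tokens -> {kv_cache_gib(ctx):5.1f} GiB of KV cache")
```

          With these assumed numbers the cache alone grows linearly with context - roughly 2 GiB at 4k tokens, 16 GiB at 32k, 64 GiB at 128k - and attention compute grows quadratically on top of that, before counting the model weights themselves.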

          AI has been overhyped by businesses because they’re frothing at the mouth to automate everyone away - which is too bad, because the things it does well it does really well, within limits. This is my… 3rd or 4th cycle where business has assumed it can automate away engineers, and each time it just ends up generating new problems that need to be solved. Our jobs will evolve, sure, but we’re not going away.

          • TropicalDingdong@lemmy.world
            9 months ago

            I mean, I had beta access to ChatGPT and have gotten excellent results from clever use, so I don’t appreciate the appeal to authority.

            No, the jobs are going away and you are delusional if you think otherwise. ChatGPT is the Deep Blue of these kinds of models, and a global effort is being made to get to the AlphaGo level of these models. It will happen, probably in weeks to months. A company like Microsoft, for example, could build something like this, never release it to the public, and, if successful, suddenly out-compete every other software company on the planet. 100%.

            Your attitude is a carbon copy of the same naysaying attitude that could be seen all over Hacker News before ChatGPT found its way to the front page. That AI wasn’t ever going to do X, Y, or Z. Then it does. Then the goalposts have to move.

            AI will be writing end-to-end architecture, writing the requirements documents, filling out the Jira tickets, building the unit tests. If you don’t think that a company would LOVE to part with its 250k+ per year software engineers, bro…

            • Scrubbles@poptalk.scrubbles.tech
              9 months ago

              lol okay dude. You flippantly ignored all of the limitations I pointed out. Sure, it could happen, but not on the timeline you’re discussing. There is no way they will have replaced software engineers within a year - I call absolute BS on that. I doubt it will rise above Copilot within a year. I see it being used alongside code for a long time, calling out potential issues, optimizing where it can, and helping with things like building out YAML files. It cannot handle an entire solution - the hardware doesn’t exist for it. It also can’t handle specific contexts for business use cases. Again, maybe someday, but it’ll be a while - and even then our jobs shift to building out models and structuring AI prompts in a stable way.

              My attitude is the same because these are the same issues it’s always faced. I’m not arguing that it’s not a great tool, and I see a lot of places for it. But it’s naive to say that it can replace an engineer at its current stage, or in the near future. Anyone who has worked with it would tell you that.

              I firmly do think companies want to replace their 250k engineers. That’s exactly why I know most of it is hype. The same hype that existed 20 years ago when they came out with visual designers for UIs, the same hype when React and frontend frameworks came out. Python was built to let anyone code, and that was another “end of engineers”. Cloud claimed it could remove entire IT departments, but those jobs just shifted to DevOps engineers. The goalposts moved each time, but the demand for qualified engineers went up, because now they needed to know these new technologies.

              Why do you think I’ve worked with AI so much over the last year? I see my job evolving, and I’m getting ready for it. This has happened before - those who don’t learn the new tech get left behind, and those who do keep going. I may not be coding in Python in 10 years - god knows I’m not doing what I was doing 10 years ago - but it’s laughable to me to think that engineers are done and over with.

              • TropicalDingdong@lemmy.world
                9 months ago

                You seem mad and strongly opinionated, but I hate arguing when there is nothing on the line. Would you be interested in a gentleman’s bet then?

                My thesis is that we’ll have (or someone will - you and I may not have access) a form of interactive AI that can effectively code a large-ish application (like a website) from scratch, make changes to it, add features, etc., in the next few years. Like, very few.

                I’d like to come to terms with you and lay down a bet. If need be, we can start a sublemmy to post the bet publicly, and we can be held accountable via public shaming if we fail to put up.

                For the purposes of the bet, I suggest a code base ‘as complicated’ as Lemmy as a good barometer. Getting this prediction right will mean showing you an example of it happening in the media, or, ideally, showing it in use. Media coverage should be considered acceptable.

                In my circles, we usually make these bets in beers or bottles of the counterparty’s favorite drink, and I’m willing to offer you the following terms: 3:1 in the first year, 2:1 in the second year, and 1:1 in the third year. If the above thesis isn’t confirmed, I’m wrong, and I’ll make it clear that I acknowledge that I’m wrong.

                I would like to bet 12 bottles on my thesis at the above terms (where a case of 12 bottles of the preferred liquor or beer or whatever does not exceed $200 - so a 12-pack of good beer or mid-tier wine).

                Is that a deal you can agree to?

    • marcos@lemmy.world
      9 months ago

      There are no models good enough to just ask for something to be done and it gets done.

      We call those “compilers”. There are many of them.