• collapse_already@lemmy.ml
    24 days ago

    Seems like getting a decent answer out of an LLM for a novel problem requires a high degree of specificity and thoroughness in your prompt engineering. A rigid set of rules for specifying what you want is required to avoid getting broken slop. Fortunately, we already have great, precise ways of specifying what we want computers to do. We call them programming languages.

    • Ignotum@lemmy.world
      21 days ago

      If I write the code, the computer yells at me.
      If the LLM writes the code, I get to yell at the LLM.
      Both approaches result in code that doesn’t work, so I might as well go for the more cathartic one.