• taladar@sh.itjust.works · 1 year ago

    A system that has no idea whether what it says is true or false, or what "true" or "false" even mean, is not very consistent in answering things truthfully?

    • tracyspcy@lemmy.ml (OP) · 1 year ago

      Wait for the next version, which will be trained on data that includes GPT-generated word salad.

    • intensely_human@lemm.ee · 1 year ago

      No, that is not the thesis of this story. If I’m reading the headline correctly, the rate at which it is correct has changed from one stable distribution to another.