• filister@lemmy.world · 7 months ago

    Just ask ChatGPT what it thinks about some non-existent product and it will start hallucinating.

    This is a known issue with LLMs, and with deep learning (DL) in general, since their reasoning is a black box to scientists.

    • db0@lemmy.dbzer0.com · 7 months ago

      It’s not that their reasoning is a black box. It’s that they do not have reasoning! They just guess what the next word in the sentence is likely to be.
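      To make the “guess the next word” point concrete, here is a minimal sketch (my own illustration, not anything from the thread) using GPT-2 via the Hugging Face transformers library: at every step the model only produces a probability for each token in its vocabulary, and generation picks or samples from those.

      ```python
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      # Load a small causal language model and its tokenizer.
      tokenizer = AutoTokenizer.from_pretrained("gpt2")
      model = AutoModelForCausalLM.from_pretrained("gpt2")

      prompt = "The capital of France is"
      input_ids = tokenizer(prompt, return_tensors="pt").input_ids

      with torch.no_grad():
          logits = model(input_ids).logits  # shape: (1, seq_len, vocab_size)

      # The model's entire output: a probability for every possible next token.
      next_token_probs = torch.softmax(logits[0, -1], dim=-1)

      # Show the five most likely continuations.
      top = torch.topk(next_token_probs, 5)
      for prob, token_id in zip(top.values, top.indices):
          print(f"{tokenizer.decode(token_id)!r}: {prob:.3f}")
      ```

      There is no fact-checking step anywhere in that loop, which is why asking about a non-existent product yields fluent, confident text either way.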