• Crozekiel@lemmy.zip · 22 hours ago

    It’s a chatbot that googles your question before answering, in the hope of cutting down on hallucinations. It doesn’t solve this problem at all.

    • Bazell@lemmy.zip · edited · 21 hours ago

      Your explanation is not completely correct. A more accurate one would be: an AI chatbot that can retrieve information relevant to the user’s input from internal or external sources, allowing the model to answer questions more precisely even if it wasn’t trained on that data at all. This reduces the frequency and severity of hallucinations to some degree but doesn’t eliminate them.
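
      The retrieval-then-answer flow described above can be sketched in a few lines. This is a toy illustration, not any real product’s implementation: the keyword-overlap `retrieve` function stands in for a real search or embedding backend, and the corpus, query, and `build_prompt` helper are all hypothetical.

      ```python
      # Toy RAG (retrieval-augmented generation) sketch. A real system would
      # use a search engine or vector store instead of keyword overlap.

      def retrieve(query, corpus, top_k=2):
          """Rank documents by how many words they share with the query."""
          words = set(query.lower().split())
          scored = sorted(
              corpus,
              key=lambda doc: len(words & set(doc.lower().split())),
              reverse=True,
          )
          return scored[:top_k]

      def build_prompt(query, corpus):
          """Prepend retrieved context so the model can ground its answer
          in data it may never have been trained on."""
          context = "\n".join(retrieve(query, corpus))
          return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

      corpus = [
          "The Eiffel Tower is 330 metres tall.",
          "Python was created by Guido van Rossum.",
          "Lemmy is a federated link aggregator.",
      ]
      print(build_prompt("How tall is the Eiffel Tower?", corpus))
      ```

      The model still generates the final answer, so it can still hallucinate; the retrieved context only makes the correct answer more likely.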