ChatGPT provides false information about people, and OpenAI can’t correct it (noyb.eu)
Albin Jose@lemm.ee to Technology@lemmy.world · English · 7 months ago
filister@lemmy.world · 7 months ago
Just ask ChatGPT what it thinks of some non-existent product and it will start hallucinating. This is a known issue with LLMs, and with deep learning in general, since their reasoning is a black box for scientists.
db0@lemmy.dbzer0.com · 7 months ago
It’s not that their reasoning is a black box. It’s that they do not have reasoning! They just guess what the next word in the sentence is likely to be.
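For illustration, here is a minimal sketch of what "guessing the next word" means in practice. It assumes PyTorch, the Hugging Face transformers library, and the public gpt2 checkpoint, none of which are mentioned in the thread; the point is only that a language model outputs a probability for each candidate next token, nothing more.

```python
# Sketch: a language model just assigns probabilities to possible next tokens.
# Assumes the `transformers` library and the public "gpt2" checkpoint.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# Probabilities over the whole vocabulary for the *next* token only.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id)!r}: p={prob.item():.3f}")
```

Generation is just this step repeated: pick a token from that distribution, append it to the prompt, and score the next one. There is no separate fact-checking or reasoning stage, which is why confidently wrong output (hallucination) falls out so naturally.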