Sibbo@sopuli.xyz to Programmer Humor@lemmy.ml · 1 year ago
GitHub Copilot will respect your privacy! (image, sopuli.xyz)
pearsaltchocolatebar@discuss.online · 1 year ago
You can absolutely add constraints to control for hallucinations. Copilot apparently doesn’t have enough, though.

DarkAssassin07@lemmy.ca · 1 year ago
Lmao. That’s even better when you consider the Copilot button replaced the ‘show desktop’ (i.e. ‘minimize all my windows’) button.

shootwhatsmyname@lemm.ee · 1 year ago
My guess is that Copilot was using a ton of other lines as context, so in that specific case his name was a more likely match for the next characters.

jherazob@beehaw.org · 1 year ago
No matter how many constraints you add, it’s never enough; that’s the weakness of a model that only knows language and nothing else.