• MrSoup@lemmy.zip
    1 day ago

    Ok, but what’s the prompt they used? Did they just let it generate a Dr. House script?

  • mindbleach@sh.itjust.works
    1 day ago

    Humans will always outsmart the chatbot. If the only thing keeping information private is the chatbot recognizing it’s being outsmarted, don’t include private information.

    As for ‘how do I…?’ followed by a crime: if you can tease it out of the chatbot, then the information is already readily available on the internet.

  • Lojcs@piefed.social
    1 day ago

    I don’t understand how AI can understand ‘3nr1cH 4n7hr4X 5p0r3s’. How would that even be tokenized?
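
    To the tokenization question above: subword tokenizers (BPE, WordPiece) never fail on unfamiliar strings — anything not in the vocabulary just falls back to smaller pieces, down to single characters or bytes, and the model can still associate those pieces with the underlying word. A minimal sketch of that greedy longest-match behavior, using a tiny hand-picked toy vocabulary (real vocabularies are learned from data, so the actual splits differ):

    ```python
    # Toy greedy longest-match tokenizer. The vocabulary here is a made-up
    # illustration, not any real model's merge table.
    TOY_VOCAB = {"3", "n", "r", "1", "c", "H", "4", "7", "h", "X",
                 "5", "p", "0", "s", " ", "nr", "hr", "0r"}

    def tokenize(text, vocab=TOY_VOCAB, max_len=4):
        tokens = []
        i = 0
        while i < len(text):
            # Greedily take the longest vocab entry starting at position i.
            for size in range(min(max_len, len(text) - i), 0, -1):
                piece = text[i:i + size]
                if piece in vocab:
                    tokens.append(piece)
                    i += size
                    break
            else:
                # Character-level fallback: unknown characters become
                # single-character tokens, so tokenization never fails.
                tokens.append(text[i])
                i += 1
        return tokens

    # Leetspeak ends up as many small tokens instead of one word token,
    # but the full string is still represented losslessly.
    print(tokenize("3nr1cH 4n7hr4X 5p0r3s"))
    ```

    The point: ‘3nr1cH’ doesn’t need to match a known word; it gets shredded into fragments the model has seen in other contexts, which is exactly why leetspeak sometimes slips past filters that key on whole-word tokens.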