This screenshot and similar ones have been circulating with concerns that these chatbots are dangerously sycophantic.

  • otacon239@lemmy.world · 10 days ago

    This right here. If someone can maliciously make an LLM do this, there are plenty of others who will trigger it unknowingly and take the advice at face value.

    At the end of the day it’s a search engine that only knows how to parrot.