• knokelmaat@beehaw.org
    Absolutely not. It’s a tool that they used successfully, and they can mention it in their methods section or wherever, but it’s not a person. We never credit other scientific software as an author, even though it is often essential to reaching a result or an understanding; think of all those proofs that rely on computer-assisted exhaustion methods and the like. I think people get confused because an LLM interacts in such a human-like way, but it is still a statistical algorithm at heart. As long as we aren’t dealing with genuine general intelligence that has reasoning capabilities and an identity or “soul” of its own, it is absurd to treat it like a person. And a truly conscious, independent AI is still very far off, if it’s even possible at all, in my opinion.