He asks whatever model is running behind either system to do the comparison and pastes the text. It’s full of errors, like Perplexica saying farfalle doesn’t use an LLM. Meanwhile, I just checked and it supports anything from Ollama to Groq (GPT-4o, Sonnet, etc.).
This post is ultra low quality.
It’s easy to make errors about farfalle when there isn’t much about it in the search results… Any model would fail there. That wasn’t the point of the post.
Thanks for checking, but Sonnet and ChatGPT are irrelevant to me because they aren’t self-hosted.