• snooggums@lemmy.world · 70 points · 23 hours ago

    Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.

    If it is necessary to fact check something every single time you use it, what benefit does it give?

    • Feyd@programming.dev · 8 points · 14 hours ago

      That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I confirm it’s lying within two minutes.

    • brsrklf@jlai.lu · 19 points · edited · 22 hours ago

      None. It’s made with the clear intention of substituting itself for actual search results.

      If you don’t fact-check it, it’s dangerous and/or a thinly disguised ad. If you do fact-check it, it brings absolutely nothing that you couldn’t find on your own.

      Well, except hallucinations, of course.

    • artyom@piefed.social · +2/−4 · 19 hours ago

      It hasn’t stopped anyone from using ChatGPT, which has become Google’s biggest competitor since the inception of web search.

      So yes, it’s dumb, but they kind of have to do it at this point. And they need everyone to know it’s available from the site they’re already using, so they push it on everyone.

    • TXL@sopuli.xyz · +1/−6 · 20 hours ago

      It might be able to give you tables or otherwise collated sets of information about multiple products etc.

      I don’t know if Google’s does, but LLMs can. They can also do unit conversions, though you probably still want to check the critical ones. It’s a bit like using an encyclopedia or a catalog, except more convenient and even less reliable.

      • Feyd@programming.dev · 2 points · edited · 14 hours ago

        You can do unit conversions with PowerToys on Windows, Spotlight on Mac, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, with exactly the same convenience as an LLM. Doing discrete tasks like that with LLM inference is the most inefficient and stupid way to do them.

        • TXL@sopuli.xyz · +1/−1 · edited · 3 hours ago

          All things were doable before. The point is that they were manual extra steps.

          • Feyd@programming.dev · 2 points · edited · 51 minutes ago

            They weren’t, though. You put stuff in the search bar, it detected that you were asking about a unit conversion, and it gave you an answer, without ever involving an LLM. Are you being dense on purpose?

      • snooggums@lemmy.world · 10 points · edited · 19 hours ago

        Google had a feature for converting units way before the AI boom, and there are multiple websites that do conversions and calculations with real logic instead of LLM approximation.

        It is more like asking a random person, who will answer whether they know the right answer or not. An encyclopedia or a catalog at least has some time-frame context from when it was published.

        Putting the data into tables and other formats isn’t helpful if the data is wrong!
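        [Editor’s note] The “real logic instead of LLM approximation” point above is worth making concrete: a deterministic unit converter is just a lookup table of exact factors and one arithmetic expression, with no inference involved. A minimal sketch in Python (the table and function names are illustrative, not from any particular tool):

        ```python
        # Deterministic length conversion via a table of exact factors to meters.
        # The mile, foot, and inch factors are exact by international definition.
        FACTORS_TO_METERS = {
            "m": 1.0,
            "km": 1000.0,
            "mi": 1609.344,
            "ft": 0.3048,
            "in": 0.0254,
        }

        def convert_length(value: float, src: str, dst: str) -> float:
            """Convert `value` from `src` units to `dst` units via meters.

            Raises KeyError on unknown units instead of guessing.
            """
            return value * FACTORS_TO_METERS[src] / FACTORS_TO_METERS[dst]

        print(round(convert_length(5, "km", "mi"), 4))  # ~3.1069
        ```

        Unlike an LLM, this either returns the exact answer or fails loudly on an unknown unit, which is the behavior the search-bar converters mentioned above rely on.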