Google rolled out AI Overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.

  • adam_y@lemmy.world
    3 months ago

    Can we swap out the word “hallucinations” for the word “bullshit”?

    I think all AI/LLM stuff should be prefaced with “someone down the pub said…”

    So, “someone down the pub said you can eat rocks” or, “someone down the pub said you should put glue on your pizza”.

    Hallucinations are cool, shit like this is worthless.

    • Eheran@lemmy.world
      3 months ago

      No, hallucination is a really good term. It can be super confident and seemingly correct but still completely made up.

      • richieadler@lemmy.myserv.one
        3 months ago

        It’s a really bad term because it’s usually associated with a mind, and LLMs are nothing of the sort.