• c10l@lemmy.world · 17 points · 2 months ago

    I understand the benefits of running things locally, but why not just use Google’s or OpenAI’s LLM?

    I understand the benefits of cutting down sugar, but why not just binge on cake and ice cream?

    Sounds like you don’t understand the benefits of running things — specifically LLMs and other kinds of AI models — locally.

      • b34k@lemmy.world · 15 points · 2 months ago

        If you’re doing it locally, more sensitive queries become OK, because that data never leaves your computer.

        • c10l@lemmy.world · 4 points · 2 months ago

          Even when you’re not sending data that you consider sensitive, it’s helping train their models (and you’re paying for it!).

          Also what’s not sensitive to one person might be extremely sensitive to another.

          Also, something you run locally can, by definition, be used with no Internet connection (like writing code on a plane or in a train tunnel).

          For me as a consultant, it means I can generally use an assistant without worrying about the LLM provider’s privacy policy, or about client policies on AI and third parties in general.

          For me as an individual, it means I can query the model without worrying that every word I send will be used to build a profile of who I am that can later be exploited by ad companies and other adversaries.