• oce 🐆@jlai.lu · 1 month ago

    I think it’s more about asking it for the steps to create a bomb, or how to disrupt the grid: for example, where to cut the major edges.

      • dual_sport_dork 🐧🗡️@lemmy.world · 1 month ago

        That, and the Internet has been teaching people how to create bombs since the dial-up days. I don’t predict that LLMs will be either a benefit or a detriment to that particular strain of natural selection.

          • oce 🐆@jlai.lu · 1 month ago

            No, but I think it could make the knowledge more easily available, which increases the risk that it may happen.

              • oce 🐆@jlai.lu · 1 month ago

                I think I heard about it before, but instead of having to remember that, I could just ask an uncensored LLM.

                • AwesomeLowlander@sh.itjust.works · 1 month ago

                  The actual point was, bomb-making instructions have been floating around in search engine results since the days of dial-up. That particular manuscript itself has existed since before the Internet. There’s nothing ChatGPT could give you that you couldn’t have found by typing the same query into Google. Getting the instructions is literally the easiest, lowest-effort, lowest-risk part of building a bomb.