• glimse@lemmy.world
    5 months ago

    I don’t know why this bugs me but it does. It’s like he’s implying Turing was wrong and that he knows better. He reminds me of those “we’ve been thinking about the pyramids wrong!” guys.

    • Lvxferre@mander.xyz
      5 months ago

      Nah. Turing sidestepped that question altogether. In fact, that’s the main point of the Turing test, a.k.a. the imitation game:

      I PROPOSE to consider the question, ‘Can machines think?’ This should begin with definitions of the meaning of the terms ‘machine’ and ‘think’. The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words ‘machine’ and ‘think’ are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, ‘Can machines think?’ is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.

      In other words, what Turing is saying is “who cares if they think? Focus on their behaviour, dammit, do they behave intelligently?” And consciousness is intrinsically tied to thinking, so… yeah.

    • nova_ad_vitum@lemmy.ca
      5 months ago

      The validity of the Turing test at determining whether something is “intelligent”, and what that even means, has been debated since…well…Turing.