When.com Web Search

Search results

  1. Chinese room - Wikipedia

    en.wikipedia.org/wiki/Chinese_room

    The Chinese room is designed to show that the Turing test is insufficient to detect the presence of consciousness, even if the room can behave or function as a conscious mind would.

  2. Intuition pump - Wikipedia

    en.wikipedia.org/wiki/Intuition_pump

    The term was coined by Daniel Dennett. [2] In Consciousness Explained, he uses the term to describe John Searle's Chinese room thought experiment, characterizing it as designed to elicit intuitive but incorrect answers by formulating the description in such a way that important implications of the experiment would be difficult to imagine and tend to be ignored.

  3. Symbol grounding problem - Wikipedia

    en.wikipedia.org/wiki/Symbol_Grounding_Problem

    As Harnad describes, the symbol grounding problem is exemplified in John R. Searle's Chinese Room argument; [3] the definition of "formal" symbols relative to a formal symbol system may be interpreted from Searle's 1980 article "Minds, brains, and programs", in which the Chinese Room argument is described in ...

  4. Computational theory of mind - Wikipedia

    en.wikipedia.org/wiki/Computational_theory_of_mind

    Putnam himself (see in particular Representation and Reality and the first part of Renewing Philosophy) became a prominent critic of computationalism for a variety of reasons, including ones related to Searle's Chinese room arguments, questions of world-word reference relations, and thoughts about the mind-body problem.

  5. Philosophical zombie - Wikipedia

    en.wikipedia.org/wiki/Philosophical_zombie

    John Searle's Chinese room argument deals with the nature of artificial intelligence: it imagines a room in which a conversation is held by means of written Chinese characters that the subject cannot actually read, but is able to manipulate meaningfully using a set of algorithms. Searle holds that a program cannot give a computer a "mind" or ...

  6. Turing test - Wikipedia

    en.wikipedia.org/wiki/Turing_test

    His Chinese room argument is intended to show that, even if the Turing test is a good operational definition of intelligence, it may not indicate that the machine has a mind, consciousness, or intentionality. (Intentionality is a philosophical term for the power of thoughts to be "about" something.)

  7. China brain - Wikipedia

    en.wikipedia.org/wiki/China_brain

    The Chinese room scenario, analyzed by John Searle, [8] is a similar thought experiment in philosophy of mind that relates to artificial intelligence. Instead of people who each model a single neuron of the brain, in the Chinese room, clerks who do not speak Chinese accept notes in Chinese and return an answer in Chinese according to a set of ... (A minimal sketch of this kind of rule-following appears after the list.)

  8. Outline of artificial intelligence - Wikipedia

    en.wikipedia.org/wiki/Outline_of_artificial...

    Chinese room § Strong AI. A machine that has mind, consciousness and understanding. (Also, the philosophical position that any digital computer can have a mind by running the right program.) Technological singularity. The short period of time when an exponentially self-improving computer is able to increase its capabilities to a ...
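
The Chinese room snippets above describe a purely rule-driven exchange: notes written in Chinese go in and scripted replies come out, with nothing in the procedure that involves understanding the language. The short Python sketch below is only an illustration of that idea; the rule table and phrases are invented assumptions for the example, not code or wording from any of the cited articles.

    # Rule table mapping an incoming note (a string of Chinese characters) to a
    # scripted reply. The entries are invented purely for illustration.
    RULES = {
        "你好吗？": "我很好，谢谢。",   # "How are you?" -> "I'm fine, thanks."
        "你会说中文吗？": "会。",       # "Do you speak Chinese?" -> "Yes."
    }

    def chinese_room(note: str) -> str:
        """Return the scripted reply for a note; no meaning is represented anywhere."""
        return RULES.get(note, "请再说一遍。")  # Fallback: "Please say that again."

    print(chinese_room("你好吗？"))  # prints 我很好，谢谢。

Everything the function "knows" sits in the lookup table, which mirrors the sense in which the room's clerks follow rules without being able to read Chinese.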