Measuring the IQ of Mind and Machine:
an Examination of Functionalism as Represented by Fodor and Searle
Shanaree Sailor

Fodor begins his article on the mind-body problem with a review of the prevailing theories, dualism and materialism. According to dualism, the mind and body are two separate entities: the body is physical and the mind is nonphysical. If this is the case, though, then there can be no interaction between the two; the mind could not influence anything physical without violating the laws of physics. The materialist theory, on the other hand, holds that the mind is not distinct from the physical. In fact, supporters of the materialist theory believe that behavior does not have mental causes. When materialism is split into logical behaviorism and the central-state identity theory, the foundation of functionalism begins to form. Logical behaviorism states that every ascription of a mental state is equivalent in meaning to an if-then statement about behavior. For example, instead of saying "Dr. Lux is hungry," one would say "If there were a quart of macadamia nut brittle in the freezer, Dr. Lux would eat it." The central-state identity theory states that each mental state is identical to a particular neurophysiological state. The theory works in a way similar to Berkeley’s account of objects: both a mental state and an object are a certain collection of perceptions that together identify the particular state or object.
Fodor develops the idea of functionalism by combining certain parts of logical behaviorism and the central-state identity theory. From logical behaviorism, Fodor incorporates the idea that mental processes can be represented by behavioral if-then statements. As such, behavior and mental causation are no longer distinct and unable to interact. Logical behaviorism also provides a way for mental causes to interact with other mental causes, which may in turn produce a behavioral effect. This last point is also a feature of the central-state identity theory. One doctrine of the central-state identity theory is called "token physicalism," the claim that every mental state that actually exists is some neurophysiological state or other. Because it claims only this much, token physicalism places no restriction on the type of substance capable of having mental properties. When these points of logical behaviorism and the central-state identity theory are combined, functionalism is the result. The theory of functionalism supposes that a mental state depends upon how a system is put together rather than upon the material of which the system is composed. Functionalism also states that the output of the system depends on both the input and the internal state of the system at a given time.
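The functionalist claim that output depends jointly on input and internal state can be made concrete with a toy state machine. This is a hypothetical sketch, not an example Fodor gives; the states and stimuli are invented for illustration.

```python
# A minimal sketch (not Fodor's own example) of the functionalist claim
# that a system's output depends on both the current input and the
# system's internal state at that moment.

class ToyMind:
    def __init__(self):
        self.state = "calm"  # the internal state of the system

    def step(self, stimulus):
        """Return a response determined by stimulus AND current state."""
        if self.state == "calm" and stimulus == "insult":
            self.state = "annoyed"
            return "frown"
        if self.state == "annoyed" and stimulus == "insult":
            self.state = "angry"
            return "shout"
        self.state = "calm"
        return "smile"

mind = ToyMind()
print(mind.step("insult"))  # frown  (same input as below...)
print(mind.step("insult"))  # shout  (...different output, because the state changed)
```

The same stimulus produces different behavior on the two calls, which is exactly the point: the response is a function of the input together with the system's internal condition, not of the input alone.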
On this definition of functionalism, the mental processes of a human are not distinct in kind from the systemic processes of a machine. Mental processes are defined as operations on symbols that yield certain results. Thus, if the same symbols yield the same results in two separate systems, the mental states can be regarded as similar, or even identical. In this vein, consider a computer programmed with the same reasoning process as a mind. When the input "B" is entered, the output depends both upon "B" and upon the state of the system resulting from the earlier computation of "A." If the computer were programmed with exactly the same reasoning process as a mind, the result would be the same. Thus, the mental state of the mind would be indistinguishable from the systemic state of the computer. The computer metaphor upholds the theory of functionalism because the output results from the interaction between the input and the current state of the system. The metaphor also demonstrates the irrelevance of the physical makeup of the system when determining whether two mental states are alike: it is the processes, rather than the composition, of the system that determine the mental state.
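The idea that composition is irrelevant so long as the functional organization is the same (what philosophers call multiple realizability) can be sketched as two differently built systems with identical input-output behavior. This is a hypothetical illustration; a running total stands in for a reasoning process whose response to input "B" depends on the earlier input "A."

```python
# Two systems "composed" differently yet functionally identical:
# each later output depends on the earlier inputs, and for any
# input sequence the two produce the same outputs. On a
# functionalist view, their states are therefore alike despite
# their different internal makeup. (Hypothetical sketch only.)

class ListMemory:
    """Stores every input and recomputes the total each time."""
    def __init__(self):
        self.inputs = []
    def feed(self, x):
        self.inputs.append(x)
        return sum(self.inputs)

class CounterMemory:
    """Keeps only a single running total."""
    def __init__(self):
        self.total = 0
    def feed(self, x):
        self.total += x
        return self.total

a, b = ListMemory(), CounterMemory()
for x in [2, 3, 5]:
    assert a.feed(x) == b.feed(x)  # behaviorally indistinguishable
print("functionally identical")
```

No sequence of inputs can distinguish the two systems from the outside, even though one stores a list and the other a single number; by the functionalist criterion in the paragraph above, they are in the same state.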
Searle disagrees with the view that the physical composition of the system has no bearing on the mental state of the system. To support this claim, he develops the Chinese room argument. Suppose a computer program is written that simulates an understanding of Chinese: when the computer is presented with a question in Chinese, it searches its memory and answers appropriately in Chinese. If the program is written well enough, its answers may be indistinguishable from a native speaker’s answers. According to