I was just reading about Searle's reply to the Systems Reply to his notorious Chinese Room thought experiment, and thought I'd pass along what I think is a knock-down refutation. First, though, a thumbnail sketch of the Chinese Room dialectic.
The Chinese Room ("CR"). A native English-speaking man who (as it happens) knows no Chinese sits in a room. A printer prints out strings of Chinese characters. When he receives the printouts, the man consults a rule book (in English) that tells him how to correlate the incoming strings with other Chinese characters in his inventory, stringing these together in a rule-based way to produce appropriate "responses" to the input. Like a computer running the appropriate program, then, the man can pass the Turing Test without understanding a word of Chinese. Thus, passing the Turing Test is no guarantee of understanding.
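(To make the setup concrete, here's a minimal sketch, in Python, of the rule book treated as a bare lookup from input strings to output strings. The particular strings and replies are invented for illustration, and Searle's rule book would of course be vastly more elaborate, but the moral is the same: the procedure manipulates symbols by their shape alone, with no semantics anywhere in the loop.)

```python
# A toy "rule book": a lookup from input strings to output strings.
# The entries are invented for illustration; the man (our "CPU") never
# interprets the symbols, he only matches shapes and copies out replies.

RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "当然会。",      # "Do you speak Chinese?" -> "Of course."
}

def chinese_room(printout: str) -> str:
    """Return the rule-book response; no translation step ever occurs."""
    return RULE_BOOK.get(printout, "请再说一遍。")  # fallback: "Please say that again."

print(chinese_room("你好吗？"))  # 我很好，谢谢。
```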
The Systems Reply ("SR"). While it's true the man doesn't understand the sentences, he is nonetheless a part (the "CPU," as it were) of a system--memory, instructions, operations--that does.
Searle's reply to SR (SRSR). But the man could (in principle) internalize the system (i.e., memorize the rule book and do the operations in his head), go out in a field somewhere and still pass the Turing Test without understanding a word of Chinese.
Now the refutation. Despite Searle's treating understanding as an all-or-nothing matter, it's pretty clear that Searle's hypothetical man in the field is demonstrating some kind of understanding of written Chinese. Granted, it's not the standard sort of understanding a competent speaker of Chinese would have. But then that's mostly[1] because Searle's man in the field hasn't been given the appropriate rule book--the book that provides instructions about what to do in the world when confronted with strings of Chinese symbols having such-and-such syntactic properties. A man who could perform the corresponding, rule-based tasks would "understand" written Chinese in the relevant way.[2] But in principle (as Searle would agree), nothing prevents our programming a robot to perform these corresponding, rule-based tasks as well. And if a man performing these tasks understands the Chinese, then so would the similarly competent robot.
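(By way of contrast, here's a minimal sketch, again in Python and again with invented strings, of what the "appropriate" rule book might look like: the very same syntactic lookup, except that its outputs are actions in the world rather than reply strings--echoing the kite example in note 2 below.)

```python
# A toy "appropriate" rule book: the same shape-matching lookup as before,
# but dispatching to actions in the world instead of reply strings.
# The Chinese string and the action are invented for illustration.

def fly_kite() -> str:
    """Stand-in for actually flying the kite and returning for payment."""
    return "flies the kite for a time, returns with an outstretched hand"

ACTION_BOOK = {
    # "Go fly this kite and I will give you 1000 yuan."
    "去放这只风筝，我给你一千元。": fly_kite,
}

def robot(printout: str) -> str:
    """Dispatch on the string's syntactic shape; still no translation step."""
    action = ACTION_BOOK.get(printout)
    return action() if action else "stands idle"

print(robot("去放这只风筝，我给你一千元。"))
```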
NOTES
1. It's also because Searle analogizes at the wrong level of description. (See John Haugeland's remark in the Stanford Encyclopedia of Philosophy's Chinese Room article.)
2. For example, a man in a field has memorized my book of instructions. Upon reading the string of Chinese symbols that translates as "Go fly this kite and I will give you 1000 yuan," he takes the kite, flies it for a time, then returns with an outstretched hand.
The intuitive assessment here would seem to be that the man "understands" the Chinese offer. And yet to bite the bullet Searle would have to say, "Well, he still doesn't understand the Chinese sentence." To which our response might well be: Go fly a kite!