2015-12-16
John Searle devised the Chinese Room thought experiment to refute the Strong AI position, which holds that any system that behaves the way conscious beings behave must itself be conscious. (The Weak AI position is merely that it is possible to build such a system.) Inside the Chinese Room there is a person who doesn't understand Chinese, and a very large book containing detailed instructions on how to reply to messages in Chinese received from outside the room. Someone outside the room can communicate in Chinese with the person inside, and will come to believe that the person inside also understands Chinese.

It took me a long time to understand Searle's argument, because he conflates consciousness and intelligence, and this confuses matters. It is important to distinguish between understanding Chinese and knowing what it's like to understand Chinese. Understanding Chinese is a difficult problem requiring intelligence, but I don't think it requires consciousness. I'm satisfied that it's possible, at least in principle, to build a computer system capable of understanding (and conversing in) Chinese. Knowing what it's like to understand Chinese, on the other hand, involves various qualia, shared by speakers of other languages.

So I'll simplify Searle's argument. Instead of a room with a book containing rules for conversing in Chinese, and a person inside with no understanding of Chinese, we have a room with coloured filters (i.e. transparent coloured plastic sheets) labelled "red", "green", and "blue", and a person who can't see any colours at all (i.e. who has achromatopsia). Such people (e.g. Knut Nordby) will readily confirm that they have no idea what it's like to see colours, even though they have learned the colours of various objects and understand the meaning of phrases like "seeing red" and "feeling blue". If you show them a banana, they'll tell you it's yellow (even if you painted it blue).
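The colour-identification procedure the room relies on can be sketched as a toy model: the observer perceives only luminance, but by comparing how light the paper appears through each filter, he can name its dominant colour without ever experiencing it. The RGB reflectance values and the idealized filter behaviour below are assumptions for illustration, not part of the thought experiment itself.

```python
# Toy model of the colour room. The person sees only brightness
# (greyscale); each ideal filter passes only its own colour channel.

PAPERS = {                      # idealized RGB reflectances (0.0-1.0)
    "red":   (0.9, 0.1, 0.1),
    "green": (0.1, 0.9, 0.1),
    "blue":  (0.1, 0.1, 0.9),
}

def brightness_through(paper_rgb, filter_name):
    """How light the paper looks through an ideal filter:
    a 'red' filter passes only the red component, and so on."""
    channel = {"red": 0, "green": 1, "blue": 2}[filter_name]
    return paper_rgb[channel]

def identify_colour(paper_rgb):
    """Try each filter in turn; the filter under which the paper
    looks lightest names the paper's dominant colour."""
    return max(("red", "green", "blue"),
               key=lambda f: brightness_through(paper_rgb, f))

for name, rgb in PAPERS.items():
    assert identify_colour(rgb) == name
```

The point survives the formalization: nothing in this procedure requires, or produces, any experience of colour.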
If you shove a sheet of coloured paper under the door, the person in the room will place the filters on top of the sheet one after another and, by seeing how dark the paper looks through each, determine its colour, which he'll write on the paper before passing it back to the person outside. The person outside thinks the person inside can distinguish colours, but the person inside will confirm that not only can he not, he doesn't even know what that would be like.

The only items in the room are the person, a single light bulb hanging from the ceiling, and the coloured filters. So what experiences the colours? We have the person's word that he sees only in monochrome. Are the filters conscious, then? They are just inanimate pieces of plastic.

© Copyright Donald Fisk 2016