I Am a Strange Loop - Douglas R. Hofstadter [202]
To ask this question is to make the tacit assumption that there could be brains of any desired level of complexity that are not conscious. It is to buy into the distinction between Machines Q and Z sitting side by side on the old oaken table in Room 641, carrying out identical operations but one of them doing so with feeling and the other doing so without feeling. It assumes that consciousness is some kind of orderable “extra feature” that some models, even the fanciest ones, might or might not have, much as a fancy car can be ordered with or without a DVD player or a power moonroof.
But consciousness is not a power moonroof (you can quote me on that). Consciousness is not an optional feature that one can order independently of how the brain is built. You cannot order a car with a two-cylinder motor and then tell the dealer, “Also, please throw in Racecar Power® for me.” (To be sure, nothing will keep you from placing such an order, but don’t hold your breath for it to arrive.) Nor does it make sense to order a car with a hot sixteen-cylinder motor and then to ask, “Excuse me, but how much more would I have to throw in if I also want to get Racecar Power®?”
Like my fatuous notion of optional “Racecar Power®”, which in reality is nothing but the upper end of a continuous spectrum of horsepower levels that engines automatically possess as a result of their design, consciousness is nothing but the upper end of a spectrum of self-perception levels that brains automatically possess as a result of their design. Fancy 100-huneker-and-higher racecar brains like yours and mine have a lot of self-perception and hence a lot of consciousness, while very primitive wind-up rubber-band brains like those of mosquitoes have essentially none of it, and lastly, middle-level brains, with just a handful of hunekers (like that of a two-year-old, or a pet cat or dog), come with a modicum of it.
Consciousness is not an add-on option when one has a 100-huneker brain; it is an inevitable emergent consequence of the fact that the system has a sufficiently sophisticated repertoire of categories. Like Gödel’s strange loop, which arises automatically in any sufficiently powerful formal system of number theory, the strange loop of selfhood will automatically arise in any sufficiently sophisticated repertoire of categories, and once you’ve got self, you’ve got consciousness. Élan mental is not needed.
Liphosophy
Philosophers who believe that consciousness comes from something over and above physical law are dualists. They believe we inhabit a world like that of magical realism, in which there are two types of entities: magical entities, which possess élan mental, and ordinary entities, which lack it. More specifically, a magical entity has a nonphysical soul, which is to say, it is imbued with exactly one “dollop of consciousness” (a dollop being the standard unit of élan mental), while ordinary entities have no such dollop. (Dave Chalmers believes in two types of universe rather than two types of entity in a single universe, but to me it’s a similar dichotomy, since we can consider various universes to be entities inside a greater “meta-verse”.) Now I should like to be very sure, dear reader, that you and I are on the same page about this dichotomy between magical and ordinary entities, so to make it maximally clear, I shall now parody it, albeit ever so gently.
Imagine a philosophical school called “liphosophy” whose disciples, known as “liphosophers”, believe in an elusive — in fact, undetectable — and yet terribly important nonphysical quality called Leafpilishness (always with a capital “L”) and who also believe that