Going on a Date With Panpsychism, Tononi, and Edelman

I agreed to go on a date with panpsychism. Not that panpsychism came up to me after class and asked if I wanted to grab a coffee, or slipped me its phone number at a bar, but, you know, I decided to give it a chance. Just to be safe, I made it a double date with Giulio Tononi and Gerald Edelman.

“Panpsychism is the doctrine that mind is a fundamental feature of the world [that] exists throughout the universe,” and all parts of that clause are subject to interpretation.1 That openness to interpretation means there are several different strains of panpsychism, but the essential view is that wherever there is physicality, there is also mentality. In other words, there is experience at all levels of the physical world, even at the level of fundamental physical particles.

To reduce to an almost offensive degree, by experience I mean the process of interacting with or being exposed to information in one’s environment. By this definition, I can get behind a panpsychist theory of experience that goes something like: experience is a fundamental feature of the world that exists throughout the universe. An electron, for example, has experiences in the sense that it is exposed to the surrounding electromagnetic field, the charges of surrounding particles, and their positions and motions. It interacts with this information by moving, changing the orientation of its spin, and so on and so forth. On the other hand, a human being is exposed to its environment via its senses. It interacts with this information in many ways, including decision making and action.

So I have experiences, and electrons have experiences. Okay, fine. But is an electron conscious? Does an electron have conscious experiences? Over brunch yesterday, DC pressed me to explain the distinction between experience and conscious experience. Needless to say, I was hard pressed.

In their paper “Consciousness and Complexity,” Tononi and Edelman propose that conscious experience has two features:

  1. Conscious experience is integrated
  2. Conscious experience is highly differentiated

By integrated they mean that each conscious “scene”, if you will, is unified; in other words, it cannot be decomposed into independent components.2 By differentiated they mean each conscious state is selected from among a vast repertoire of possible states. A physical system should be able to generate conscious experience to the extent that it has a large repertoire of available states (information, via differentiation) yet cannot be decomposed into a collection of causally independent subsystems (integration of information).

Essentially, consciousness (on Tononi and Edelman’s view) has to do with a physical system’s capacity to integrate information. By Shannon’s classical definition, information is the reduction of uncertainty among a number of alternative outcomes when one outcome occurs. Information on this definition can thus be measured by the entropy function: the sum of the logarithms of the probabilities of the alternative outcomes, each weighted by its probability, with a minus sign out front. Which all sounds fancy, but the point is that you can actually measure information.
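For the curious, Shannon entropy in bits looks like this (standard notation, nothing specific to Tononi and Edelman):

$$
H = -\sum_{i=1}^{n} p_i \log_2 p_i
$$

where p_i is the probability of the i-th of n alternative outcomes. A fair coin, with two equally likely outcomes, works out to exactly 1 bit.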

Without getting into the math of it all, take two examples: a coin and a die. When you flip a coin, its turning up heads corresponds to 1 bit of information, because there are only 2 alternatives. When you roll a die, on the other hand, its turning up 3 corresponds to ≈2.58 bits of information, because there are 6 alternatives. You can see that as the number of possible alternatives increases, so does the amount of information corresponding to each outcome. There is more information in rolling a 3 than in a coin coming up heads. When Tononi and Edelman propose that conscious experience must be highly differentiated, this is what they’re talking about. To put it simply: the more possible states a physical system can enter, the more information there is in each particular state. Differentiation is the selection of a particular experience from among a great many possible experiences. We take this for granted, and yet the selection of one particular experience from among millions and billions of other possible experiences constitutes a lot of information.
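If you’d rather check the arithmetic than take my word for it, here’s a quick sketch in Python (mine, not Tononi and Edelman’s):

```python
import math

# Information (in bits) of one outcome among n equally likely alternatives:
# -log2(1/n), which simplifies to log2(n).

coin_bits = math.log2(2)  # heads among 2 alternatives
die_bits = math.log2(6)   # a 3 among 6 alternatives

print(f"coin: {coin_bits:.2f} bits")  # coin: 1.00 bits
print(f"die:  {die_bits:.2f} bits")   # die:  2.58 bits
```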

Imagine you have a photodiode that can differentiate between light and dark and then provide an audible output (1 beep or no beep, for example) and a human that can differentiate between light and dark and also provide an audible output. Both the photodiode and the human are performing the same task—differentiating between, and reporting on, light and dark. The question is: “why should the differentiation between light and dark performed by the human be associated with conscious experience, while presumably that performed by the photodiode is not?” It doesn’t seem fair to the photodiode, and yet we don’t want to say that the photodiode is conscious.3 Differentiation—assuming it is one of the features of conscious experience—gives us a way out. Bad news for photodiodes, good news for anthropocentrists.

To the photodiode, the discrimination between dark and light is the only distinction available, and is therefore minimally informative. Like the coin, the photodiode’s output corresponds to only 1 bit of information. To a human, on the other hand, light and dark are but 2 experiential states among millions and billions of possible experiential states, so each selection carries a correspondingly large amount of information.
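The same arithmetic makes the gap vivid. The billion below is a made-up stand-in, since nobody has counted the brain’s possible experiential states; the point is just how the bits scale with the repertoire:

```python
import math

def bits(n_states: int) -> float:
    # Information in selecting one of n equally likely states.
    return math.log2(n_states)

print(bits(2))      # photodiode (light vs. dark): 1.0 bit
print(bits(10**9))  # hypothetical billion-state repertoire: ~29.9 bits
```

Note the logarithm: even a billion states yields only ~30 bits per selection, but that is still thirty times the photodiode.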

While differentiation sets humans apart from photodiodes, it is not by itself enough to constitute conscious experience. Think about it: if differentiation were the only requisite for conscious experience, we could theoretically connect a bunch of photodiodes together—like on a sensor chip in a digital camera—enough photodiodes such that the whole camera would have as many possible states as a human brain. But we don’t want to say the camera is conscious any more than we wanted to say one photodiode was conscious. Why?

Because information integration is also requisite for conscious experience, not just differentiation. The capacity to integrate information rests on causal interactions between the elements of a physical system: in this case, the causal interactions among brain states, and the causal interactions among the photodiodes in a digital camera, or lack thereof. There are, in fact, no causal interactions between individual photodiodes, and this makes all the difference.

While a chip can easily be considered both as a whole chip and as a collection of individual photodiodes, the brain cannot be considered as both a whole brain and a collection of independent brain states. If the chip were cut in half, or even cut down into its individual photodiodes, we could still conceive of it functioning, because the functioning of one photodiode has no causal effect on the functioning of its neighbor. The same cannot be said of the brain: brain states interact in multitudes, and separating them has disastrous effects. Because brain states causally interact, information can be integrated among them in a way that is impossible on a sensor chip.

The characteristic feature of integrated information is that it cannot be broken down into independent “bits” among which no information can be integrated. A sensor chip, as has already been pointed out, can be broken down in this way. The experiencing brain, as studies of brain-damaged subjects (especially split-brain patients) have shown, cannot, at least not without affecting conscious experience or wiping it out completely. Integration of information, like differentiation, can be measured. To measure information integration, it is essential to know whether a system can or cannot be broken down into independent subsystems. Again, there’s math involved that I won’t get into, but if you’re computationally inclined, go here.
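In lieu of that link, here’s a toy version of the idea. Fair warning: Tononi’s actual measure involves perturbing the system and minimizing over all partitions, and this is not that. It’s just the mutual information between two halves of a made-up two-unit system, which comes out zero when the halves are independent (the sensor chip) and positive when they constrain each other (the brain, hopefully):

```python
import math
from itertools import product

def entropy(p):
    # Shannon entropy (bits) of a distribution given as {outcome: probability}.
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def mutual_information(joint):
    # joint maps (a, b) pairs to probabilities; MI = H(A) + H(B) - H(A,B).
    pa, pb = {}, {}
    for (a, b), q in joint.items():
        pa[a] = pa.get(a, 0.0) + q
        pb[b] = pb.get(b, 0.0) + q
    return entropy(pa) + entropy(pb) - entropy(joint)

# Two units that never interact: independent fair bits (think sensor chip).
independent = {(a, b): 0.25 for a, b in product([0, 1], repeat=2)}

# Two units that always agree: a crude stand-in for causal interaction.
coupled = {(0, 0): 0.5, (1, 1): 0.5}

print(mutual_information(independent))  # 0.0 bits: nothing integrated
print(mutual_information(coupled))      # 1.0 bits: the halves share information
```

The point of the toy: integration is exactly the information you lose if you insist on describing the parts separately.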

Taking all this into account, what is the difference between experience and conscious experience? This is not a distinction that Tononi and Edelman draw, but I do think they’re a good place to start in trying to understand what determines to what extent a system has conscious experience, and, furthermore, what determines the kind of consciousness a system has.

Plus, on their view, there’s no reason we shouldn’t be able to build conscious things. Which is both a terrifying and awesome prospect.

  1. Stanford Encyclopedia of Philosophy: Panpsychism 

  2. This, they claim, is a feature of all conscious experience, irrespective of content. Evidence for this includes, but is not limited to: our inability to perform multiple tasks at once (unless some of those tasks are automatic); our inability to make more than a single conscious decision within a few hundred milliseconds (psychological refractory period); and that we cannot be aware of two incongruent scenes at the same time (binocular rivalry). 

  3. Wants and desires are, of course, suspect. 
