
this proposal is tracking something, but it isn’t easy to say precisely what. We could say that it is easier for the equilibrium of my body to be disturbed by the right (or, rather, wrong) sort of interaction between my liver and heart than it is for that same equilibrium to be disturbed by the right kind of interaction between me and a stranger on the subway, but this still isn’t quite correct. It might be true that the processes that go on between my organs are more fragile—in the sense of being more easily perturbed out of a state where they’re functioning normally—than the processes that go on between me and the strangers standing around me on the subway as I write this, but without a precise account of the source and nature of this fragility, we haven’t moved too far beyond the intuitive first-pass account of complexity offered at the outset of Section 2.1. Just as with mereological size, there seems to be a nugget of truth embedded in the hierarchical account of complexity, but it will take some work to extract it from the surrounding difficulties.

2.1.3 Complexity as Shannon Entropy

Here’s a still more serious proposal. Given the discussion in Chapter One, there’s another approach that might occur to us: perhaps complexity is a measure of information content or degree of surprise in a system. We can recruit some of the machinery from the last chapter to help make this notion precise. We can think of “information content” as being a fact about how much structure (or lack thereof) exists in a particular system—how much of a pattern there is to be found in the way a system is put together. More formally, we might think of complexity as being a fact about the Shannon entropy[1] in a system. Let’s take a moment to remind ourselves
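As a quick reminder of the standard definition (stated here in its usual textbook form, rather than as anything peculiar to the present discussion): for a discrete random variable $X$ that can take values $x_i$ with probabilities $p(x_i)$, the Shannon entropy is

$$H(X) = -\sum_{i} p(x_i) \log_2 p(x_i),$$

measured in bits. The more evenly the probability is spread across the possible outcomes, the higher the entropy, and so the greater the average "surprise" carried by an observation of the system; a highly structured, predictable system correspondingly has low entropy.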


  1. See Shannon (1948) and Shannon & Weaver (1949).
