So Jim Crutchfield is our guest
for this unit. He’s Professor of Physics at
the University of California at Davis and Director of the Complexity Science
Center. He’s also an external professor at the Santa Fe Institute.
Jim is one of the pioneers in the field of chaos theory and has worked
for many years on a variety of topics in complexity research
especially in respect to information processing in complex systems.
So welcome, Jim. Hi Melanie, how are you doing? I’m good. I’ll just ask you
what do you think is the role of the concept of information in understanding
complex systems? Well, the short and simple answer is that it's
absolutely a key concept. One of the
important roles that it plays is in some sense it’s a stand-in
for the quantities that we’re interested in.
The contrast, or parallel, I would draw is that in
physics the dominant object or
concept or entity we’re interested in is energy.
And certainly there are many successful applications of
more or less traditional physics, say, the physics of phase
transitions to complex systems. But many of these complex systems
if we’re thinking of social networks or human-designed systems like
the internet, don’t necessarily have an
appropriate notion of energy, so information
in many ways stands in for those quantities.
Various kinds of information processing and storage
can be associated with how a system is organized.
So it’s a key concept. Certainly Shannon’s original
notion of information as degree of surprise --
degree of unpredictability in a system, how random a system is -- needs to be augmented.
That’s certainly a focus of a lot of my work:
trying to delineate that there are many different kinds of information, not just Shannonian information,
which, in the context of communication
theory, is a degree of surprise.
So let’s talk about a particular example. We’ve talked for instance about ant
colonies. So how do you think information
would fit in there and what are the different kinds of information?
Well, the basic approach that we use is to start
with Shannon’s view of any kind of process or
natural system or designed system as a communication channel. Now that concept can be
applied many, many different ways. So at the most general
level, we can think of any temporal process as a communication
channel that communicates the past to the future.
And we can apply that communication picture to
an ant colony. And there are many different levels at which it can be applied, so
there is a notion of the organization of the nest
and what kinds of social or even architectural information
is built into the social organization
or into the nest structure itself. All of those kinds of organization
express a certain summary of the ants’ past
behavior that is important for them to survive and therefore live on
into the future. We could also zoom in a little bit and ask
what is it that is being communicated
and how is it that the colony organizes around certain tasks?
So that would be a more individual level.
Maybe a food source shows up some distance away from the nest. How does that
get communicated? How do the different populations of worker
and forager ants shift over time in response to available resources?
And we can talk about that also in information terms, how much the
informational structure of the colony changes in response to this new outside information
coming in, how much memory there is, and so on.
So is information a real thing
in the sense that mass and energy are real things?
Is it the same kind of physical quantity?
We’re still working on this.
Basically yes. It’s not unhelpful to think back
three or four hundred years to the first basic discussions of what energy
was in the foundations of physics.
In the seventeenth and eighteenth centuries there was a lot of discussion about whether energy --
kinetic energy -- depended upon the speed of an object or the
square of the speed. And back then we can see that
as a confusion between momentum and kinetic energy. And
I think it’s -- in a sense we’re in that same period
trying to understand first of all that there’s not a unitary notion of information.
There are different kinds of information that have different semantic
contexts in different settings.
I think the proof is in the pudding. Is it useful?
Yes, there are many applications of this. We’ve been able to show that
information storage and processing are relevant for describing how emergent properties appear,
and to be quantitative about that, in pattern-forming systems or nonlinear dynamical
systems. So there are many arenas in mathematical physics
in nonlinear dynamics where the concept is extremely useful
and hopefully the range of applications will grow, and as that happens,
our notion of information and the kinds of information will be enriched.
So we’ve talked about defining complexity
in this course, and how that’s a difficult thing to do.
And people have different definitions, so how does information fit into your particular
definition of complexity?
Being somewhat simple-minded, they are essentially synonymous in my view.
But not Shannon information. Well, okay, so --
Right. There is the mathematical
definition of Shannon information, which,
to say it most simply,
tells one how much information there is in the
occurrence of probabilistic events. Mathematically it’s really just how
flat the probability distribution is, how uniform it is over
the events. That same mathematical structure gets used again
and again, but the distributions that we’re describing
change depending upon the context of application.
For example, you can talk about the Shannon information
in the causal architecture of a system, and that measures
the amount of stored memory, not how difficult that system is to predict.
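[The "flatness" point can be sketched in a few lines of Python. This is an illustrative example, not part of the conversation; the function name `shannon_entropy` is ours. Shannon entropy is largest when the probability distribution over events is uniform and shrinks as the distribution becomes peaked.]

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over events with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A flat (uniform) distribution over four events: maximum surprise.
uniform = [0.25, 0.25, 0.25, 0.25]
# A peaked distribution: one event dominates, so little surprise.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(peaked))   # ≈ 0.24 bits
```

The same function applied to different distributions (sequences of events, causal states, and so on) yields the different "kinds" of information discussed here.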
By causal architecture, you mean sort of what causes what in the system?
Right. How many active degrees of freedom. If I look at a turbulent fluid
or if my car’s engine is idling roughly, how many active
degrees of freedom are there? How much of
the instantaneous state of a system is
storing past information? What are the loci of information storage?
We still use the same mathematical form --
Shannon’s information
function -- but it’s applied to a different distribution, and therefore the
meaning of that kind of information differs
from his original notion of how much
information an information source produces per unit of time.
Okay, so let me ask -- I know you
had a lot of influence on the field of
chaos theory and dynamics early on. How did that lead you to your
current interest in information and information processing? Well, in the
history of nonlinear physics and nonlinear dynamics, one of the most
important early steps came when the Russian mathematician
Andrey Kolmogorov and his student Yakov Sinai borrowed
the notion of information that Shannon had introduced in the late 1940s
and applied it to nonlinear dynamical systems.
What they were interested in is this: if you have two different dynamical systems,
they knew intuitively that the systems were chaotic to
different degrees -- more or less unpredictable -- but they weren't
able to be quantitative until they borrowed Shannon’s notion of information,
taking his concept of source entropy rate
and defining what’s now called the Kolmogorov-Sinai entropy. So if I have a nonlinear chaotic system --
a set of deterministic differential equations -- I can now measure
its information production rate, and I can say that one system is more chaotic
and more difficult to predict than the other. So the direct historical answer
is that in studying nonlinear dynamical systems, in particular those that are chaotic,
information has a long, half-a-century-old
history in the very basic way we understand
production of information in natural systems. Okay.
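[For a one-dimensional chaotic map, the Kolmogorov-Sinai entropy equals the positive Lyapunov exponent, so the information production rate can be estimated numerically. A minimal sketch, not from the conversation, using the logistic map at its fully chaotic parameter; the function name `lyapunov_logistic` and the numerical settings are ours.]

```python
import math

def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    by averaging log |f'(x)| = log |r*(1 - 2x)| along a trajectory."""
    x = x0
    for _ in range(n_transient):  # discard transient behavior
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n_iter

# At r = 4 the logistic map is fully chaotic and the exact exponent is ln 2:
# the map produces about one bit of information per iteration.
print(lyapunov_logistic(4.0) / math.log(2))  # ≈ 1.0 bit per iteration
```

A positive exponent is the quantitative statement that the system is chaotic, and its size says how fast the system produces information, i.e. how quickly predictions degrade.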
What are you working on now? What’s the most exciting thing that
has got your attention? I contrasted
our current sense of information with the earlier period of trying
to understand what energy was, so I’d say the most
engrossing thing right now is to ask whether there is a fundamental relationship
between the different kinds of
energy in a natural system behaving over time and that system’s
information production and information storage. So the question here
is: are there fundamental limits on
the amount of information processing you can extract
from a natural system, or a designed system like a computer, and how much
energy dissipation is required? So it’s a new field now called information
thermodynamics. We’re actually trying to understand the direct relationship between energy and
information. Interesting. So Liz Bradley -- we talked to her
in the last unit. She talked about looking at
computers as dynamical systems and measuring them in those terms.
Is that related to the kind of stuff you’re looking at? Yes, well Liz and I are actually talking about
taking some of our measures of information storage and processing and
applying them to simple kinds of logic circuits
and their physical implementations, to see if there’s some relationship between the degree
of information processing and energy dissipation. The basic ideas go back
to Rolf Landauer at IBM Research. Rolf just passed
away, and he had this notion now enshrined in
Landauer’s principle, which says that for every bit of logically irreversible manipulation
that a system does -- erasing a bit, say -- it has to dissipate a minimum amount of
energy, so the dissipation is proportional to the number of bits,
the number of choices the system has to make in
its logical operations. He claimed that that’s a fundamental limit, so people
are now testing this idea. One arena that
is being revisited is the notion of
Maxwell’s demon. Maxwell introduced his clever little demon to sort fast and
slow molecules into different sides of a box, thereby creating a temperature
difference and allowing for work to be extracted. And so
there’s a question of how intelligent the demon has to be to extract a given amount of work.
So revisiting that, and that’s sort of one prototype system
that lets us talk about intelligence or information processing on the one hand and
energy dissipation and work extraction on the other.
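[Landauer's bound itself is easy to compute. A minimal sketch, not from the conversation, with the function name `landauer_bound` ours: erasing one bit at temperature T costs at least k_B * T * ln 2 joules.]

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound(bits, temperature_kelvin=300.0):
    """Minimum energy (joules) dissipated to erase `bits` bits at the
    given temperature, per Landauer's principle: bits * k_B * T * ln(2)."""
    return bits * K_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature: about 2.9e-21 joules.
print(landauer_bound(1))
```

The bound is tiny compared to what present-day electronics dissipates per logic operation, which is why it is a fundamental limit rather than a practical engineering constraint today.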
Okay, very interesting. One last question.
A lot of students have asked about
how it is that somebody gets into complex systems research
and there’s no way you can major in complex systems in most
universities, so what would you recommend for students who are really interested in
getting into this field? Well, I guess they should take your online course.
And also I’m putting up my own course now, and I
suspect, given the conversations about
exactly this topic of massive open online courses,
it might become easier and easier to
do that. It is a little tough.
There are certain basic areas I think one should
study, and I have my own favorite list -- statistical
physics, information theory, the theory of computation, nonlinear dynamics.
There’s a kind of -- I gave a list
like that, except mine included
evolution and learning. Right. I see those as applications of the basics.
But you’re right -- for other kinds of systems, certainly,
the questions about information storage and processing, even energy dissipation,
apply to ecological and learning systems, adaptive systems,
and evolutionary systems too.
So those would also constitute some
basic complex systems itinerary. Hopefully
at some point in the future a particular university will step forward and allow something like a graduate program
in complex systems, although, as you’ve pointed out, I don’t think that’s been realized yet,
and as a result you have to be kind of adventurous.
There’s no shortage of popular and semi-popular research
monograph books out there, so maybe your course will provide a list
of resources like that, but still, we have to cobble these
things together. Right.
Alright -- thank you so much. This has been great. I really appreciate it. Sure, happy to help.
Okay.