The First Law of Complexodynamics
I am reading this paper because it appears on the list of roughly 30 papers that Ilya Sutskever recommended to John Carmack as covering what really matters for machine learning / AI today. The purpose of this blog post is to answer the question: "why does “complexity” or “interestingness” of physical systems seem to increase with time and then hit a maximum and decrease, in contrast to the entropy, which of course increases monotonically?"
Reference Link to Shtetl-Optimized Blog Post
Contents
1.1 References
1.2 Related
1.2.1 Ten Things Everyone Should Know About Time
1.2.2 Kolmogorov Complexity
1.3 Notes
Chapter 1
Notes
1.1 References
- Link to Blog Article
- Sean Carroll Blog
- Ten Things Everyone Should Know About Time
- Time is Out Of Joint
1.2 Related
1.2.1 Ten Things Everyone Should Know About Time
- Time Exists: Time organizes the universe into an ordered series of moments
- The past and future are equally real: Every event in the past and future is implicit in the current moment.
- Everyone experiences time differently: Einstein explained that how much time elapses for a person depends on how they travel through space
- You live in the past: About 80 milliseconds in the past. Our conscious experience takes time to assemble, and your brain waits for all the relevant input before it experiences the ”now”. Experiments have shown that the lag between things happening and us experiencing them is about 80 milliseconds.
- Your memory isn’t as good as you think: Your brain uses a similar mechanism for remembering the past as it does for dreaming about the future.
- Consciousness depends on manipulating time: The ability to manipulate time and possibility is a crucial feature of consciousness.
- Disorder increases as time passes
- Complexity comes and goes: Entropy increases, but complexity is ephemeral; it increases and decreases in complex ways
- Aging can be reversed: Reversing the arrow of time for living organisms is a technological challenge, not a physical impossibility
- A lifespan is a billion heartbeats: Larger animals live longer, but they also metabolize more slowly, as manifested in slower heart rates. These effects cancel out, so that animals from shrews to blue whales have lifespans with just about the same number of heartbeats — about one and a half billion, if you simply must be precise. In that very real sense, all animal species experience “the same amount of time.”
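The "constant number of heartbeats" claim is just an arithmetic cancellation, which a quick back-of-the-envelope calculation makes concrete. The heart rates and lifespans below are illustrative round numbers of my own choosing, not measured data:

```python
MINUTES_PER_YEAR = 60 * 24 * 365

def lifetime_beats(beats_per_minute: float, lifespan_years: float) -> float:
    """Total heartbeats over a lifetime."""
    return beats_per_minute * MINUTES_PER_YEAR * lifespan_years

# Illustrative round numbers: small, fast-metabolizing animals beat fast
# and die young; large ones beat slowly and live long.
shrew = lifetime_beats(600, 2)    # about 0.63 billion beats
whale = lifetime_beats(10, 80)    # about 0.42 billion beats
```

Despite a 60x difference in heart rate and a 40x difference in lifespan, the two totals land within a small factor of each other, on the order of a billion beats.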
1.2.2 Kolmogorov Complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program that produces the object as output. It is a measure of the computational resources needed to specify the object, and it is also known as algorithmic complexity.
def generate_string_1():
    return "ab" * 16  # short description ("ab" repeated 16 times): low Kolmogorov complexity

def generate_string_2():
    return "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"  # no obvious pattern: complexity close to the string's own length

Both strings are 32 characters long, but the first admits a much shorter description than the second.
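Kolmogorov complexity itself is uncomputable, but off-the-shelf compression gives a crude, computable upper bound on it: a string with a short description tends to compress well. A minimal sketch using Python's standard zlib module (the specific strings are the two from the example above):

```python
import zlib

def compressed_size(s: str) -> int:
    """Length of the zlib-compressed string: a crude upper bound on K(s)."""
    return len(zlib.compress(s.encode("utf-8")))

patterned = "ab" * 16                                 # short description exists
random_looking = "4c1j5b2p0cv4w1x8rx2y39umgw5q85s7"   # no obvious pattern

print(compressed_size(patterned))       # the repetitive string compresses well
print(compressed_size(random_looking))  # the random-looking one barely shrinks
```

Compression length is only a proxy: it upper-bounds Kolmogorov complexity but can badly overestimate it (e.g., the digits of pi look incompressible to zlib yet have a short program).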
1.3 Notes
At FQXi’s Setting Time Aright conference, Sean Carroll posed the following question:
Why does “complexity” or “interestingness” of physical systems seem to increase with time and then hit a maximum and decrease, in contrast to the entropy, which of course increases monotonically?
The purpose of this blog post is to sketch a possible answer to Sean’s question. The Second Law of Thermodynamics says that the entropy of any closed system tends to increase with time until it reaches a maximum value. The question is: Why did the universe’s initial state at the big bang contain so much order for the universe’s subsequent evolution to destroy?
In the image above, entropy increases monotonically from left to right, but intuitively, the ”complexity” seems highest in the middle picture: the one with all the tendrils of milk. The same is true of the universe: it started as a low-entropy soup of high-energy particles, and many years from now, the universe will just be a high-entropy soup of low-energy particles. But today, the universe contains many interesting structures.
Scott Aaronson proposes that the question of complexity can be answered using a notion called sophistication, from the theory of Kolmogorov complexity. The Kolmogorov complexity of a string x is the length of the shortest possible computer program that outputs x.
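For reference, here is my paraphrase of the usual formal definitions, where U is a fixed universal Turing machine, |p| is the length of a program p, and c is a constant slack parameter:

```latex
K(x) = \min \{\, |p| : U(p) = x \,\}

\mathrm{Soph}_c(x) = \min \{\, K(S) : x \in S,\ K(x) \ge K(S) + \log_2 |S| - c \,\}
```

Intuitively, a description of x in two parts — a set S plus an index of x within S — separates the "structured" part, K(S), from the "random" part, log2 |S|. The sophistication of x is the size of the structured part in a near-optimal two-part description; both a perfectly ordered string and a perfectly random string have low sophistication.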