
Complexity and Life
Jan 31, 2010


The human endeavor of defining life as we know it has been historically significant, yet largely futile. The journey began with the early (almost prehistoric) observation of animate objects, which differed markedly from the inanimate ones around us by virtue of their ability to move in a somewhat willful manner, or at least in motions not completely governed by the direct forces of nature. Upon relaxing the time scale of the motions, one would observe that plants, too, somewhat fall under this category. The earliest attempts at defining life were from a biologist's point of view: a living being is an entity that grows, has certain capabilities of motion (reacts to stimuli), feeds (transforms energy obtained from its surroundings to perform those motions), and reproduces. Adaptation and evolution were concepts introduced much later, and presently form an integral aspect of the definition of life. As the biologist John Maynard Smith points out, "entities are alive if they have the properties of multiplication, variation and heredity", rather than the orthodox view of defining life by external appearance and behavior. This kind of insight into the definition of life has been predominant in the scientific community since the days of Charles Darwin, and is probably as close as we can get to understanding life.

Life and Entropy

Life is complexity - the complex organization of atoms and molecules into building blocks that form molecular machines like cells, which in turn organize into even more complex machines like living organisms. Life and entropy have been a central theme in defining life from a physicist's point of view. Entropy and complexity (or information content) are, loosely speaking, inversely related quantities. While the general trend of any system is to increase its entropy (i.e. decrease its complexity), it is well understood that somehow, in certain regions of space-time, the entropy can decrease (while increasing the entropy of the surroundings even more, such that the net entropy of the whole universe still increases). And that is precisely how you get life and complexity. However, the process by which entropy decreases in such local regions is poorly understood. Even more poorly understood is the connection between this decrease in entropy and the standard evolutionary theories which biologists use to explain the origin and evolution of life. At this point, let's revisit some of the basics of entropy and complexity to get an insight into how poor our understanding is.

Entropy: Entropy is a statistical concept. For a moment, let's forget all that we learnt about entropy in the high-school thermodynamics class. One of the most lucid explanations of the concept of entropy I have come across is from the book "A Brief History of Time". There the author takes the example of a box of Lego bricks (the original example had jigsaw puzzle pieces). You cannot see inside the box, nor can you touch the bricks. The box is large enough to have lots of empty space even with the Lego bricks (say 100 of them) inside it. Now imagine you start shaking the box, and you shake it for, say, 5 minutes. You open the box, and there is a good probability that you'll find a brick or two have latched onto each other. You close the box and shake it for another 5 minutes. As you open it, you will probably expect almost the same number of accidental latchings - maybe the previous latches were broken and new ones were created. If you repeat this process, say, a hundred times, what is the maximum number of such accidental latchings you can expect in any one shake? Maybe, at most, 5 pairs of bricks out of the 100 in the box? That would be a lucky shake! Lucky, eh? Then how about this: after some REALLY lucky shake, you open the box and find that a Lego toy helicopter has been created inside - all by itself - purely by a stroke of luck! Now tell me, how lucky would you need to be for such a thing to happen? Very lucky indeed. However, you can probably improve your luck this way: instead of shaking for 5 minutes, do the shaking for 1 million years. Since you now have a much larger time for shaking, the chances are better that the pieces organize into a helicopter at some point during that million years (although when you finally open the box after a million years, the helicopter might have broken back into its constituent bricks).
Again, you could be shaking a million bricks instead of just 100, inside a really BIG box. This further increases your chance of creating at least one helicopter purely by luck. It is like playing a lottery again and again, with every member of your family trying too. The more times you try, and the more family members who try, the greater the overall probability of your family hitting the jackpot.
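This repeated-trials intuition is easy to make quantitative. Assuming each independent "shake" (or lottery ticket) has some tiny success probability p - the numbers below are made up purely for illustration - the chance of succeeding at least once in n trials is 1 - (1 - p)^n, which creeps toward certainty as n grows:

```python
def p_at_least_once(p, n):
    """Probability of at least one success in n independent
    trials, each with success probability p."""
    return 1.0 - (1.0 - p) ** n

# Hypothetical chance of a "lucky shake" producing order: one in a billion.
p = 1e-9
for n in (10**3, 10**6, 10**9):
    print(f"{n:>12} shakes -> P(at least one success) = {p_at_least_once(p, n):.6f}")
```

With a billion shakes the probability climbs to roughly 1 - 1/e (about 0.63), even though any single shake was hopeless. That is the whole trick behind "more time, more bricks": n becomes astronomically large.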

But come on - creating a helicopter out of Lego bricks just by shaking the box? That sounds too improbable even if one is given a billion years to shake a billion bricks! Or does it? Then consider yourself and all the living beings around you. All of us are the random outcome of just such a really long shaking! The box was this planet, and the bricks were the atoms on it. Instead of 100 bricks there were zillions of atoms, and instead of 5 minutes it was shaken for some 3.5 billion years! And the outcome? Complex, self-replicating, living beings like us! It's like telling somebody to shake a box of gypsum plaster powder and promising that he or she will eventually create a statue of George Washington out of it.

That, essentially, is the concept of entropy. Entropy is inversely related to the probability of creating some well-organized, complex system out of a system that is most likely to remain in (or rather move towards) random states/configurations. By shaking the box of Lego bricks one would expect to end up with a more random arrangement of the bricks rather than an organized one like a helicopter. This has a more mathematical formulation, but let's not bother much about that here. For now it's enough to grasp the simple intuitive idea that for any system which can have a large number of possible disordered states (like the bricks in random configurations) and very few highly ordered states (like the bricks fitting together to form a nice structure), it is far more likely to find the system in one of the disordered states. Moreover, assuming that the shaking of the box is very vigorous, it's also quite natural to expect that even if some orderly structure does get formed, it will eventually break back into its constituent bricks due to the shaking. This very statistical concept gives rise to the second law of thermodynamics, which says that the entropy of a closed system always increases (a fancy way of saying that at every instant, the probability that the system will become more disorganized than its previous configuration is MUCH higher than the probability that it will become more organized).
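The counting argument behind this can be sketched with a toy model (the numbers here are invented for illustration, not taken from the article): suppose each of N bricks can sit in any of M positions, and exactly one joint configuration counts as the "helicopter". Vigorous shaking amounts to sampling configurations at random, so order is overwhelmingly unlikely; Boltzmann's formula S = k ln W (here in units of k) quantifies how much higher the entropy of the disordered macrostate is:

```python
import math

# Toy model: N bricks, each in one of M possible positions.
N, M = 10, 6
W_total = M ** N      # total number of microstates (configurations)
W_ordered = 1         # microstates that look like the "helicopter"

# A random shake picks one microstate; order is a needle in a haystack.
p_order = W_ordered / W_total
print("microstates:", W_total, "  P(ordered) =", p_order)

# Boltzmann entropy (in units of k): S = ln W for each macrostate.
S_disordered = math.log(W_total - W_ordered)
S_ordered = math.log(W_ordered)   # ln 1 = 0
print("entropy gap (disordered - ordered):", S_disordered - S_ordered)
```

Even with just 10 bricks and 6 positions there are over 60 million configurations, and only one of them is "organized". Scale N up to zillions of atoms and the disordered states dominate so completely that spontaneous order looks, for all practical purposes, impossible - which is exactly why its local appearance in living systems is so striking.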

The loopholes in the Second Law of Thermodynamics: So far we have understood entropy and why one should expect it to increase.

TODO: Finish this article


Science Physics Life Biology

Page last modified on February 23, 2010, at 08:40 PM EST.
(cc) Subhrajit Bhattacharya