Codes and thermodynamics
http://reasonandscience.heavenforum.org/t2405-codes-and-thermodynamics
Perry Marshall, Evolution 2.0
We understand codes better than we understand gravity, the laws of thermodynamics, or quantum physics. No one knows how to create gravity, but millions of people know how to create code, including some of the wealthiest businesspeople in the world. Everything we know from computer science provokes a huge question: How do you get a code without a coder? And, as you’ll soon start asking: How can code write itself? These questions challenge the boundaries of science and religion.
Formally, heat entropy is the principle in thermodynamics that says that the path from order to disorder is irreversible. Entropy is partly why, after you’ve burned a tank of gas in your car, the exhaust is never going to rush back into your tailpipe, reverse-combust in your engine, and turn back into gasoline.

Information entropy is similar, but it applies to data instead of heat. It’s not just a pattern of disorder; it’s also a number that measures the loss of information in a transmitted signal or message. A unit of information—the bit—measures the number of choices (the number of possible messages) symbols can carry. For example, an ASCII string of seven bits has 2⁷, or 128, message choices. Information entropy measures the uncertainty that noise imposes on the original signal. Once you know how much information you started with, entropy tells you how much you’ve lost and can’t get back. Information entropy is not reversible, because once a bit has been lost and becomes a question mark, it’s impossible to get it back. Worse yet, the decoder doesn’t report the question mark! It assigns it a 1 or 0. Half the time it will be right. But half the time it will be wrong, and you can’t know which bits are the originals and which bits are only guesses.
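The bit-counting and the decoder’s forced guessing can be sketched in a few lines of Python. This is my illustration, not code from the book: the 20% erasure rate and the message length are assumed values chosen just to make the effect visible.

```python
import random

# A 7-bit ASCII symbol can encode 2**7 = 128 distinct messages.
message_choices = 2 ** 7

# Simulate bit loss on a transmitted message: each lost bit becomes a
# "question mark" that the decoder must guess as 0 or 1. The receiver
# cannot tell which bits are originals and which are guesses.
random.seed(1)
original = [random.randint(0, 1) for _ in range(10_000)]
erased = [random.random() < 0.2 for _ in original]   # assumed 20% loss
decoded = [random.randint(0, 1) if e else b          # decoder guesses
           for b, e in zip(original, erased)]

wrong = sum(b != d for b, d in zip(original, decoded))
# Roughly half of the ~2,000 erased bits come back wrong.
print(message_choices, wrong)
```

On average, half the guessed bits land wrong, and no amount of reprocessing the decoded stream recovers which ones: the information is gone, which is the irreversibility the passage describes.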
Random Mutation = Noise
Every communication system battles noise. Noise is a random change of a signal—in other words, noise is a mutation of the original message. You can listen to it, too. Just tune your radio dial to the space between stations and listen to the hiss. Or listen to the sound of air whooshing out of a vent in the ceiling. White noise. When you want to watch your favorite TV show, noise is your enemy.

Once noise is with you, it’s with you forever. Toast never gets hotter after you take it out of the toaster. CDs never sound better after you copy them onto a cassette tape; you can make a million copies of that tape, and even if natural selection cooperates perfectly, the best you can do is select the least inferior version. Sometimes people disagree, offering hypothetical scenarios in which a few random changes might just happen to give you the exact step you need. However, since a random change could happen anywhere across a billion base pairs in the genome, getting two or three or five errors to occur in an advantageous way without breaking something else first is statistically all but impossible. Again, nobody appreciates this more than a communications engineer.†

Random mutation is noise, and noise destroys. Random mutations = damaged DNA. We saw that in the fruit fly experiments. Any randomness-based theory of evolution violates the laws of information entropy. Music doesn’t get better when you scratch CDs. Organisms do not gain new features when their DNA mutates through damage or copying errors. Instead they get cystic fibrosis or some other birth defect, like legs instead of antennae growing out of a fruit fly’s head. Natural selection can clear competition by killing off inferior rivals. But it can’t work backward from a random mutation and undo the damage.
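The tape-copying argument can be turned into a small simulation. This is my own sketch, not the author’s: the tape length, mutation rate, and generation count are assumed values, and "selection" is idealized as always keeping the least-mutated copy of each generation.

```python
import random

random.seed(42)

# Copy a bit string ("tape") through many generations. Each copy flips
# bits at random (noise); an idealized selector keeps the copy with the
# fewest changes from its parent. Even with perfect selection, errors
# relative to the original only accumulate.
BITS, COPIES, GENERATIONS, FLIP_P = 1000, 5, 50, 0.01  # assumed values

original = [random.randint(0, 1) for _ in range(BITS)]
current = original[:]

def noisy_copy(tape):
    """Return a copy with each bit flipped with probability FLIP_P."""
    return [b ^ 1 if random.random() < FLIP_P else b for b in tape]

for _ in range(GENERATIONS):
    candidates = [noisy_copy(current) for _ in range(COPIES)]
    # "Perfect" selection: keep the least inferior copy of this batch.
    current = min(candidates,
                  key=lambda c: sum(a != b for a, b in zip(c, current)))

final_errors = sum(a != b for a, b in zip(current, original))
print(final_errors)  # nonzero: damage accumulates despite selection
```

Selection here can only pick the best of the damaged copies on offer; it has no way to restore a flipped bit, so fidelity to the original drifts downward generation after generation—the "least inferior version" point in the passage above.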