Entropy

What follows is meant to be understood (more or less) by the average inhabitant of cyberspace. Please consult the sources in the bibliography for a more rigorous understanding of these issues.

Q 1: What is evolution and what is entropy?

Q 2. So what is the problem between entropy and evolution?

Q 3: Science changes a lot over time. Perhaps we are mistaken about entropy being an observed fact?

Q 4. But the 2nd Law refers to an isolated system! We live in an open system, therefore isn't this whole argument totally invalid?

Q 5. Didn't Dr. Ilya Prigogine solve the contradiction between evolution and entropy, for which he was awarded the Nobel Prize?

Q 6. Is the growth of a human being from a fertilized egg an example of order arising in nature, contradicting the entropy argument?

Q 7. We see examples of locally reduced entropy in nature, such as snow and ice crystals. Isn't the existence of these structures proof that intelligence is not necessary for complexity to arise in nature?

SUMMARY

Bibliography

Quotations

Q 1: What is evolution and what is entropy?

Just what is evolution anyway? I've defined evolution as a macroscopic increase in information content in a self-reproducing system without intelligent intervention. Critical to understanding the concept of biological evolution is the idea of progressive increase in complexity over time. Humans, dolphins, walnut trees and the whole wide world of life itself contain enormously more information and complexity than the first primeval cell from which evolutionary doctrine teaches we are all derived. Sir Julian Huxley, the eminent British biologist and grandson of "Darwin's Bulldog," defined evolution like this:

"Evolution in the extended sense can be defined as a directional and essentially irreversible process occuring in time, which in its course gives rise to an increase of variety and an increasingly high level of organization in its products. Our present knowledge indeed forces us to the view that the whole of reality is evolution -- a single process of self-transformation." (Emphasis mine.)
Julian Huxley, "Evolution in Genetics," in What is Man, J.R. Newman, ed. (NY: Simon & Schuster, 1955), p. 278.

What does this have to do with an obscure term (entropy) from an obscure field (thermodynamics)? Thermodynamics is the study of the dynamics of heat, but the application of thermodynamic principles is universal. Entropy, in particular, is a concept that is applicable in everything from information theory to engineering:

"It is a very broad and very general law, and because its applications are so varied it may be stated in a great variety of ways."
E.S. Greene, Principles of Physics (New Jersey: Prentice-Hall, 1962), p. 310.

The 2nd Law of Thermodynamics, also called the Law of Entropy, states that the total amount of entropy in an isolated system is always increasing. The system in question could be the entire universe (which taken as a whole can be assumed to be an isolated system) or a smaller system. But what is entropy? Entropy was defined in 1854 by Clausius as "the energy per degree of absolute temperature that cannot be recovered as work." (George Mulfinger, "History of Thermodynamics," in Thermodynamics and the Development of Order (Norcross, GA: Creation Research Society Books, 1981), p. 6.) Entropy represents the disorder of a system, the loss of available energy for work. That energy is not actually annihilated, just converted to an unusable form (heat). The fields of informational thermodynamics and statistical thermodynamics study the processes of entropy in information content such as DNA and in statistically definable structures such as those involved in abiogenesis.

In simpler terms, R.L. Wysong described the 2nd law in this way:
1. Systems will tend toward the most probable state.
2. Systems will tend toward the most random state.
3. Systems will increase entropy, where entropy is a measure of the unavailability of energy to do useful work.

The Creation-Evolution Controversy (Midland, MI: Inquiry Press, 1976), p. 241.
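The information-theoretic side of this concept can be made concrete with a small sketch. The following Python snippet is purely illustrative and is not drawn from any of the sources cited here; it computes Shannon entropy, the information-theory cousin of thermodynamic entropy (not a thermodynamic measurement itself), which scores a perfectly repetitive string as highly ordered and a varied string as far more disordered:

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p)),
    taken over the frequency p of each distinct symbol in the string."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly repetitive (highly ordered) string needs very little
# information to describe; a varied string needs much more.
ordered = "ABABABABABABABAB"
varied = "THE QUICK BROWN FOX JUMPS OVER A DOG"

print(shannon_entropy(ordered))  # 1.0 (two equally frequent symbols)
print(shannon_entropy(varied))   # about 4.2 bits per symbol
```

The repetitive string can be summarized in a few words ("AB, eight times"), while the varied string cannot be compressed nearly as much -- a first taste of the order/complexity distinction developed later in this article.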

Combined with the 1st Law of Thermodynamics (that matter/energy is neither created nor destroyed), we can say with certainty that the universe contained maximum order at the very beginning of its existence and has been running down ever since. We also know that the age of the universe is finite: entropy is increasing at a finite rate, and the fact that the universe is not yet completely disordered proves that the disordering process began a finite amount of time ago. Cosmological models that attempt to explain the origin of the universe need to keep these facts in mind. They need to explain:

1. The appearance of matter/energy (of the universe itself). What is a competent source of matter/energy if it cannot spontaneously spring into being by itself per the 1st Law (and also the law of cause and effect)?

2. The fact that the initial state of the universe, far from being chaotic and disordered, was the period of maximum order and organization throughout the universe. What is a competent source for the order and organization that must have been present in the beginning if the Law of Entropy demands that decay and disorganization is always increasing?

Q 2. So what is the problem between entropy and evolution?

Look again at the definitions of evolution above. Evolution is a process of increasing order and complexity over time. In the words of Huxley, it is an "essentially irreversible process... which... gives rise to an increase of variety and an increasingly high level of organization." But the Law of Entropy proclaims just the opposite will happen over time! We know from innumerable observations of the real world that degeneration and decay are occurring throughout the universe. Despite Huxley's patently absurd claim, "our present knowledge indeed forces us to the view that the whole of reality is" not evolutionary in nature but entropic in nature.

Dr. Henry Morris put it this way:

"Not only does the Second Law point back to creation; it also directly contradicts evolution. Systems do not naturally go toward higher order, but toward lower order. Evolution requires a universal principle of upward change; the entropy law is a universal principle of downward change."
H.M. Morris and G.E. Parker, What is Creation Science (El Cajon, CA: Master Books, 1987), p. 204.

Q 3: Science changes a lot over time. Perhaps we are mistaken about entropy being an observed fact?

The nature of entropy is such that it is easy to observe and study. The 2nd law was formulated in the middle of the 19th century and has required no modification in any form since then. The laws of thermodynamics may be considered among the most solidly observed and established empirical facts of science. Consider these quotes:

"Classical thermodynamics... is the only physical theory of universal content concerning which I am convinced that, within the framework of applicability of its basic concepts, it will never be overthrown."
Albert Einstein, in M.J. Klein, "Thermodynamics in Einstein's Universe," Science, 157(1967): 509. (Cited in Wysong, p. 248.)

"The second law of thermodynamics not only is a principle of wide reaching scope and application, but also is one which has never failed to satisfy the severest test of experiment. The numerous quantitative relations derived from this law have been subjected to more and more accurate experimental investigation without the detection of the slightest inaccuracy."
G.N. Lewis and M. Randall, Thermodynamics (NY: McGraw-Hill, 1961), p. 87.

"... if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation."
Sir A.S. Eddington, The Nature of the Physical World (New York: MacMillan, 1930), p. 74.

The "grand story" of evolution, of a world of developing complexity and organization over time, is a myth because it contradicts the law of entropy. It demands increases in information content, in organization and in complexity, and entropy specifies exactly the opposite. We have never "seen" evolution in the real world (except for degenerative changes that have been termed "evolution" like antibiotic resistance and processes of genetic impoverishment such as breeding) but we see entropy at work every day, from our aging bodies to corrupted computer files to rotting animals and rusting cars. This is what is so remarkable about evolutionism. It has managed to convince many people of a vision of reality that is just the reverse of what we actually see every day of our lives.

Q 4. But the 2nd Law refers to an isolated system! We live in an open system, therefore isn't this whole argument totally invalid?

To defend against the law of entropy, evolutionists have taught many children and adults that "the earth is an open system, therefore the entropy argument doesn't apply." Often people already biased against creation science then become closed-minded and dismissive toward the entropy argument. This is unfortunate, because this simplistic teaching reflects a distortion of the full entropy argument and warps the person's understanding of entropy in the real world.

First of all, note that in the broad sense the evolutionary worldview refers to the entire universe, which can indeed be treated as a closed or isolated system. That is what Huxley's quote in question #1 says when it speaks of the "whole of reality." The basic Entropy Law contradicts this belief directly.

Second, creationists and scientists in general do indeed recognize that the earth is an open system. But does that mean entropy can be ignored? As a practical matter all our manmade systems are open systems; can engineers therefore ignore the effects of entropy? Of course not. Entropy occurs in open systems as well as closed ones.

It is simply not true, despite the frequent claims of evolutionists - some of whom really should know better - that the 2nd Law applies only to closed systems (see Gish, 1993, pp. 162-163, etc.). Dr. J. Ross of Harvard writes:
"Ordinarily the second law of thermodynamics is stated for isolated systems, but the second law applies equally well to open systems... there is somehow associated with the field of far-from equilibrium phenomena the notion that the second law of thermodynamics fails for such systems. It is important to make sure that this error does not perpetuate itself." (Emphasis mine)
Chemical and Engineering News, July 17, 1980, p. 40.

In fact, as a rule of thumb entropy increases more rapidly in open systems than in closed systems. Unless there are precisely established control elements as described below, allowing energy to flow through a system will simply speed up the decay. Consider:

1. A freeze-dried meal is kept in its vacuum-sealed pouch. An identical meal pouch is torn open and left in the sun. Which contents will decay faster?

2. A mothballed building is enclosed in a solid bubble that blocks out everything from radiation to the weather; an identical building is left standing in the open. Which will break down and collapse more rapidly?

3. A deceased human body is placed in a secure and airtight coffin; another body is thrown in a dirt hole. Whose body will decay faster?

As a general rule, the more open the system, the faster decay will occur. But it is obvious (as the evolutionists point out) that this is not always true. If decay increased everywhere, at all times and at the maximum rate, we would reach instant universal heat death and there would be no organization or complexity anywhere. Yet this computer exists in front of me, an example of complexity and order in spite of the processes of decay affecting it. You and I exist, our bodies being sources of tremendous complexity amidst the constancy of entropy. How can these local patterns of order exist in a world of decay?

Scientists have identified four criteria which, if met, allow an open system to locally reduce entropy and increase order and organization within the system. People who rely on the "open system" argument need to acknowledge and understand that an open system is a necessary but not sufficient condition for complexity.

Four Requirements for Complexity (Biological or Otherwise) in a System:

1. System Must Be Open

2. An Adequate Energy Supply Must be Available

3. Energy Conversion Tools/Mechanisms

4. Blueprint/Template/Control System Must Exist to Organize Converted Energy

There is an element of precision which must exist among these components for the system to achieve reduced entropy. For example, for component #2 the energy available must be of the correct quality and quantity for the required output. An example of qualitative requirements would be a normal car engine. It needs gasoline to function and run the car up a hill, not diesel, natural gas, water, blood plasma, electricity, etc. And if 5 oz. of gasoline are required, a quantity of 3 oz. would fail to achieve the hill climb and the car would roll back down. Simply having some kind of energy, or some amount of the correct form of energy, does not guarantee anything.

Next, energy conversion tooling must be correct for the reduced entropy output. For example, construction tools of the correct types are required to build a house, not kitchen appliances, mainframe computers, car engines, metal-forges, farming implements, mining equipment, etc. Simply having some sort of energy conversion going on does not guarantee anything.

Lastly, blueprints must be specific and correctly matched with the energy and tooling for a reduced entropy output. If a certain protein molecule is to be constructed by a living cell, it will not be constructed by sections of DNA that code for non-protein molecules, or the wrong protein molecule.

Without highly coordinated tooling, blueprints and correct energy inputs, entropy increases (sometimes dramatically) in an open system - it does not decrease.

Consider the following questions regarding the proposed construction of a building. Will the building be constructed if:

1. All necessary construction equipment, power (electrical utilities, diesel generators), construction workers and necessary blueprints are available, but they cannot reach the construction site? Of the four items listed above, what is the missing component?

2. What if the workers and their equipment are at the site and they have power, but no blueprints?

3. What if the blueprints and power are available, but the workers and/or their equipment is missing?

4. What if the equipment, workers, and blueprints are at the site but there is no energy, no power?

The reason evolutionists are characteristically loath to delve into the open systems argument beyond superficial claims should be clear. In every case where we know (by observation) the origin of a complex, organized system, we invariably find that intelligence was a necessary agent. Whether in the construction of a skyscraper or a beehive, wherever we have observed the construction of a complex system we have found animal or human intelligence at work. Bees build hives, humans build skyscrapers, birds make nests, beavers build dams, and so on. The nature of open systems confirms and strengthens the argument for an intelligent designer of organic systems.

Q 5. Didn't Dr. Ilya Prigogine solve the contradiction between evolution and entropy, for which he was awarded the Nobel Prize?

Dr. Prigogine won his Nobel prize in 1977 for work on "dissipative structures" and the effects of entropy on "far-from equilibrium" structures. A sort of myth has developed since then that his work somehow "solves" the contradiction between entropy and evolution.

His research found that certain structures could be designed in experiments which caused orderly patterns to appear through local reductions in entropy. The structures and patterns tended to be ephemeral and unstable, but many expressed a conviction that this could somehow explain the origin of life from non-life. However, Prigogine himself denied this. With two coauthors, he wrote the following:

"The point is that in a non-isolated system there exists a possibility for formation of ordered, low-entropy structures at sufficiently low temperatures. This ordering principle is responsible for the appearance of ordered structures such as crystals as well as for the phenomena of phase transitions.
Unfortunately this principle cannot explain the formation of biological structures. The probability that at ordinary temperatures a macroscopic number of molecules is assembled to give rise to the highly ordered structures and to the coordinated functions characterizing living organisms is vanishingly small.
"
I. Prigogine, G. Nicolis and A. Babloyants, Physics Today 25(11):23 (1972).

In other words, the dissipative structures create a sort of simple template - component #4 of the list in question #4. Along with energy influx and the simple "tooling" of natural chemical interactions the template can cause simple order to form. But this order stems implicitly from the natural order of the universe. To be consistent with observed examples of order arising from human and animal activity, we should take this as evidence of intelligent design in the natural universe and the laws and chemical interactions thereof.

But is such order easy to attain in the lab? Thaxton, Olsen and Bradley, in their landmark work on the origin of life, have this to say about the research on dissipative structures:

"They [Nicolis and Prigogine] speculate that the low degree of spatial ordering achieved in the simple trimolecular model could potentially be orders of magnitude greater for the more complex reactions one might observe leading up to a fully replicating cell. The list of boundary constraints, relative reaction rates, etc. would, however, also be orders of magnitude larger. As a matter of fact, one is left with so constraining the system at the boundaries that ordering is inevitable from the structuring of the environment by the chemist."
C.B. Thaxton, W.L. Bradley and R.L. Olsen, The Mystery of Life's Origin (Dallas: Lewis & Stanley, 1984), p. 153.

In other words, to achieve more complex levels of output, the inputs must also become much more complex and carefully designed. This leads to a key issue. The past generation of origin-of-life theorists, I believe, has been relying on a critical logical fallacy. Rather than relying on "chance" for the formation of the first cell, they are now engaged in an attempt to specify conditions that would automatically lead to the formation of the first cell.

So far, so good. Reliance on chance is antithetical to science in the first place, and I suspect they may someday succeed. They may someday establish a process by which a cell could form from inorganic, naturally occurring components. But here is the problem: it will involve something like 512 earthquakes, 313 meteor showers, 67 comet strikes, 192 volcanic eruptions, and so on and on, each occurring in precise timing and strength of effect and in coordination with one another. And it may result in a functioning, reproducing cell. But then we have to ask: what is the chance of all that having actually happened in the past?

In effect, modern theorists are saying "OK, believing a car could just form by chance is unreasonable. But if we just assume the existence of a robotic, automated and fully supplied automobile factory, why then it naturally follows that a car would result...." They are leaping out of the frying pan and into the fire, demanding an even more complex initial set of conditions than the desired result (just as the law of entropy demands). Yet such initial conditions, by definition, would be even more unlikely. Better to just go back to the old "by random chance" models!

Dr. Prigogine knew he had not "solved" the entropy problem and that his work raised more new issues than it answered. Thus, Dr. Morris writes:

"It is now common for evolutionists to respond to creationist debaters on the implications of entropy merely by stating that Prigogine has solved the problem. The fact is, of course, he has done no such thing, and he himself has refused to debate with creationists on this issue."
What is Creation Science (El Cajon, CA: Master Books, 1987), p. 216.

Secondly, as Thaxton, Bradley, Olsen and many creation scientists going back to A.E. Wilder-Smith have long pointed out, the formation of simple patterns of order has nothing to do with the kind of complexity found in life. The simple patterns of order are the product of "rules" implicit in the natural order of the universe. By contrast, the patterns of complexity and order found in living organisms are the product of an entirely different set of "rules" of a much higher order, orders of magnitude more complex.

An analogy may be drawn between humans and monkeys sitting before typewriters. If we give each group an incentive to type and produce work for the enjoyment of judges, should we expect the same result? Should the humans produce work of a similar nature to that of the monkeys? No, because the rules and conventions guiding the typing of the humans (symbolic language) are of an entirely different nature than whatever simple rules would guide the monkeys' keystrokes. It is not simply a matter of high levels of order, but a different set of rules altogether, which assures different results. The works of the information scientist Hubert Yockey may be consulted profitably for more on this profound point, which is too little understood.

Q 6. Is the growth of a human being from a fertilized egg an example of order arising in nature, contradicting the entropy argument?

It has also been argued (more often by lay evolutionists than by scientists) that the growth of an animal or human from a single egg to adult somehow serves as an example of evolution overcoming entropy. This is clearly a mistaken analogy.

In the example of human growth it is important to understand that all the information required to construct the adult human body is present in the single fertilized egg cell. When an egg develops into an adult it is simply converting energy and raw materials into additional order and complexity, in accord with the criteria in question #4 above. The expressed complexity of an egg is lower than the expressed complexity of the same being in adult form, but the information content or potential complexity of the egg cell is the same as that of the entire adult.

In fact, entropy is occurring all the time, and the body never does achieve its maximum potential. Mutations occur in cells within the body itself, the organs suffer developmental defects, and our bodies become worn down by injuries and diseases that leave their mark. Eventually the growth of our body slows down, but the effect of entropy accumulates steadily. At our point of "maximum order" in early adulthood we have achieved much greater expressed complexity than we had as a fetus, but the complexity of our body is well below the potential order inherent in our DNA, which we could have achieved if entropy had not been acting from day 1. We begin to degenerate faster than our body grows or repairs itself, not because entropy is getting worse but because the growth process is tailing off. Entropy causes our body to become more and more disordered (we call this "growing old") as systems begin to lose effectiveness, slow down or shut down entirely. Eventually our bodies, or critical systems within them, suffer so much entropy that they cease functioning and death comes.

This is not analogous to evolution. In evolution the first cell did not have within it all the information needed to construct humans, elephants, walnut trees, etc. Evolution (it is claimed) is not a process of expressing pre-existing information (as biological growth is), it is a process by which new information comes to exist in the first place. To form that new information, that new complexity, the criteria of question #4 must still be applied.

Q 7. We see examples of locally reduced entropy in nature, such as snow and ice crystals. Isn't the existence of these structures proof that intelligence is not necessary for complexity to arise in nature?

The existence of local reductions of entropy, of increased order, is apparent not only in the labs of men like Prigogine, but also in nature. The formation of crystals is perhaps the best known example. Here we have a process by which cooling substances form themselves into patterns of order. Could this effect explain the origin and complexity of life? Does it prove intelligence is not required to achieve complexity?

In the case of crystals, order appears by removing energy from the system, not by adding it. Dr. Vardiman states, "crystal order results from the withdrawal of heat energy, whereas evolutionists argue that evolution sustains itself by the addition of heat energy from the sun" (Impact #162). As energy is removed from the system, atoms stop flying around so energetically and chemical bonding occurs, tying the atoms together in a solid matrix. Because the bonding follows stable laws of chemistry, patterns appear, resulting in crystallization.

Even though the process of crystallization runs in the opposite direction from presumed evolutionary scenarios, it still holds significance. Does crystallization follow the four-point criteria outlined in question #4? Is intelligence required for crystallization to occur?

Regarding the four-point criteria, yes, crystallization does follow them. The system must be open, as always. Adequate energy, required to achieve the chemical bonding between atoms, must be present. The energy-conversion and template requirements are effectively fulfilled by the laws of chemistry themselves. In other words, the rules governing chemical interactions form a kind of simple template that yields order.
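The idea that law-like rules act as a simple template can be illustrated with a toy model. The sketch below is purely hypothetical (it is not from any of the sources cited here): a deterministic "bonding rule," in which the current end unit of a chain fully dictates the next unit attached, can only ever produce endlessly repetitive order.

```python
def grow_crystal(seed, rule, steps):
    """Grow a 1-D 'crystal' by repeatedly applying a fixed bonding rule:
    the unit attached next is fully determined by the current end unit."""
    chain = seed
    for _ in range(steps):
        chain += rule[chain[-1]]
    return chain

# A simple deterministic rule: an A-unit bonds to a B-unit, and vice versa.
rule = {"A": "B", "B": "A"}
print(grow_crystal("A", rule, 11))  # ABABABABABAB
```

No matter how long the chain grows, the rule can only repeat its fixed pattern -- the analogue of a salt crystal always remaining a salt crystal.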

So is intelligence required? To answer that we would need to know the origin of the laws of chemistry. Were they intelligently ordered or are they a product of nature? If we disregard documentation like the book of Genesis, we have no observational evidence regarding the origin of chemical laws. As noted above, when we do know the origin of an ordered system we find intelligence is necessary. It would be begging the question to assume that intelligence is not necessary for crystals to form. A better deduction would be to expect the laws from which the order is derived are themselves the product of a lawgiver, making crystallization consistent with observed examples of entropy reduction.

Still, if intelligence could have yielded natural laws that produce crystals, couldn't life and evolutionary patterns also be the product of this intelligent agent? Can we explain the origin of life and the complexity we see today in a manner akin to crystal formation?

No, in fact, processes of natural order would actually inhibit the appearance of higher order such as is found in life. We will look at another monkey analogy to see why.

Imagine two monkeys. Monkey #1 types in a truly random fashion on his simplified 27-key typewriter (26 letters plus a space bar - we won't worry about punctuation). Monkey #2, perhaps more realistically, does not hit the keys in a totally random manner. He tends to pound the same key repeatedly, or to type simple patterns repetitively. Monkey #2 has odd quirks and patterns in his behavior, such as always typing the "H" key after the "T" key, and he seems to like pressing the "E" key a lot. In comparing monkey behavior with the behavior of atoms, Monkey #2 is more realistic, because atoms don't just join up with other atoms in a totally random pattern. Some atoms are much more likely to join up with others because of bonding patterns and valences (remember all that stuff you learned in chemistry), and some molecules are more likely simply because their component atoms occur much more frequently in nature.

Now, here is the question. Which monkey is more likely to produce a phrase like TO BE OR NOT TO BE THAT IS THE QUESTION?
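The arithmetic can be sketched out. The following Python snippet is a deliberately simplified, hypothetical model (the 27-key typewriter and the target phrase come from the text above; the probability treatment is my own simplification): Monkey #2's built-in "H after T" habit makes the phrase not merely improbable but impossible, because the phrase contains "T" followed by letters other than "H".

```python
from fractions import Fraction

PHRASE = "TO BE OR NOT TO BE THAT IS THE QUESTION"

# Monkey #1: each of the 27 keys (26 letters + space) is equally likely,
# so every keystroke matches the target with probability 1/27.
p_random = Fraction(1, 27) ** len(PHRASE)

# Monkey #2: identical, except a built-in habit forces "H" after every "T".
# Any target containing "T" followed by anything but "H" is then impossible.
def biased_probability(phrase):
    for a, b in zip(phrase, phrase[1:]):
        if a == "T" and b != "H":
            return Fraction(0)  # the habit forbids this sequence outright
    # Otherwise treat keystrokes as uniform (ignoring the habit's other effects).
    return Fraction(1, 27) ** len(phrase)

print(float(p_random))             # astronomically small, but nonzero
print(biased_probability(PHRASE))  # 0 -- "TO" and "T " can never be typed
```

The purely random monkey has a vanishingly small but nonzero chance; the pattern-bound monkey has none at all, illustrating the article's point that simple imposed order can prevent, rather than produce, higher order.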

Work in progress....

SUMMARY

Simple templates (spatial and chemical relationships of atoms) and tooling (laws of chemical interaction), together with naturally available energy, can yield simple repetitive organized structures (crystals). We cannot extrapolate these examples to more complex structures in nature without changing the laws of chemistry in the process (salt crystals are always salt crystals; they never "evolve"). "... evolution is supposed to be open-ended, continuing indefinitely its growth in order, whereas a crystal, once formed deterministically by the pre-coded system which produced it, is at a dead end, and can go no further toward higher order." (Evolution and the Snowflake)

(Notes)

[Emphasize difference between order and complexity with monkey-typewriter examples; "Only recently has it been appreciated that the distinguishing feature of living systems is complexity rather than order." - Thaxton, et al, p. 129 below. They credit a 1973 work by Leslie Orgel and later authors, but creationist Dr. A.E. Wilder-Smith was making the same point years earlier in his The Creation of Life: A Cybernetic Approach to Evolution ]

Illustrate with the monkey story - simple patterns of order destroy or prevent higher levels of order: asfhjoerao ireawae8fh vs. shshshshshshshsh vs. To be or not to be...

"Each physical agent operating at a higher level must function with greater order and power than the effect it produces. The ultimate cause which controls all secondary processes must have infinite power and organizing intelligence." (Evolution and the Snowflake)

Bibliography

The following books have good discussions of entropy in relation to evolution, explaining the contradiction in some detail.

Gish, Duane, Creation Scientists Answer Their Critics (El Cajon, CA: Institute for Creation Research, 1993), pp. 151-208, 387-439. An extensive and readable overview of this field of the creation/evolution controversy in some depth directly countering and correcting opponents' claims. Some technical aspects of thermodynamics covered.

Morris, Henry M. and Gary E. Parker, What is Creation Science? (El Cajon, CA: Master Books, 1987), pp. 190-222. A standard science-only text on origins.

Pitman, Michael, Adam and Evolution (London: Rider & Company, 1984), pp. 229-234. An engaging book by a crypto-creationist, a biology instructor at Oxford University.

Thaxton, Charles, Walter Bradley and Roger Olsen, The Mystery of Life's Origin (Dallas: Lewis & Stanley, 1984), pp. 113-166. The bestselling textbook on abiogenesis in the 80's, contains a rigorous, mathematical discussion of the entropy problem. Written by old-earthers, but still an extremely good book for the topics covered.

Vardiman, Larry, "Evolution and the Snowflake," Impact #162 (Institute for Creation Research) December 1986, pp. i-iv.

Williams, Emmett, ed., Thermodynamics and the Development of Order (Terre Haute, IN: Creation Research Society Books, 1981), 141 pages. Essays by a number of CRS scientists on the subject. For more technical readers.

Wysong, R.L., The Creation-Evolution Controversy (Midland, MI: Inquiry Press, 1976), pp. 239-263.

[A.E. Wilder-Smith, The Creation of Life...]

Quotations

"Throughout Chapters 7-9 we have analyzed the problems of complexity and the origin of life from a thermodynamic point of view. Our reason for doing this is the common notion in the scientific literature today on the origin of life that an open system and mass flow is a priori a sufficient explanation for the complexity of life. We have examined the validity of such an open and constrained system. We found it to be a reasonable explanation for doing the chemical and thermal entropy work, but clearly inadequate to account for the configurational entropy work of coding (not to mention the sorting and selecting work)." (Thaxton, et al, p. 165.)

"The probability that at ordinary temperatures a macroscopic number of molecules is assembled to give rise to the highly ordered structures and to the coordinated functions characterizing living organisms is vanishingly small. The idea of spontaneous genesis of life in its present form is therefore highly improbable, even on the scale on billions of years during which prebiotic evolution occurred."
Nobel Laureate Dr. Ilya Prigogine, in Physics Today, November 1972, p. 23. (Cited in Thaxton, et al, p. 121.)

"Needless to say, these simply remarks cannot suffice to solve the problem of biological order. One would like not only to establish that the second law (dSi > or = 0) is compatible with a decrease in overall entropy (dS<0), but also to indicate the mechanisms responsible for the emergence and maintenance of coherent states."
Prigogine, I. and G. Nicolis, Self Organization in Nonequilibrium Systems (New York: John Wiley, 1977), p. 25.




(Created: 3 February 1997 - Last Update: 25 June 1997)