Simon Conway Morris (Evolutionary Palaeobiology, University of Cambridge), The Runes of Evolution, p. 38
argues that adaptation is not an undirected, random walk through all possibilities. For example, when muscle tissue evolves into electricity-producing organs, as in electric fish, the process requires very precise amino acid replacements at specific sites, together with accelerated evolution of the new function. Conway Morris concludes that:
“there is little doubt that these changes are very far from random”
He therefore argues that while the underlying principles of Darwinian evolution are correct, they do not provide a complete explanation of development, and a more comprehensive theory of evolution is required.
Paul Davies, The Demon in the Machine, p. 78
A lot of E. coli suffered glucose deprivation. When the dust settled, this is what emerged. Mutations are not random: that part is correct. Bacteria have mutational hotspots – specific genes that mutate up to hundreds of thousands of times faster than average.
Starving bacteria, Rosenberg discovered, can switch from a high-fidelity repair process to a sloppy one. Doing so creates a trail of damage either side of the break, out as far as 60,000 bases or more: an island of self-inflicted vandalism. Rosenberg then identified the genes for organizing and controlling this process. It turns out they are very ancient; evidently, deliberately botching DNA repair is a basic survival mechanism stretching back into the mists of biological history. By generating cohorts of mutants in this manner, the colony of bacteria improves its chances that at least one daughter cell will accidentally hit on the right solution. Natural selection does the rest. In effect, the stressed bacteria engineer their own high-speed evolution by generating genomic diversity on the fly.
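Davies' point about "cohorts of mutants" can be put in rough numbers. A minimal sketch in Python (the colony size and rates are illustrative assumptions, not Rosenberg's measured values) of how a hotspot rate "hundreds of thousands of times faster" turns an unlikely rescue into a near-certain one:

```python
import math

def p_any_rescue(cells, rate_per_cell):
    """P(at least one cell in the colony acquires the rescuing mutation),
    i.e. 1 - (1 - rate)^cells, computed stably via log1p."""
    return 1.0 - math.exp(cells * math.log1p(-rate_per_cell))

colony = 10**8             # cells in a starving colony (assumed)
baseline = 1e-9            # spontaneous rate for the specific change (assumed)
stressed = baseline * 1e5  # hotspot rate, ~10^5-fold faster

print(p_any_rescue(colony, baseline))  # ≈ 0.095: rescue unlikely
print(p_any_rescue(colony, stressed))  # ≈ 1.0:  rescue near-certain
```

With these (assumed) numbers, the baseline colony has less than a 10% chance that any cell finds the rescuing mutation, while the hypermutating colony is virtually guaranteed to, which is the survival logic the passage describes.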
And all available scientific evidence also indicates that evolution is an engineered process. In engineering and computer science, evolution never happens by accident. It’s always the result of a deliberate act. A program that can self-evolve is always considered an engineering marvel. 6
The macroscopic signals that a cell receives from its environment can influence which genes it expresses — and thus which proteins it contains at any given time — or even the rate of mutation of its DNA, which could lead to changes in the molecular structures of the proteins. 8
The genome has traditionally been treated as a Read-Only Memory (ROM) subject to change by copying errors and accidents. 7 I propose that we need to change that perspective and understand the genome as an intricately formatted Read-Write (RW) data storage system constantly subject to cellular modifications and inscriptions. Cells operate under changing conditions and are continually modifying themselves by genome inscriptions. These inscriptions occur over three distinct time-scales (cell reproduction, multicellular development and evolutionary change) and involve a variety of different processes at each time scale (forming nucleoprotein complexes, epigenetic formatting and changes in DNA sequence structure). Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences.
Ever since the formulation of the neo-Darwinist Modern Synthesis evolutionary theory in the 1930s and 1940s, it has been an article of faith that hereditary variation results from stochastic copying errors and unavoidable damage to the genome.
In the past 60 years, since the structure of DNA was elucidated, molecular biologists have studied the basic mechanisms of long-term genome change. They have discovered a wide array of proofreading and damage repair biochemical systems that remove copying errors and correct DNA damage. At the same time, they have revealed an amazing and wholly unanticipated array of cellular and molecular systems that operate to generate genome variability, both temporary and structural. As we begin the second decade of the 21st century, accumulating empirical evidence has thus shifted the perspective on genome variation to that of an active inscription process changing the information passed on to future generations.
It is clear that new gene alleles are accumulating in populations today, but there are two possible sources for these changes: mutations, and intentional changes introduced by genetic recombination. The theory of evolution attributes the continued production of genetic diversity to mutations, but evolutionists overlook the fact that the cell was intelligently designed. The cellular machinery was programmed to perform a level of genetic self-engineering, and it edits genes systematically so that organisms can adapt to a wide variety of environmental conditions.
Evolutionists contend that mutation, acted upon by natural selection, is the mechanism for evolutionary advancement. While this mechanism has the power to change the genome over time, most biological evolution is actually due to genetic recombination followed by natural selection. Evolutionary biologists have put forward many examples that attempt to show how new genes have been introduced into the genome of an organism. In most documented cases, however, these merely illustrate the built-in plasticity or variation within the original created kind. Mere shuffling of already-existing genes is woefully inadequate if the observational science is followed.
Despite the few examples of beneficial genetic mutations, it is unrealistic to assume that information produced by changing already-existing DNA would then be acted on again, many more times, by other related mutations to build structures radically different from and more complex than what was there previously. That is to say, mutations are not a reasonable means of producing cascading morphological change from one kind of animal to another, but merely speciation.
Biophysicist Dr. Lee Spetner in his book, Not by Chance: Shattering the Modern Theory of Evolution, analyzed examples of mutational changes that evolutionists claimed were increases in information, and demonstrated that they were actually examples of loss of specificity, meaning loss of information.
“In all the reading I've done in the life-sciences literature, I've never found a mutation that added information. … All point mutations that have been studied on the molecular level turn out to reduce the genetic information and not increase it.” – Spetner
“We see then that the mutation reduces the specificity of the ribosome protein, and that means a loss of genetic information. ... Rather than saying the bacterium gained resistance to the antibiotic, it is more correct to say that it lost sensitivity to it. ... All point mutations that have been studied on the molecular level turn out to reduce the genetic information and not increase it.”
Georgia Purdom, Ph.D. in molecular genetics, has stated:
“Mutations only alter current genetic information; they have never, ever been observed to add genetic information; they can only change what is there. I have a lot of papers come across my desk of supposed mutations that have added genetic information, and I've read them all, and I've looked at them all, and never once have I seen one that has added genetic information; they just don't do that.”
Nonrandom mutations 4
Mutations are supposed to be accidental, undirected events that are in no way adaptive. For example, if an animal species needs thick fur to survive in a cold climate, it will not respond by growing fur; rather, any animals who undergo random genetic changes that happen to result in thick fur will survive to produce more offspring. As Robert Gilson says, ‘The doctrine of random variation is just as unprovable as is the doctrine of the Virgin Birth, and just as sacrosanct to its adherents.’1
Attempts to justify the doctrine of random mutations usually refer to a series of experiments on the bacterium E. coli in the late 1940s and early 50s. These experiments found that when bacterial cells are suddenly subjected to a particular selection pressure (e.g. the addition of a lethal antibiotic), a small proportion of cells invariably survive because they contain a mutation that confers resistance to the antibiotic. Tests were then carried out which proved that the mutations were present in the surviving cells before the antibiotic was added to the culture, and that they were therefore truly spontaneous and nonadaptive. However, the original researchers recognized that this did not rule out the possibility of adaptive, nonrandom mutations.2
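The logic of those E. coli experiments (the Luria–Delbrück fluctuation test) can be sketched in a short simulation. The mutation rate, generation count and number of cultures below are illustrative, not the historical values:

```python
import math
import random
from statistics import mean, pvariance

def poisson(lam, rng):
    """Sample a Poisson count (Knuth's method; fine for modest rates)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def spontaneous_culture(generations=20, mu=1e-6, rng=random):
    """Grow one culture from a single cell, doubling each generation.
    Each division can yield a resistant mutant (probability mu); a mutant
    founds a clone that keeps doubling until the culture is plated."""
    sensitive, resistant = 1, 0
    for _ in range(generations):
        mutants = poisson(sensitive * mu, rng)
        resistant = 2 * resistant + mutants
        sensitive = 2 * sensitive - mutants
    return resistant

rng = random.Random(42)
cultures = [spontaneous_culture(rng=rng) for _ in range(200)]

# If resistance instead arose only after plating (the "induced" hypothesis),
# counts across replicate cultures would be Poisson: variance ≈ mean.
induced = [poisson(mean(cultures), rng) for _ in range(200)]

fano_spont = pvariance(cultures) / mean(cultures)
fano_induced = pvariance(induced) / mean(induced)

# Spontaneous mutation gives rare "jackpot" cultures (an early mutation
# amplified by doubling), so the variance far exceeds the mean.
print(fano_spont, fano_induced)
```

The experimenters observed jackpot-style variation, which is how they concluded the resistance mutations pre-dated the antibiotic; as the text notes, this established spontaneity but did not rule out additional, adaptive mutation under non-lethal stress.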
More recent experiments have shown that mutations can indeed occur in direct response to an environmental challenge – and have aroused great controversy.3 It has been found that bacteria which are unable to digest lactose, if given no other food, will after a few days develop new mutants that are able to handle it, the mutation rate being many orders of magnitude faster than the ‘spontaneous’, ‘random’ rate. Two independent mutations were needed, giving an ‘accidental’ explanation a probability of less than 1 in 10^18. Adaptive mutations also appear to occur in yeast cells and possibly fruit flies.4 The existence of adaptive mutations is now widely accepted, though the term ‘directed mutations’ is sometimes shunned. Although some of the biochemical mechanisms involved have been identified, there is no real understanding of what lies behind the phenomenon.
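The 1-in-10^18 figure cited above follows from multiplying the probabilities of two independent events. A quick check, assuming a typical per-gene spontaneous rate of about 1e-9 per cell per generation (an assumption for illustration, not a number from the cited experiments):

```python
import math

# Assumption (not from the cited experiment): each required mutation
# arises spontaneously at about 1e-9 per cell per generation, a typical
# order of magnitude for a point mutation in a given bacterial gene.
p_single = 1e-9

# Two independent mutations: probabilities multiply.
p_double = p_single ** 2               # ≈ 1e-18, the figure cited above

# Cell-generations needed for a 50% chance of at least one double
# mutant under the 'accidental' hypothesis: solve 1 - (1 - p)^N = 0.5.
# math.log1p keeps precision, since 1 - 1e-18 rounds to 1.0 in floats.
n_half = math.log(0.5) / math.log1p(-p_double)
print(p_double, n_half)                # n_half ≈ 6.9e17
```

Under these assumed rates, roughly 10^17–10^18 cell-generations would be needed for even a coin-flip chance of one double mutant, which is why the observed few-day appearance of such mutants was taken as evidence of an adaptive, non-random process.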
According to Eshel Ben-Jacob and his colleagues, ‘a picture of problem-solving bacteria capable of adapting their genome to problems posed by the environment is emerging’; ‘It seems as if the bacterial colony can not only compute better than the best parallel computers we have, but can also think and even be creative.’5 As James Shapiro has said, even the ‘simplest’ form of life – tiny, ‘brainless’ bacteria – ‘display biochemical, structural and behavioral complexities that outstrip scientific description’.6
The rapidity with which pests, from rats to insects, acquire resistance to poisons is also hard to account for on the basis of conventional evolutionary theory. Some 500 species of insects and mites have been able to defeat at least one pesticide by genetic changes that either defensively alter the insect's physiology or produce special enzymes to attack and destroy the poison. Seventeen species have shown themselves capable of resisting all chemicals deployed against them. As Robert Wesson says, ‘If it is true that mutations are much more frequent where they are needed than when they are virtually certain to be harmful, they cannot be held to be random.’7 Shapiro states that ‘All careful studies of mutagenesis find statistically significant nonrandom patterns of change ...’8
Molecular biologist Lynn Caporale points out that mutations seem to occur preferentially in certain parts of the genome while other DNA sequences tend to be conserved – which shows, she says, that evolution is not purely a game of chance. Although she believes that genomes can ‘steer’ mutations to ‘hot spots’ where they are more likely to increase fitness, and that the genome may be ‘in some way intelligent’, she does not believe that the actual mutations themselves are nonrandom in the sense of being somehow engineered by the organism in question to bring about the changes it needs.9 This is a good example of how Darwinists sometimes dress up their dogmas in ‘sexy’ and even mystical-sounding language.
The randomness of mutations has been called into question since the 1970s by experiments demonstrating that cells subjected to non-lethal selection repeatedly come up with just the right ‘adaptive’ or ‘directed’ mutations in specific genes that enable the cells to grow and multiply. 5
8) http://www.nature.com.https.sci-hub.hk/articles/35011540
Last edited by Otangelo on Mon Jun 21, 2021 7:17 am; edited 9 times in total