ElShamah - Reason & Science: Defending ID and the Christian Worldview

Welcome to my library—a curated collection of research and original arguments exploring why I believe Christianity, creationism, and Intelligent Design offer the most compelling explanations for our origins. Otangelo Grasso



There was no prebiotic selection to get life originating


Otangelo


Admin


https://reasonandscience.catsboard.com/t3121-there-was-no-prebiotic-selection-to-get-life-originating

The four basic building blocks of life are DNA, RNA, proteins, and lipids. These molecules are essential for the function and replication of living cells, and their origin is a key question in the study of abiogenesis.

One challenge for naturalistic explanations of the origin of these molecules is the lack of a selection mechanism. In biological systems, natural selection acts on organisms with heritable traits, favoring those that are better adapted to their environment and promoting the propagation of their genes. However, in the early stages of life on Earth, there were no organisms with heritable traits to select for or against.

This presents a problem for explaining how the four basic building blocks of life could have arisen through purely naturalistic processes. Without a selection mechanism to favor certain molecules over others, it is difficult to see how the complex, information-rich molecules that we see in modern cells could have emerged from the chemical soup of early Earth.

Some researchers have proposed that other factors, such as self-organization or autocatalysis, may have played a role in the origin of life. However, these ideas are still largely speculative and face their own challenges and limitations.


Synonyms for selecting include: choosing, picking, handpicking, sorting out, discriminating, choosing something from among others, and giving preference to one thing over another.

Andrew H. Knoll: FUNDAMENTALS OF GEOBIOLOGY 2012
The emergence of natural selection
Molecular selection, the process by which a few key molecules earned key roles in life’s origins, proceeded on many fronts. (Comment: observe the unwarranted claim) Some molecules were inherently unstable or highly reactive and so they quickly disappeared from the scene. Other molecules easily dissolved in the oceans and so were effectively removed from contention. Still, other molecular species may have sequestered themselves by bonding strongly to surfaces of chemically unhelpful minerals or clumped together into tarry masses of little use to emerging biology. In every geochemical environment, each kind of organic molecule had its dependable sources and its inevitable sinks. For a time, perhaps for hundreds of millions of years, a kind of molecular equilibrium was maintained as the new supply of each species was balanced by its loss. Such equilibrium features nonstop competition among molecules, to be sure, but the system does not evolve. 6

Comment: That is the key sentence. There would have been no complexification into higher-order, machine-like structures; these molecules would either quickly disappear, dissolve in the ocean, or clump together into tarry masses of little use to emerging biology. The system does not evolve. In other words, chemical selection would never take place.

We know that we, as intelligent beings, make choices to reach desired outcomes all the time, and there is no known alternative to conscious, intelligent action for doing so. Therefore, it is logical, plausible, and probable that an intelligent creator was at work, choosing the parameters of the laws of physics, the right equations, the right adjustments in the universe, the right building blocks of life, and the right machinery to give life a first go. And he was remarkably good at that.

1. Life requires a limited set of complex biomolecules, a universal convention and unity, composed of the four basic building blocks of life (RNA and DNA, amino acids, phospholipids, and carbohydrates). They have a very specific, complex, functional composition, and cells make them through extremely sophisticated, orchestrated metabolic pathways that were not extant on the early earth. If abiogenesis were true, these biomolecules had to be prebiotically available and naturally occurring (formed by natural means, without enzyme catalysis), and then somehow join in an organized way to form the first living cells. They had to be available in large quantities and concentrated at one specific building site.
2. Making things for a specific purpose, for a distant goal, requires goal-directedness. And that is a big problem for naturalistic explanations of the origin of life. There was a potentially unlimited variety of molecules on the prebiotic earth. Competition and selection among them would never have occurred at all to separate the molecules used in life from those that are useless. Selection is far too limited in scope, and too powerless a mechanism, to explain all of the living order, the ability to maintain order even in the short term, or the emergence, overall organization, and long-term persistence of life from non-living precursors. It is an error of false conceptual reduction to suppose that competition and selection are thereby the source of explanation for all relevant forms of the living order.
3. We know that a) unguided, random, purposeless events are unlikely in the extreme to make specific, purposeful elementary components for building large, integrated macromolecular systems, and b) intelligence has goal-directedness. Bricks do not form from clay by themselves and then line up to make walls. Someone made them. Phospholipids do not form from glycerol, a phosphate group, and two fatty acid chains by themselves and line up to make cell membranes. Someone made them. That is God.


If a machine has to be made out of certain components, then the components have to be made first.

Molecules have nothing to gain by becoming the building blocks of life. They are "happy" to lie on the ground or float in the prebiotic ocean, and that's it. Being incredulous that they would concentrate at one building site, in the right mixture and in the right complex form, permitting them to complexify in an orderly manner and assemble into complex, highly efficient molecular machines and self-replicating cell factories, is not only justified but warranted and sound reasoning. That fact alone destroys materialism and naturalism. Being credulous towards such a scenario means sticking to blind belief. And claiming "we don't know (yet), but science is working on it, and the expectation is that the explanation will be a naturalistic one (no God required)" is a materialism-of-the-gaps argument.

A Few Experimental Suggestions Using Minerals to Obtain Peptides with a High Concentration of L-Amino Acids and Protein Amino Acids 10 December 2020
The prebiotic seas contained L- and D-amino acids, and non-Polar AAs and Polar AAs, and minerals could adsorb all these molecules. Besides amino acids, other molecules could be found in the primitive seas that competed for mineral adsorption sites. Here, we have a huge problem that could be a double-edged sword for prebiotic chemistry. On the one hand, this may lead to more complex prebiotic chemistry, due to the large variety of species, which could mean more possibilities for the formation of different and more complex molecules. On the other hand, this complex mixture of molecules may not lead to the formation of any important molecule or biopolymer in high concentration to be used for molecular evolution. Schwartz, in his article “Intractable mixtures and the origin of life”, has already addressed this problem, denominating this mixture the “gunk”. 5

Intractable Mixtures and the Origin of Life 2007
A problem which is familiar to organic chemists is the production of unwanted byproducts in synthetic reactions. For prebiotic chemistry, where the goal is often the simulation of conditions on the prebiotic Earth and the modeling of a spontaneous reaction, it is not surprising – but nevertheless frustrating – that the unwanted products may consume most of the starting material and lead to nothing more than an intractable mixture, or "gunk". The most well-known examples of the phenomenon can be summarized quickly: Although the Miller–Urey reaction produces an impressive set of amino acids and other biologically significant compounds, a large fraction of the starting material goes into a brown, tar-like residue that remains uncharacterized; i.e., gunk. While 15% of the carbon can be traced to specific organic molecules, the rest seems to be largely intractable.

Even if we focus only on the soluble products, we still have to deal with an extremely complex mixture of compounds. The carbonaceous chondrites, which represent an alternative source of starting material for prebiotic chemistry on Earth, and must have added enormous quantities of organic material to the Earth at the end of the Late Heavy Bombardment (LHB), do not offer a solution to the problem just referred to. The organic material present in carbonaceous meteorites is a mixture of such complexity that much ingenuity has gone into the design of suitable extraction methods, to isolate the most important classes of soluble (or solubilized) components for analysis.

Whatever the exact nature of an RNA precursor which may have become the first self-replicating molecule, how could the chemical homogeneity which seems necessary to permit this kind of mechanism to even come into existence have been achieved? What mechanism would have selected for the incorporation of only threose, or ribose, or any particular building block, into short oligomers which might later have undergone chemically selective oligomerization? Virtually all model prebiotic syntheses produce mixtures. 6

Life: What A Concept! https://jsomers.net/life.pdf
Craig Venter: To me the key thing about Darwinian evolution is selection. Biology is a hundred percent dependent on selection. No matter what we do in synthetic biology, synthetic genomes, we're doing selection. It's just not natural selection anymore. It's an intelligently designed selection, so it's a unique subset. But selection is always part of it.
My comment: 
What natural mechanisms lack is goal-directedness. And that is a big problem for naturalistic explanations of the origin of life. There was a potentially unlimited variety of molecules on the prebiotic earth. Why should competition and selection among them have occurred at all, to separate the molecules used in life from those that are useless? Selection is far too limited in scope, and too powerless a mechanism, to explain all of the living order, the ability to maintain order even in the short term, or the emergence, overall organization, and long-term persistence of life from non-living precursors. It is an error of false conceptual reduction to suppose that competition and selection are thereby the source of explanation for all relevant forms of order.

The problem of lack of a selection mechanism extends to the homochirality problem. 
A. G. CAIRNS-SMITH Seven clues to the origin of life, page 40:
It is one of the most singular features of the unity of biochemistry that this mere convention is universal. Where did such agreement come from? You see, non-biological processes do not as a rule show any bias one way or the other, and it has proved particularly difficult to see any realistic way in which any of the constituents of a 'prebiotic soup' would have had predominantly 'left-handed' or 'right-handed' molecules. It is thus particularly difficult to see this feature as having been imposed by initial conditions.

In regards to the prebiotic synthesis of the basic building blocks of life, I list below the problems directly related to the lack of a selection mechanism on the prebiotic earth. This is one of the unsolvable problems of abiogenesis.
Selecting the right materials is absolutely essential. But a prebiotic soup of mixtures of impure chemicals would never purify and select those that are required for life. Chemical and physical reactions have no "urge" to join, group, and start interacting in a purposeful, goal-oriented way to produce molecules that later on would perform specific functions and generate self-replicating factories, full of machines, directed by specified, complex assembly information. This is not an argument from ignorance, incredulity, or gaps of knowledge.

William Dembski: The problem is that nature has too many options and without design couldn't sort through all those options. The problem is that natural mechanisms are too unspecific to determine any particular outcome. Natural processes could theoretically form a protein, but they are also compatible with the formation of a plethora of other molecular assemblages, most of which have no biological significance. Nature allows them full freedom of arrangement. Yet it's precisely that freedom that makes nature unable to account for specified outcomes of small probability. Nature, in this case, rather than being intent on doing only one thing, is open to doing any number of things. Yet when one of those things is a highly improbable specified event, design becomes the more compelling, better inference. Occam's razor also boils down to an argument from ignorance: in the absence of better information, you use a heuristic to accept one hypothesis over the other.
http://www.discovery.org/a/1256
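Dembski's combinatorial point about nature having "too many options" can be put in rough numbers with a back-of-the-envelope count of sequence space. This is only an illustrative sketch; the 150-residue chain length is an assumed example value, not a figure from the text.

```python
import math

# Illustrative only: the number of possible sequences for a modest
# 150-residue chain drawn from the 20 proteinogenic amino acids.
# The length 150 is an arbitrary example, chosen for illustration.
n_residues = 150
alphabet = 20
sequence_space = alphabet ** n_residues

# log10 of the count gives the order of magnitude directly.
print(round(math.log10(sequence_space)))  # ~195, i.e. about 10^195 sequences
```

Any one specific sequence is a single point in that space, which is the sense in which an unguided draw is "open to doing any number of things".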

Out of the 27 listed problems of prebiotic RNA synthesis, 8 are directly related to the lack of a mechanism to select the right ingredients.
1. How would prebiotic processes have purified the starting molecules for RNA and DNA, which were grossly impure? They would have been present in complex mixtures that contained a great variety of reactive molecules.
2. How did fortuitous accidents select the five just-right nucleobases to make DNA and RNA: two purines and three pyrimidines?
3. How did unguided random events select purines with two rings formed from nine atoms (5 carbon and 4 nitrogen), amongst almost unlimited possible configurations?
4. How did stochastic coincidence select pyrimidines with one ring formed from six atoms (4 carbon and 2 nitrogen), amongst an unfathomable number of possible configurations?
5. How would these functional bases have been separated from the confusing jumble of similar molecules that would also have been made?
6. How could the five-carbon ribose sugar rings which form the RNA and DNA backbone have been selected, if rings of 6 or 4 carbons, or even more or fewer, are equally possible but non-functional?
7. How were the correct nitrogen atom of the base and the correct carbon atom of the sugar selected to be joined together?
8. How could right-handed configurations of RNA and DNA have been selected from a racemic pool of right- and left-handed molecules? Ribose must have been in its D form to adopt functional structures (the homochirality problem).
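The homochirality point can be put in rough numbers. Assuming an idealized 50/50 racemic pool in which each monomer's handedness is drawn independently (a simplifying assumption of mine for illustration, not a chemical model of real prebiotic kinetics), the chance that a chain comes out all one-handed is:

```python
from fractions import Fraction

def homochiral_probability(n):
    """Probability that a chain of n monomers drawn from an ideal
    50/50 racemic pool is entirely one-handed (all-D or all-L).
    Purely illustrative arithmetic under the independence assumption."""
    return 2 * Fraction(1, 2) ** n  # factor 2: either handedness counts

# A short 100-mer built from a racemic pool:
p = homochiral_probability(100)
print(float(p))  # on the order of 1.6e-30
```

Even for a chain of only 100 units the odds of accidental homochirality are vanishingly small, which is why the text treats the racemic pool as a selection problem rather than a statistical inevitability.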

Out of the 27 listed problems of prebiotic amino acid synthesis, the following are directly related to the lack of a mechanism to select the right ingredients.
1. How did unguided stochastic coincidence select the right amino acids amongst the over 500 that occur naturally on earth?
2. How were bifunctional monomers (molecules with two functional groups, so that they can combine with two others) selected, and unifunctional monomers (with only one functional group) sorted out?
3. How could achiral precursors of amino acids have produced, selected, and concentrated only left-handed amino acids? (The homochirality problem)
4. How did the transition occur from prebiotic enantiomer selection to the enzymatic reaction of transamination, which had to be extant when cellular self-replication and life began?
5. How would natural causes have selected twenty, and not more or fewer, amino acids to make proteins?
6. How did natural events have the foreknowledge that the selected amino acids are best suited to enable the formation of soluble structures with close-packed cores, allowing the presence of ordered binding pockets inside proteins?
7. How did nature "know" that the set of amino acids selected appears to be near-ideal and optimal?

Out of the 12 listed problems of prebiotic cell membrane synthesis, 2 are directly related to the lack of a mechanism to select the right ingredients.
1. How did prebiotic processes select hydrocarbon chains in the range of 14 to 18 carbons in length? There was no physical necessity to form carbon chains of the right length, nor any hindrance to joining chains of varying lengths, so chains of any size could have existed on the early earth.
2. How would random events start to produce biological membranes, which are not composed of pure phospholipids but are instead mixtures of several phospholipid species, often with a sterol admixture such as cholesterol? There is no feasible prebiotic mechanism to select and join the right mixtures.

Claim: Even if we take your unknowns as true unknowns or even unknowable, the answer is always going to be “We don’t know yet.”
Reply: Science hates confessing "we don't know". Science is about gaining knowledge and understanding; the scientist's mind is all about acquiring knowledge and diminishing ignorance. Confessing ignorance, when there is good reason for it, is fine. But claiming not to know, despite the evident facts at hand and the ability to come to informed, well-founded conclusions based on sound reasoning through known facts and evidence, is not only willful ignorance but plain foolishness. Especially when the issues under discussion relate to origins, worldviews, and eternal destiny. If there were hundreds of possible explanations, then claiming not to know which makes the most sense could be justified. In the quest of origins and God, there are just two possible explanations: either there is a God, or there is not. That's it. There is, however, a wealth of evidence in the natural world which can lead us to informed, well-justified conclusions. We know, for example, that nature has no "urge" to select things and complexify; its natural course is to act according to the laws of thermodynamics, and molecules disintegrate. That is their normal course of action: to become less complex. Systems, given energy and left to themselves, DEVOLVE to give uselessly complex mixtures, "asphalts". The literature reports (to our knowledge) exactly ZERO CONFIRMED OBSERVATIONS where evolution emerged spontaneously from a devolving chemical system. It is IMPOSSIBLE for any non-living chemical system to escape devolution to enter into the Darwinian world of the "living". Such statements of impossibility apply even to macromolecules. Both monomers and polymers can undergo a variety of decomposition reactions that must be taken into account, because biologically relevant molecules would undergo similar decomposition processes in the prebiotic environment.

CAIRNS-SMITH genetic takeover, page 70
Suppose that by chance some particular coacervate droplet in a primordial ocean happened to have a set of catalysts, etc. that could convert carbon dioxide into D-glucose. Would this have been a major step forward towards life? Probably not. Sooner or later the droplet would have sunk to the bottom of the ocean and never have been heard of again. It would not have mattered how ingenious or life-like some early system was; if it lacked the ability to pass on to offspring the secret of its success then it might as well never have existed. So I do not see life as emerging as a matter of course from the general evolution of the cosmos, via chemical evolution, in one grand gradual process of complexification. Instead, following Muller (1929) and others, I would take a genetic view and see the origin of life as hinging on a rather precise technical puzzle. What would have been the easiest way that hereditary machinery could have formed on the primitive Earth?

Claim: That's called the Sherlock fallacy. It's a false dichotomy.
Reply: No. Life is due either to chance or to design. There are no other options.

Eugene Koonin, Senior Investigator at the National Center for Biotechnology Information and a recognized expert in the field of evolutionary and computational biology, is one of the few biologists honest enough to recognize that abiogenesis research has failed. He wrote in his book The Logic of Chance, page 351:
"Despite many interesting results to its credit, when judged by the straightforward criterion of reaching (or even approaching) the ultimate goal, the origin of life field is a failure—we still do not have even a plausible coherent model, let alone a validated scenario, for the emergence of life on Earth. Certainly, this is due not to a lack of experimental and theoretical effort, but to the extraordinary intrinsic difficulty and complexity of the problem. A succession of exceedingly unlikely steps is essential for the origin of life, from the synthesis and accumulation of nucleotides to the origin of translation; through the multiplication of probabilities, these make the final outcome seem almost like a miracle."

Eliminative inductions argue for the truth of a proposition by demonstrating that competitors to that proposition are false. Either the origin of the basic building blocks of life and of self-replicating cells is the result of a creative act by an intelligent designer, or it is the result of unguided random chemical reactions on the early earth. Science, rather than coming closer to demonstrating how life could have started, has not advanced and is further than ever from generating living cells starting with small molecules. Therefore, most likely, cells were created by an intelligent designer.

I have listed 27 open questions in regard to the origin of RNA and DNA on the early earth, 27 unsolved problems in regard to the origin of amino acids on the early earth, 12 in regard to phospholipid synthesis, and also unsolved problems in regard to carbohydrate production. The open problems are in reality far more numerous; this is just a small list. It is not just an issue of things that have not yet been figured out by abiogenesis research, but of deep conceptual problems, like the fact that there were no natural selection mechanisms in place on the early earth.


https://reasonandscience.catsboard.com/t1279p75-abiogenesis-is-mathematically-impossible#7759
1. https://onlinelibrary.wiley.com/doi/abs/10.1002/cbdv.200790167
2. https://sci-hub.ren/10.1007/978-1-4939-1468-5_27
3. https://arxiv.org/ftp/arxiv/papers/1511/1511.00698.pdf
4. https://link.springer.com/book/10.1007%2Fb136268
5. https://www.mdpi.com/2073-8994/12/12/2046
6. Andrew H. Knoll: Fundamentals of Geobiology, 2012




https://reasonandscience.catsboard.com


The BIG WHITE ELEPHANT in the room of naturalism (that nobody is talking about):

https://reasonandscience.catsboard.com/t3121-there-was-no-prebiotic-selection-to-get-life-originating#9062

A selection process is needed to select:

1. The right expansion rate of the Big Bang,
2. The fundamental forces of the universe and the right coupling constants,
3. The right-sized quarks, in the right composition, to have the right protons, neutrons, and electrons, and thus stable atoms,
4. The hundreds of finely tuned parameters required to have a life-permitting universe, in particular the earth, the electromagnetic spectrum, and water,
5. The right nucleobase geometries to have Watson–Crick base-pairing and stable DNA,
6. All the functional parts for the first living cell (the right genome, proteome, metabolome, and interactome), before you get to the first free-living, self-replicating cell, where SUPPOSEDLY evolution and natural selection could theoretically kick in and start doing their business.
The problem is: mindless nature has no goals, no purpose, no will, no foresight, no intention to bring forward life. The lack of any viable, plausible, probable alternative selection mechanism to an intelligent mind with intentions and will means checkmate for naturalism.

Objection: That's an argument from ignorance.
Reply: Eliminative inductions argue for the truth of a proposition by arguing that competitors to that proposition are false. Provided the proposition, together with its competitors, forms a mutually exclusive and exhaustive class, eliminating all the competitors entails that the proposition is true. Since either there is a God or there is not, one or the other is true. As Sherlock Holmes's famous dictum has it: when you have eliminated the impossible, whatever remains, however not fully comprehensible, but logically possible, must be the truth. Eliminative inductions, in fact, become deductions.
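As a toy formalization (my own illustrative sketch, not from the source), the logical form of an eliminative induction over a mutually exclusive, jointly exhaustive set reduces to a disjunctive syllogism:

```python
def eliminative_induction(hypotheses, eliminated):
    """Given a mutually exclusive, jointly exhaustive set of
    hypotheses, eliminating all but one entails the remainder.
    Returns the surviving hypothesis, or None if elimination is
    not decisive. This formalizes the argument form only; it says
    nothing about whether the elimination premises are true."""
    survivors = [h for h in hypotheses if h not in eliminated]
    return survivors[0] if len(survivors) == 1 else None

# The dichotomy as stated in the text is exhaustive by
# construction, since it has the form "H or not-H":
print(eliminative_induction({"design", "not design"}, {"not design"}))
```

The deductive step is valid only when exhaustiveness and mutual exclusivity actually hold, which is exactly the condition the reply states.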

Objection: Intelligent design or chance is a false dichotomy.
Reply: Right. The correct dichotomy would be intelligent design or no intelligent design. We cannot invoke a third alternative, physical necessity. There was no necessity for nothing to become something, nor for the variegated, finely adjusted physical parameters to be what they are. In fact:

Paul Davies said in God and Design: The Teleological Argument and Modern Science page 148–49, 2003
“There is not a shred of evidence that the Universe is logically necessary. Indeed, as a theoretical physicist I find it rather easy to imagine alternative universes that are logically consistent, and therefore equal contenders of reality” 
https://3lib.net/book/733035/b853a0

Chance, on the other hand, is not an alternative causal agent to intelligent design, but a measure of indeterminism, of haphazard and unpredictable movement. The only alternative to God as the factor responsible for everything physical is NOTHING. In short, it's either God or no God. How can nothing cause something into being? Of course, it can't. Atheists, in my experience, will attempt to dispute this, but philosophical nothingness is the absence of anything and therefore cannot do anything.

Strong atheists are left with a BIG WHITE ELEPHANT in the room. What I have outlined above is obvious, and everyone knows it. Nothing has no causal power and does not select things for a purposeful outcome. This is an uncomfortable, embarrassing truth that leaves every holder of a naturalistic worldview without an answer. As the dictum says: "The emperor has no clothes".




Does cosmological natural selection explain the fine-tuning of the universe?


https://reasonandscience.catsboard.com/t3121-there-was-no-prebiotic-selection-to-get-life-originating#9215

Bigthink: Cosmological Natural Selection and the Principle of Precedence MAY 1, 2013
https://bigthink.com/articles/cosmological-natural-selection/

Lee Smolin: Cosmological Natural Selection (YouTube Geek Week!) | Big Think
https://www.youtube.com/watch?v=mbYLTqvo774

Physics is about discovering what the laws of nature are. And we've gone some distance towards that. We're not done but we've gone some good distance towards that at the present time. But once you know what the laws of nature are, another kind of question unfolds itself which is why are those the laws and not other laws. For example, the laws that we understand -- the standard model of particle physics describes all the fundamental particles and their interactions -- has about 30 numbers which you just have to put in as the result of measuring them by experiment. The masses of the different particles, the quarks, the electrons, the neutrinos, the strengths of the fundamental force -- various numbers like that. And the model works dramatically well as the recent experiments the Large Hadron Collider show. Why are those numbers what they are in our universe? Why is the mass of the electron what it is and not 12 times larger or half the size? There are dozens of questions like this. So I developed cosmological natural selection to try to give an evolutionary account of this so that there would be a history back before the Big Bang in which these numbers could change and evolve through a series of events like the Big Bang. 

My comment: This looks like the desperate attempt of a naturalist to find an alternative explanation to God, an intelligent, powerful creator with intent and foresight, willing to create a life-permitting universe for his own purposes. Smolin comes up with an entirely unsupported hypothesis, with a complete lack of evidence, and invents an alternative that explains God away, basically borrowing the mechanism that supposedly explains the origin of species in biology and applying the same principle to cosmology. In biology, the "why" question is answered with: survival of the fittest. But the same answer cannot be given when it comes to the cosmos. Without selecting the right parameters, constants, expansion rate, and forces, there would be no universe at all. And without existing yet, the universe could not select anything in order to continue to exist.

And there could be an explanation akin to natural selection. Just like you want to know why do people have two legs and not three legs or five legs or four legs or six legs. There's an evolutionary reason for that. A certain kind of fitness has been improved over many, many generations and similarly, there could be a notion of fitness of the laws of nature through approval of many generations.

My comment: Approval? Only agencies with goals approve certain states of affairs or things to be. Smolin tries to anthropomorphize the initial state of affairs beyond the universe as intending to generate a specific result, when there could be no intent to approve anything.


And cosmological natural selection was an example of the theory of that kind. I realized that the only methodology we had in science, or the best methodology we had in science for explaining how choices have been made in the system to all lead to a lot of structure because one of the mysteries is why our universe is so structured as it is on so many scales from organic molecules and biomolecules up to vast arrays of clusters of galaxies. There's enormous structure on such a wide range of scales. And that turns out to be tied to the values of these constants of the standard model of particle physics. And so why is that? And I realized that the only methodology that was really successful for explaining how choices were made in nature such as to lead to an improbable amount of structure is natural selection. So for natural selection we need reproduction. And there was a hypothesis lying around that universes reproduce through Black Holes, that inside Black Holes rather than there being singularities where time ends, there were basically the births of new regions of space and time which could become new universes. 

My comment: Perhaps Smolin would be so kind as to address where the energy came from to perform all this work.....

Lawrence Rifkin The Logic and Beauty of Cosmological Natural Selection  June 10, 2014

Here is the mind-blowing hypothesis that he first outlined in 1992 in his book The Life of the Cosmos. Throughout the universe, stars that collapse into black holes squeeze down to an unimaginably extreme density. Under those extreme conditions, as a result of quantum phenomenon, the black hole explodes in a big bang and expands into its own new baby universe, separate from the original. The point where time ends inside a black hole is where time begins in the big bang of a new universe. Smolin proposes that the extreme conditions inside a collapsed black hole result in small random variations of the fundamental physical forces and parameters in the baby universe.
https://blogs.scientificamerican.com/guest-blog/the-logic-and-beauty-of-cosmological-natural-selection/

My comment: Smolin refutes himself when he writes:

Lee Smolin, The life of the Cosmos, page 53:
Perhaps before going further we should ask just how probable is it that a universe created by randomly choosing the parameters will contain stars. Given what we have already said, it is simple to estimate this probability. For those readers who are interested, the arithmetic is in the notes. The answer, in round numbers, comes to about one chance in 10^229. To illustrate how truly ridiculous this number is, we might note that the part of the universe we can see from earth contains about 10^22 stars which together contain about 10^80 protons and neutrons. These numbers are gigantic, but they are infinitesimal compared to 10^229. In my opinion, a probability this tiny is not something we can let go unexplained. Luck will certainly not do here; we need some rational explanation of how something this unlikely turned out to be the case.

So in order to have black holes as the result of collapsing stars, one first needs fine-tuning to get stars that can collapse. And so, like the stars, Smolin's idea, based on a fertile imagination, collapses as well.
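As a side note, Smolin's own figures can be set side by side with simple integer arithmetic. This is only a toy scale comparison, using nothing beyond the numbers quoted above:

```python
# Figures quoted from Smolin, The Life of the Cosmos, p. 53:
odds_against_stars = 10 ** 229  # chance of a star-bearing universe: 1 in 10^229
stars_visible      = 10 ** 22   # stars in the observable universe
nucleons_visible   = 10 ** 80   # protons and neutrons in those stars

# Even granting one "trial" per nucleon in the observable universe,
# the odds still fall short by 149 orders of magnitude:
assert odds_against_stars // nucleons_visible == 10 ** 149
```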

https://reasonandscience.catsboard.com

Otangelo


Admin

S. M. Marshall (2021): The search for alien life is hard because we do not know what signatures are unique to life. We show why complex molecules found in high abundance are universal biosignatures and demonstrate the first intrinsic experimentally tractable measure of molecular complexity, called the molecular assembly index (MA). To do this we calculate the complexity of several million molecules and validate that their complexity can be experimentally determined by mass spectrometry. This approach allows us to identify molecular biosignatures from a set of diverse samples from around the world, outer space, and the laboratory, demonstrating it is possible to build a life detection experiment based on MA that could be deployed to extraterrestrial locations, and used as a complexity scale to quantify constraints needed to direct prebiotically plausible processes in the laboratory. Such an approach is vital for finding life elsewhere in the universe or creating de-novo life in the lab.

The search for evidence of life elsewhere in the universe relies on data collected from probes in our solar system, or astronomical observations. Knowing what signatures can be assigned to living systems is difficult as alien life has never been seen before. A solution would be to identify a feature exclusively associated with all life and develop a detection system for that feature. Here we postulate that living systems can be distinguished from non-living systems as they produce complex molecules in abundance which cannot form randomly so what is needed is a way to quantify this complexity. The need for a new technique is important for our understanding of life in the universe. The recent discoveries of the ubiquity of exo-planets of which there are over four thousand reported, raises the prospect that we will be able to observe planets that harbor life within a few decades. However, we don’t yet have a scientifically rigorous way to distinguish planets that host life from those that do not, even in our solar system. The design of a remote, unambiguous, and practical extraterrestrial life detection experiment is difficult because we have no idea what alien biochemistry is possible beyond Earth. This difficulty stems from the fact we have no universal definition of life because we only have the example of terrestrial life, and all known life forms are thought to have descended from a common ancestor. Many operational definitions of life emphasize the role of Darwinian evolution in living systems, but it is not clear how this criterion can be translated into an unambiguous remote experiment.

It is possible to distinguish between living and non-living systems on Earth due to processes such as photosynthesis, carbon and nitrogen fixation, replication, chiral enrichment, and morphogenesis. The artefacts of these processes have been proposed as possible biosignatures. There are proposals to search for such artefacts in minerals, and via isotopic and atmospheric analysis. The problem with looking for such processes in a universal manner is the lack of a rigorous definition outside the context of known terrestrial biochemistry, and therefore these cannot be deployed to design experiments. This has led to several ambiguous results from ‘metabolic’ experiments done by the Viking Lander on Mars, and investigations of potential meteorite ‘microfossils’. The results of these experiments were ambiguous because they could not be understood in a quantitative theoretical framework, and therefore the interpretation depended on chemical unknowns. In the case of the Viking Lander’s ‘metabolic’ experiments, the properties of Martian soil were unknown, making it difficult to determine whether the observed responses were purely abiotic in nature, or driven by biological processes. In the case of the supposedly biogenic magnetite crystals in the ALH 84001 meteorite, the criteria used to demarcate biogenic activity from abiogenic activity was not a quantitative measure, meaning the interpretation was always going to be ambiguous.

To circumvent these difficulties, we hypothesized that the very complex molecules made in any biochemical system could be distinguished from those produced in a non-biochemical system. This is because living systems appear to be uniquely able to produce a large abundance of complex molecules that are inaccessible to abiotic systems, where the number of small organic molecules, allowed by the rules of chemistry, is practically infinite. For instance, more than 10^60 ‘drug-like’ organic molecules with a molecular weight below 500 Daltons are estimated to be possible. Making any of these possible molecules from precursors requires constraints that may be naturally occurring, such as kinetic and thermodynamic properties of a reaction pathway, or are synthetically imposed.

Comment: Here, the authors could have pointed out that the lack of natural selection makes it practically impossible to select and accumulate the large number of chiral molecules that could/would be used in life, besides the fact that prebiotic mixtures usually decompose into asphalt. Ribose, for example, has a very short half-life, at certain temperatures only days, before it decomposes entirely.

Biochemical systems appear to be able to generate almost infinite complexity because they have information decoding and encoding processes that drive networks of complex reactions to impose the numerous, highly specific constraints needed to ensure reliable synthesis. For example, the natural product Taxol, is an example of a molecule that could be a biosignature–this is because it is so complicated, that the probability of its formation abiotically in any detectable abundance (>10,000 identical copies) would be very small. One reason for this is that there are at least more than 10^23 different molecules possible with the same formula as Taxol, C47H51NO14 (molecular weight of 853.9), and this analysis excludes the fact that Taxol incorporates 11 chiral centers which means it has 2^11, or 2048, possible diastereomers. The selection of one such possibility out of the combinatorically large number of possibilities is a process that requires information. In the absence of such information encoding and decoding processes, relatively few constraints can be placed on a chemical system–only those that are encoded in the laws of physics and the local environment–which cannot provide the specific set of biases needed to produce complex molecules such as Taxol in abundance.
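The combinatorics in the passage can be checked directly. This toy calculation uses only the numbers quoted above (the 10^23 figure is the paper's lower bound for constitutional isomers of C47H51NO14):

```python
stereoisomers = 2 ** 11             # 11 chiral centers -> 2^11 diastereomers
assert stereoisomers == 2048

constitutional_isomers = 10 ** 23   # lower bound quoted in the paper
candidates = constitutional_isomers * stereoisomers

# An unbiased random draw from this space would hit exactly Taxol with
# odds of about 1 in 2 x 10^26, which is why abundant Taxol points to
# information-driven constraints rather than chance chemistry.
assert candidates == 2048 * 10 ** 23
```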

The concept of molecular complexity has been extensively explored theoretically, with many metrics devised based upon structural, topological, or graph theoretical complexity. However, all these metrics have different algorithms, and none have an experimental measure. To address this problem, we have devised a theory of molecular complexity that is experimentally verifiable. By mapping the complexity of molecular space it is possible to place molecules on a scale of complexity from those able to form abiotically to those so complex they require a vast amount of encoded information to produce their structures, which means that their abiotic formation is very unlikely. This mapping can be experimentally verified by building a model that correlates the theoretical complexities with spectroscopic data. By applying this model to a range of standard molecules as well as analog samples from the laboratory, terrestrial, and marine environmental samples, and an extraterrestrial sample, we show it is possible to unambiguously distinguish samples that contain molecules produced by life from those that do not.

Defining molecular assembly
Our approach to life detection is based on the molecular assembly number (MA) which is derived from the theory of assembly pathways. Assembly pathways are sequences of joining operations, that start with basic building blocks (in this case bonds) and end with a final product. In these sequences, sub-units generated within the sequence can combine with other basic or compound sub-units later in the sequence, to recursively generate larger structures (see Fig. 1A). 

Figure 1
a In analyzing the assembly pathways of an object, we start with its basic building blocks, which are the shared set of objects that can construct our target object and any other object within the class of objects. The Assembly index of an object is defined as the smallest number of joining operations required to create the object using this model. 
b We can model the assembly process as a random walk on weighted trees where the number of outgoing edges (leaves) grows as a function of the depth of the tree, due to the addition of previously made sub-structures. By generating several million trees and calculating the likelihood of the most likely path through the tree, we can estimate the likelihood of an object forming by chance as a function of the number of joining operations required (path length). 
c The probability of the most likely path through the tree as a function of the path length decreases rapidly. The colors indicate different assumptions about the chemical space. For comparison, the dashed lines indicate the ratio of (I) one star in the entire milky way, 1:10^11, (II) one gram out of all of Earth’s biomass, 1:10^17, (III) one in a mole, 1:10^23, and (IV) one gram out of Earth’s mass (1:10^29). Note on this plot the path probability of the formation of Taxol would vary between 1:10^35 to 1:10^60 with a path length of 30 and the amount of chemical predisposition is varied with alpha biasing the effective selectivity between 50–99.9% at each step respectively.

Assembly pathways have been formalized mathematically using directed multigraphs (graphs where multiple edges are permitted between two vertices) with objects as vertices and objects as edge labels, however for the results here the formal details are unnecessary. Generating many assembly pathways from a pool of basic units will result in a combinatorial explosion in the diversity of structures. The molecular assembly number (MA) therefore captures the specificity of the molecule in the face of this combinatorial explosion and provides an agnostic measure of the likelihood for any molecular structure to be produced more than once. There will normally be multiple assembly pathways to create a given molecule. The MA of an object is the length of the shortest of those pathways, i.e. the smallest number of joining operations required to construct the object, where objects created in the process can subsequently be reused. Thus, it is a simple integer metric to indicate the number of steps required in this idealized case to construct the molecule.
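The definition above (smallest number of joining operations, with reuse of intermediates) can be illustrated with a toy string analogue. This is only a sketch of the concept, not the paper's algorithm: single characters stand in for bonds, and concatenation stands in for a joining operation.

```python
def assembly_index(target):
    """Toy string analogue of the molecular assembly index (MA): the
    smallest number of joining (concatenation) operations needed to
    build `target` from its single characters, where every intermediate
    product remains available for reuse in later joins."""

    def can_build(pool, steps, limit):
        if target in pool:
            return True
        if steps == limit:
            return False
        for a in pool:
            for b in pool:
                new = a + b
                # prune: only intermediates that are substrings of the
                # target can lie on a shortest assembly path
                if new in target and new not in pool:
                    if can_build(pool | {new}, steps + 1, limit):
                        return True
        return False

    # iterative deepening: the first depth limit that works is the MA
    for limit in range(len(target)):
        if can_build(frozenset(target), 0, limit):
            return limit

assert assembly_index("AAAA") == 2  # A+A -> AA, then AA+AA -> AAAA
assert assembly_index("ABAB") == 2  # A+B -> AB, then AB+AB -> ABAB
assert assembly_index("ABCD") == 3  # no repeated substructure to reuse
```

Note how reuse of previously built sub-units is what makes "AAAA" cheaper than "ABCD". On real molecular graphs the exact search is much harder, which is why the paper later resorts to a "split-branch" approximation that returns an upper bound.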

The MA of a molecule represents a best-case construction pathway by accounting for valence rules but no other limits from chemistry, including no consideration of reaction conditions. Importantly, a molecule with a relatively low MA could be difficult to synthesize in the laboratory, and hence as a tool for life detection, MA is susceptible to false negatives. Critically, life detection based on MA measurements can be made robust against false positives, as the number of synthetic steps required to create the molecule is not likely to be lower than the steps within the assembly model. Our central thesis is that molecules with high MA are very unlikely to form abiotically, and the probability of abiotic formation goes down as MA increases, and hence experimental determination of MA is a good candidate for a life detection system. If our hypothesis is correct, then life detection experiments based on MA can indicate the presence of living systems, irrespective of their elemental composition, assuming those living systems are based on molecules.

Bounding the MA probabilistically
To help determine how the probability of the spontaneous formation of detectable amounts of any given molecule changes with MA, we developed a computational model for the assembly of molecular graphs as unique steps on a path, where the length of this path represents the MA of a given molecule. The formation of molecules through the assembly process is then modeled using random walks on weighted trees, for full details see SI. Briefly, in this model the root of the tree corresponds to bonds available, while the nodes correspond to possible combinations of those bonds. Each node in the tree corresponds to molecules that could be synthesized from the available bonds, while outgoing edges represent joining operations transforming one molecule into another with the addition of more bonds. The shortest number of steps on a path from the base of the tree to the end of the branch corresponds to the MA of that compound, see Fig. 1B. To map the probability of the formation of any given molecule as a function of MA, we generated 3 million trees with different properties and determined the highest probability of formation of molecules as a function of MA. The probabilities we calculated represent the likelihood of an unconstrained or random assembly process generating that specific compound, given that the abiotic precursors are so abundant that they do not limit the formation. These probabilities do not represent the absolute probability of a molecule ever forming, rather they represent the chance of the molecule forming in the unconstrained, undirected process described here in any detectable abundance.

Thermodynamic and kinetic considerations suggest that the relative rates of reactions vary by orders of magnitude, and we implemented this in our model by assigning edge weights (and therefore relative abiotic likelihoods of those reactions) that also span multiple orders of magnitude. The number of possible products for each node in the trees grows as a function of the depth and hence MA of the node. By modeling the rate of growth using a function of the form |k| ∝ l^α, where |k| is the number of possible molecules, l is the depth of the node (l is equal to MA for a given molecule) and α is a parameter that controls how quickly the number of joining operations grows with the depth of the tree. For the combination of any two molecules, the number of possible products formed from their combination grows at least linearly with the size of the compounds, since the bigger compounds have more atoms between which bonds can form. This means the number of ways to produce products in an assembly path explodes as the MA increases since the paths recursively utilize previous steps. To capture this, we evaluated the model with values of α between two and three, where two indicates the most conservative quadratic growth rate, and three represents a limiting case where both factors grow super-linearly. Under these conditions, molecules with an MA of between 15 and 20 would have a chance formation of one molecule in 10^23, or one molecule in a mole of substance respectively. 

Comment:  10^23 is the number of stars in the universe, therefore getting the life-permitting molecules by random shuffling is in the realm of the impossible.
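The quoted numbers can be reproduced under the simplest unweighted reading of the model (an assumption on my part, since the paper also uses edge weights): if the number of possible products at depth l grows as l^α and an unconstrained process picks one product uniformly at each step, the odds against one specific path of length L are (L!)^α.

```python
def odds_against_path(path_length, alpha=2):
    """Odds (1 in N) against an unconstrained assembly process following
    one specific path, assuming the number of possible products at depth
    l grows as l**alpha and each joining step picks uniformly at random.
    Toy, unweighted reading of the model in the quoted paper."""
    odds = 1
    for depth in range(1, path_length + 1):
        odds *= depth ** alpha   # number of choices available at this depth
    return odds                  # equals factorial(path_length) ** alpha

# With the conservative quadratic growth (alpha = 2), the one-in-a-mole
# threshold (1 in 10^23) is crossed between path lengths 14 and 15:
assert odds_against_path(14) < 10 ** 23 < odds_against_path(15)
```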

We consider the one molecule in a mole threshold significant since it would be much lower than the detection limit for any known analytical technique. Importantly, we’ve not assumed any specific details about the availability of resources or the stability of specific bonds. 

Comment: This is another problem on top of finding and accumulating the basic materials, then selecting the ones used in a specific macromolecule and attaching them at the right place. Only then does the quest for a mechanism to glue/attach/bond one part to the other become an issue. These are all cumulative problems.

This means that these results are agnostic because they do not depend on the particulars of the chemistry, only on the size and accessibility of the chemical space. Our model shows that MA tracks the specificity of a path through the combinatorically vast chemical space, and this supports our thesis that high MA molecules cannot form in detectable abundance through random and unconstrained processes, implying that the existence of high MA molecules depends on additional constraints imposed on the process.

Measuring MA in chemical space
In order to explore how MA is distributed in real molecules, we needed a way to compute the MA of a molecule, and thus we have devised an approach that uses bonds as the basic units, which simplifies our computation (Fig. 2, see Supplementary Information Section 2 for details). In using bonds as the basic unit, we describe structures as bonds connected by atoms, where two bonds are joined by superimposing an atom from each. Computing an assembly pathway of a molecule can be done simply by decomposing the object into fragments and reconstructing it, however, identifying the shortest pathway is computationally challenging.

Figure 2
a Schematic of assembly paths for four example molecules (hydrogens and charges omitted for clarity). 
b The computed MA of molecules from the Reaxys database shown by molecular weight. The color scale indicates the frequency, with increasing frequency from dark purple (0.0) to green and yellow (1.0) of molecules in a given molecular weight range with a given MA. 2.5 million MA were calculated, in the figure shown here that data has been subsampled to control for bias, see SI. The overlaid plot with the white labels shows how the MA varies for some compound types where some natural products, pharmaceuticals, and metabolites have a wide range of values. Note that the range of MA for the amino acids is limited. The molecular masses are binned in 50 Dalton sections. 
c Example organic molecular structures and the corresponding MA values calculated.

We, therefore, developed an algorithm that calculates the “split-branch” variant of MA, implementing an efficient search for assembly pathways by recursively finding duplicated structures within the molecule. In this algorithm, the pathway found may not be the shortest, and so the result is an upper bound for the MA. However, the value calculated provides a robust estimate for the MA of the molecules in the work presented here. The algorithmic implementation of MA was used to characterize chemical space as represented by the Reaxys® database. We retrieved 25 million compounds with molecular mass up to 1000 Daltons from the database and the MA was calculated for a subset of 2.5 million unique structures over a molecular mass range up to 800 Daltons, see Fig. 2b, which shows how the MA of molecules in Reaxys vary with molecular weight. These results show that for small molecules (mass < ~250 Daltons) the MA is strongly constrained by their mass. This is understandable because small molecules have limited compositional diversity and few structural asymmetries. The MA of molecules with a mass greater than ~250 Daltons appear to be less determined by molecular weight, indicating that they can display vastly more compositional and structural heterogeneity. This is significant because it gives us insight into how to develop an experimental measure of MA based on tandem mass spectrometry by focusing on fragmenting molecules that have a mass greater than 250 Daltons.

Measuring MA in real molecules
Having established a method to calculate MA and explored how it varies in known chemical space, we next developed an analytical method to correlate experimental data to MA directly. Since MA is closely related to the structural heterogeneity of molecules, we developed a method based on tandem mass spectrometry (MS/MS). That approach allowed us to resolve distinct fragmentation patterns between high and low MA molecules. Tandem MS provides advantages in terms of life detection experiments because it generates separate signals for different ions in real complex mixtures. This separation is critical since MA is a measure based on individual molecules. Our central hypothesis is that high MA molecules will generate MS2 spectra with many distinct peaks, and that lower MA molecules would generate proportionally fewer since they tend to have fewer bonds and more symmetry, see Fig. 3. To test this hypothesis, we collected MS2 spectra for >100 small molecules and peptides for which we had calculated MA.

Life detection using MA measurements
We collected MS2 spectra from a wide variety of mixtures, including prebiotic soups, biological, abiotic, inorganic, dead, and blinded samples. We used Data Dependent Acquisition (DDA) to acquire MS2 data from the most intense ions in the mixture, allowing these to be directly compared to the single compound samples measured previously. The biological samples were explicitly produced by living systems such as E. coli lysates, yeast cultures, urinary peptides, a seawater sample, a highly complex natural product (Taxol) as well as fermented beverages and distillates (home-brewed beer and Scottish Whisky). Abiotic samples were produced in a controlled synthesis without enzymes or other biological influence (besides the chemists that prepared them) and included dipeptides, and Miller–Urey spark discharge mixtures. We also investigated the incredibly messy sugar-based ‘formose’ reaction mixtures with and without mineral salts added. Inorganic samples included extracts from terrestrial mineral sources such as quartz, limestone, sandstone, and granite. Dead samples, which were taken from terrestrial sources that have been influenced by biological processes but are not alive, included coal, and yeast burned at 200 °C and 400 °C. Finally, the blinded samples were a collection of samples whose origin was unknown at the time of analysis. These included a CM2 carbonaceous chondrite sample (Murchison meteorite), bay sediment, and biological material from two different geological epochs, the Holocene (~30,000 years old) and the Mid-Miocene (~14 Ma). In addition, we analyzed a sample of the bacteria Aeromonas veronii collected via an online repository; it was analyzed with a different analytical platform but the results confirm our analysis, see Supplementary Information Section 6 for details. All samples were extracted in a mixture of water and methanol.

These results demonstrate that we can identify the living systems by looking for mixtures with an MA greater than a certain threshold. In the case of the analysis presented here it appears that only living samples produced MA measurement above ~15. Importantly this measurement does not imply that samples with a maximum MA below 15 are non-living, on the contrary several samples made or altered by living systems failed to generate MA above this threshold such as the Bay Sediment and some of the Scottish whisky. These represent false negatives, and they illustrate an important feature of MA-based life detection protocols: not all molecules produced by living processes have high MA–indeed complex mammals regularly produce CO2–but all high MA molecules are produced by living (or technological) processes.  
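The decision rule in the quoted passage can be sketched in a few lines. This is only an illustration: the per-molecule MA values below are hypothetical, and only the ~15 threshold comes from the study.

```python
def classify_sample(ma_values, threshold=15):
    """Life-detection rule from the quoted study, sketched as code:
    a sample is flagged as biological only if at least one of its
    molecules exceeds the MA threshold (~15 in the paper's data).
    Samples below the threshold are inconclusive rather than
    non-living, since biological samples can give false negatives."""
    return "biological" if max(ma_values) > threshold else "inconclusive"

# Hypothetical per-molecule MA values, for illustration only:
assert classify_sample([3, 8, 21]) == "biological"    # one molecule above 15
assert classify_sample([4, 9, 12]) == "inconclusive"  # e.g. the bay sediment case
```

The asymmetry is the point of the protocol: false negatives (whisky, bay sediment) are tolerated, while false positives are claimed to be excluded because no abiotic process reached the threshold.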


Stuart M. Marshall: Identifying molecules as biosignatures with assembly theory and mass spectrometry 24 May 2021

https://reasonandscience.catsboard.com

Otangelo


Admin

Identifying molecules as biosignatures with assembly theory and mass spectrometry

https://reasonandscience.catsboard.com/t3121-there-was-no-prebiotic-selection-to-get-life-originating#9587


Andrew H. Knoll: Fundamentals of Geobiology (2012)
The emergence of natural selection
Molecular selection, the process by which a few key molecules earned key roles in life’s origins, proceeded on many fronts. (Comment: observe the unwarranted claim) Some molecules were inherently unstable or highly reactive and so they quickly disappeared from the scene. Other molecules easily dissolved in the oceans and so were effectively removed from contention. Still, other molecular species may have sequestered themselves by bonding strongly to surfaces of chemically unhelpful minerals or clumped together into tarry masses of little use to emerging biology. In every geochemical environment, each kind of organic molecule had its dependable sources and its inevitable sinks. For a time, perhaps for hundreds of millions of years, a kind of molecular equilibrium was maintained as the new supply of each species was balanced by its loss. Such equilibrium features nonstop competition among molecules, to be sure, but the system does not evolve. 2

Comment: That is the key sentence. There would have been no complexification to higher order into machine-like structures; these molecules would either quickly disappear, dissolve in the ocean, or clump together into tarry masses of little use to emerging biology. The system does not evolve. In other words, chemical selection would never take place.

We know that we, as intelligent beings, make choices to obtain desired outcomes all the time, and there is no alternative to conscious, intelligent action. Therefore it is logical, plausible, and probable that an intelligent creator was in action, choosing the parameters of the laws of physics, the right equations, the right adjustments in the universe, the right building blocks of life, and the right machinery to give life its first go. And he was remarkably good at that.

1. Life requires the use of a limited set of complex biomolecules, a universal convention and unity, composed of the four basic building blocks of life (RNA and DNA, amino acids, phospholipids, and carbohydrates). They have a very specific, complex, functional composition and are made by cells in extremely sophisticated, orchestrated metabolic pathways, which were not extant on the early earth. If abiogenesis were true, these biomolecules had to be prebiotically available and naturally occurring (in non-enzyme-catalyzed ways, by natural means), and then somehow join in an organized way to form the first living cells. They had to be available in large quantities and concentrated at one specific building site. 
2. Making things for a specific purpose, for a distant goal, requires goal-directedness, and that is a big problem for naturalistic explanations of the origin of life. There was a potentially unlimited variety of molecules on the prebiotic earth. Competition and selection among them would never have occurred at all to promote a separation of the molecules that are used in life from those that are useless. Selection is too limited in scope, and too powerless a mechanism, to explain all of the living order, or even the ability to maintain order in the short term, let alone the emergence, overall organization, and long-term persistence of life from non-living precursors. It is an error of false conceptual reduction to suppose that competition and selection are thereby the source of explanation for all relevant forms of the living order.
3. We know that a) unguided, random, purposeless events are unlikely in the extreme to make specific, purposeful elementary components with which to build large, integrated macromolecular systems, and b) intelligence has goal-directedness. Bricks do not form from clay by themselves and then line up to make walls; someone made them. Phospholipids do not form from glycerol, a phosphate group, and two fatty acid chains by themselves and then line up to make cell membranes; someone made them. That someone is God.



1. Stuart M. Marshall et al.: Identifying molecules as biosignatures with assembly theory and mass spectrometry, 24 May 2021
2. Andrew H. Knoll: Fundamentals of Geobiology, 2012

https://reasonandscience.catsboard.com

Energy has no aim to become matter. Quarks do not aim to join in just the right way to produce the proper masses of neutrons (a little heavier) and protons (a bit lighter) that permit a stable nucleus. Electrons do not aim to swirl around the nucleus of the atom in different orbits with different energy states, obeying Bohr's quantization rule and Pauli's exclusion principle; to be fermions rather than bosons; or to cancel out the charge of the nucleus, permitting charge-neutral atoms that are stable over billions of years rather than annihilating in a fraction of a second, in which case no matter at all would populate the universe. There is no deeper physical reason that dictates that stable atoms should be the normal state of affairs, part of reality. There has to be more to instantiate this.

There are over a hundred different atoms in the periodic table. Atoms join through bond formation, becoming molecules. The earth hosted an almost infinite number of different molecules, not in order but in chaotic, meaningless arrangements. There was no physical principle that dictated that these molecules would or should suddenly sort out a set of very complex, special types of molecules to become part of living, self-replicating cells. In fact, the very absence of natural selection of any sort excludes any naturalistic, mindless principle that would make it possible for life to form spontaneously, without intelligent intervention.

Remove evolution. Remove Darwin's popularized principle of natural selection. All that is left are the natural forces obeying the laws of thermodynamics, which, rather than complexifying molecules, randomize and disintegrate them.

With this alone, I destroyed the claim that the universe is a godless production of natural, non-intelligent causes.
