ElShamah - Reason & Science: Defending ID and the Christian Worldview

Welcome to my library—a curated collection of research and original arguments exploring why I believe Christianity, creationism, and Intelligent Design offer the most compelling explanations for our origins. Otangelo Grasso



Information always has a mental origin


Posted Mon Sep 21, 2020 5:23 am

Otangelo (Admin)

How does information originate?

http://reasonandscience.heavenforum.org/t2529-how-does-information-originate

Information and the Nature of Reality, Bernd-Olaf Küppers, page 180:

Let us consider the relationship between semantic information and complexity in more detail. Information is always related to an entity that receives and evaluates the information. This, in turn, means that evaluation presupposes some other information that underlies the process of registration and processing of the incoming information. But how much information is needed in order to understand, in the foregoing sense, an item of incoming information? This question expresses the quantitative version of the hermeneutic thesis, according to which a person can only understand some piece of information when it has already understood some other information. At first sight, it would seem impossible to provide any kind of answer to this question since it involves the concept of understanding, which, as we have seen, is already difficult to understand by itself, let alone to quantify. Surprisingly, however, an answer can be given, at least if we restrict ourselves to the minimal conditions for understanding. To this belongs first of all the sheer registration by the receiver of the information to be understood. If the information concerned conveys meaning – that is, information of maximum complexity – then the receiver must obviously record its entire symbol sequence before the process of understanding can begin. Thus, even the act of recording involves information of the same degree of (algorithmic) complexity as that of the symbol sequence that is to be understood. This surprising result is related to the fact that information conveying meaning cannot be compressed without change in, or even loss of, its meaning. It is true that the contents of a message can be shortened into a telegram style or a tabloid headline; however, this always entails some loss of information. This is the case for any meaningful information: be it a great epic poem or simply the day’s weather report.
Viewed technically, this means that no algorithms – that is, computer programs – exist that can extrapolate arbitrarily chosen parts of the message and thus generate the rest of the message. But if there are no meaning-generating algorithms, then no information can arise de novo. Therefore, to understand a piece of information of a certain complexity, one always requires background information that is at least of the same complexity. This is the sought-after answer to the question of how much information is needed to understand some other information.

Ultimately, it implies that there are no “informational perpetual-motion machines” that can generate meaningful information out of nothing (Küppers, 1996). This result is the consequence of a rigorous relativization of the concept of information. It is a continuation of the development that characterized the progress of physics in the last century: the path from the absolute to the relative. This began with the abandoning of basic concepts that had been understood in an absolute sense – ideas such as “space,” “time,” and “object” – and has since led to well-known and far-reaching consequences for the foundations of physics. Whether the thorough-going relativization of the concept of information will one day lead to a comparable revolution in biological thinking cannot at present be said. This is largely due to the fact that the results up to now have been derived with respect to the semantic dimension of human language, and it is not yet clear to what extent they are applicable to the “language of genes.” For this reason, questions such as whether evolution is a sort of perpetual-motion machine must for the present remain open. At least it is certain that we must take leave of the idea of being able, one day, to construct intelligent machines that spontaneously generate meaningful information de novo and continually raise its complexity. If information always refers to other information, can then information in a genuine sense ever be generated? Or are the processes by which it arises in nature or in society nothing more than processes of transformation: that is, translation and re-evaluation of information, admittedly in an information space of gigantic dimensions, so that the result always seems to be new and unique? Questions such as these take us to the frontline of fundamental research, where question after question arises, and where we have a wealth of opportunities for speculation but no real answers.


If an intelligent creator is not excluded a priori, real answers emerge...
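Küppers's point that meaning-bearing information resists compression can be illustrated numerically. A general-purpose compressor gives a rough upper bound on a string's algorithmic (Kolmogorov) complexity; this is a standard approximation, not something from the book, and the use of Python's `zlib` here is simply a convenient stand-in:

```python
import zlib

def compressed_size(s: str) -> int:
    """Bytes in the zlib-compressed string: a crude upper bound
    on its algorithmic (Kolmogorov) complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A highly regular sequence is generated by a short rule ("repeat 'ab'"),
# so its compressed size stays tiny no matter how long it grows.
regular = "ab" * 500

# A meaning-bearing sentence has far less internal redundancy, so a
# general-purpose compressor can shrink it comparatively little.
sentence = ("Thus, even the act of recording involves information of the "
            "same degree of algorithmic complexity as that of the symbol "
            "sequence that is to be understood.")

print(compressed_size(regular), "of", len(regular), "bytes")
print(compressed_size(sentence), "of", len(sentence), "bytes")
```

The regular string compresses to a few dozen bytes; the sentence remains close to its original length, mirroring the contrast Küppers draws between algorithmically simple sequences and meaningful ones.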

The world of abstract structures
Finally, I should like to return briefly to the question with which we began: Are the ideas of “information,” “communication,” and “language” applicable to the world of material structures? We saw how difficult it is to decide this on a philosophical basis. But it may also be the case that the question is wrongly put. There does indeed seem to be a surprising solution on the way: one prompted by current scientific developments. In the last few decades, at the border between the natural sciences and the humanities, a new scientific domain is emerging that has been termed “structural sciences” (Küppers, 2000b). Alongside information theory, it encompasses important disciplines such as cybernetics, game theory, system theory, complexity theory, network theory, synergetics, and semiotics, to mention but a few. The object of structural sciences is the way in which reality is structured – expressed, investigated, and described in an abstract form. This is done irrespective of whether these structures occur in a natural or an artificial, a living or a non-living, system. Among these, “information,” “communication,” and “language” can be treated within structural sciences as abstract structures, without the question of their actual nature being raised. By considering reality only in terms of its abstract structures, without making any distinction between objects of “nature” and “culture,” the structural sciences build a bridge between the natural sciences and the humanities and thus have major significance for the unity of science (Küppers, 2000b).

In philosophy, the structural view of the world is not new. Within the frame of French structuralism, Gilles Deleuze took the linguistic metaphor to its limit when he said that “There are no structures that are not linguistic ones … and objects themselves only have structure in that they conduct a silent discourse, which is the language of signs” (Deleuze, 2002, p. 239). Seen from this perspective, Gadamer’s dictum “Being that can be understood is language” (Gadamer, 1965, p. 450) takes on a radically new meaning: “Being” can only be understood when it already has a linguistic structure. Pursuing this corollary, the philosopher Hans Blumenberg (2000), in a broad review of modern cultural history, has shown that – and how – the linguistic metaphor has made possible the “readability” (that is, the understanding) of the world. However, the relativity of all understanding has of necessity meant that the material “read” was reinterpreted over and over again, and that the course of time has led to an ever more accurate appreciation of which “readings” are wrong. In this way, we have approached, step by step, an increasingly discriminating understanding of the reality surrounding us.

https://reasonandscience.catsboard.com


Otangelo (Admin)

Core arguments from The Evolution of Biological Information by Christoph Adami (https://press.princeton.edu/books/hardcover/9780691241166/the-evolution-of-biological-information)


Key claims that appear to support the idea that information can arise from evolution:

1. "Darwin's theory stands out because it concerns the origins of the investigator himself."

2. "Explaining how Darwinian evolution can account for the complexity of life (biocomplexity) thus emerges as one of the last remaining major problems in evolutionary biology."

3. "If experiments could be conducted in which complexity visibly evolves from simplicity, the controversy would surely shift from 'Does It?' to 'How Does It?'"

4. "The theoretical concept that I introduce in this book seems to satisfy our intuition every time it is subjected to the test, which bodes well for its acceptance as a measure for biocomplexity."

5. "A mathematical description of the mechanisms that are responsible for the evolution and growth of complexity, and experimental evidence buttressing such a description, should go a long way to eliminate those doubts that are anchored around the startling complexity of life and the seeming inability of scientific theory to account for it."

6. "In hindsight, everything in biology uses information in one form or another, be it for communication (between cells, or organisms) or for prediction (via molecular as well as neural circuits)."

Addressing these claims:

While Adami presents these ideas as supportive of the notion that information can arise from evolution, it's important to note that these are largely theoretical concepts and hypotheses rather than empirically demonstrated facts. 

The author acknowledges that at the time of Darwin's theory, it was "based solely on observation and logical deduction, not on empirical facts obtained by experimentation." This admission highlights a fundamental challenge in proving that information truly arises from evolution.

The book proposes mathematical models and computational simulations to study evolutionary processes. However, these are abstractions and simplifications of real-world biological systems. While they may provide insights, they do not necessarily prove that information arises from evolution in actual living organisms.

The author's claim about conducting experiments where "complexity visibly evolves from simplicity" is an aspiration rather than a demonstrated reality. Such experiments, even if conducted, would face significant challenges in isolating evolutionary processes from other factors and proving a direct causal link to information generation.

The application of information theory to biology, while potentially useful as an analytical tool, does not inherently prove that biological information originates through evolutionary processes. It's a framework for understanding existing biological systems, not necessarily their origin.
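What "analytical tool" means here can be made concrete: Shannon's measure quantifies the statistical information carried by an existing symbol sequence, such as a nucleotide string, while saying nothing about its meaning or its origin. A minimal sketch using the standard textbook formula (not tied to any particular author's method):

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Maximum-entropy case: all four nucleotides equally frequent -> 2 bits/symbol.
print(shannon_entropy("ACGT" * 25))

# A strongly biased sequence carries less Shannon information per symbol.
print(shannon_entropy("AAAAAAAACG"))
```

Note that the measure is purely statistical: a random sequence scores at least as high as a functional gene of the same composition, which is precisely why such a framework describes existing sequences without addressing where their functional content came from.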

Lastly, the statement that "everything in biology uses information" is an observation about the current state of biological systems, not evidence of how that information came to be. It doesn't address the fundamental question of the origin of this information.

1. Darwin's theory of evolution is described as a major leap forward in our understanding of the world, obliterating "the last vestiges of human hubris" and declaring humans as kin to all other life forms.

2. The book aims to address two perceived vulnerabilities of evolutionary theory:
   a) The lack of experimental evidence for macroevolution due to long timescales.
   b) The controversy surrounding how Darwinian evolution can account for the complexity of life.

3. Adami introduces two research strands to address these issues:
   a) Experimental evolution, which has matured into a quantitative science.
   b) A theoretical concept of complexity rooted in mathematics and information theory.

4. The author proposes a new measure of biological complexity that he claims satisfies both mathematicians and biologists, is practical, and universal.

5. Adami suggests that a mathematical description of mechanisms responsible for the evolution and growth of complexity, supported by experimental evidence, could help eliminate doubts about evolutionary theory's ability to explain life's complexity.

6. The book presents information theory as a unifying framework for understanding complex adaptive systems, with biological life as the prime example.

7. Adami acknowledges that the book had a long gestation period, starting around 2002 and evolving in focus over time.

These points summarize the main claims and objectives of the book as presented in its preface.

Core Arguments

1. Evolution as a Mechanism for Complexity:
  - Adami argues that evolution is a robust mechanism for creating biological complexity, from viruses to brains. He emphasizes that natural selection and genetic variation drive the increase in complexity over time.
  - Quote: "Darwin's theory of evolution... represents a major leap forward in our understanding of the world... it obliterates the last vestiges of human hubris and declares us kin with bacterial slime and leaves of grass."

2. Information Theory in Biology:
  - The book integrates information theory with evolutionary biology, suggesting that biological information can be quantified and analyzed using principles from information theory.
  - Quote: "A mathematical description of the mechanisms that are responsible for the evolution and growth of complexity, and experimental evidence buttressing such a description, should go a long way to eliminate those doubts that are anchored around the startling complexity of life."

3. Experimental Evolution:
  - Adami presents evidence from experimental evolution, such as the Lenski experiment, to demonstrate how complexity can evolve in controlled environments, providing empirical support for evolutionary theory.
  - Quote: "The first strand is the field of experimental evolution... a discipline that few could have imagined in Darwin's days, but that today has matured into a quantitative science with the power of falsification."

4. Digital Evolution:
  - The book explores the concept of digital life and computational models to study evolutionary processes. These models help in understanding how complexity can arise in artificial environments.
  - Quote: "Computational evolutionary biology involves building models of worlds in which the Darwinian principles are explored... it has become possible to conduct dedicated experiments that can explore fundamental aspects of evolution as they affect an alien form of life."
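To illustrate what such computational models look like in miniature, the sketch below is a Dawkins-style "weasel" toy, not Adami's Avida system: the experimenter pre-specifies both the target string and the fitness function, and mutation plus selection then converge on that target.

```python
import random

random.seed(1)

TARGET = "INFORMATION"                      # pre-specified by the experimenter
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(candidate: str) -> int:
    # Fitness = number of positions matching the pre-specified target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    # Each position has a small chance of being replaced by a random letter.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

# Start from random strings; each generation keeps the fittest individual
# and fills the population with mutated copies of it (truncation selection).
population = ["".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(100)]
for generation in range(1000):
    best = max(population, key=fitness)
    if best == TARGET:
        break
    population = [mutate(best) for _ in range(100)]

print(generation, best)
```

The model converges quickly, but only because the target and the fitness gradient toward it were written in by the programmer, which is the very design choice at issue in the refutations below.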

Refutation of Evolution as an Inadequate Explanation for Biological Information

1. Misunderstanding Complexity:
- Critics hold that evolution cannot account for the complexity observed in biological systems; Adami counters that his integration of information theory provides a framework to quantify and understand this complexity.
- Refutation: While information theory can help quantify complexity, there is no empirical evidence that evolutionary processes can generate novel complex biological information. The observed complexity in biological systems often involves the reorganization or utilization of pre-existing information rather than the creation of new information.

2. Lack of Experimental Evidence:
- Another argument is that evolution lacks empirical evidence, particularly for macroevolutionary changes. Adami counters this by presenting experimental evolution studies that show observable changes in complexity within shorter timescales.
- Refutation: Experimental evolution studies, such as the Lenski experiment, do not provide evidence for the emergence of novel complex biological information. These studies demonstrate adaptation and variation within species (microevolution), but they do not show how new, complex information can arise naturally. The changes observed are often minor adjustments of existing genetic material rather than the creation of entirely new information.

3. Digital Life and Artificial Environments:
- Some may argue that digital evolution and artificial environments do not accurately represent biological evolution. However, these models are designed to isolate and study fundamental evolutionary principles that apply universally.
- Refutation: Digital evolution and artificial environments are simplified models that cannot fully replicate the intricacies of biological systems. They are useful for studying certain aspects of evolutionary theory, but they do not provide evidence that naturalistic processes can generate novel complex biological information. The parameters of these models are set by researchers and often do not account for the full complexity of natural evolution.

4. Information Theory as a Unifying Framework:
- Critics might claim that information theory is too abstract to be applied to biological systems. Adami, however, demonstrates that information theory can effectively describe and predict biological phenomena.
- Refutation: While information theory is a valuable tool for describing and predicting certain aspects of biological systems, it does not demonstrate how evolutionary processes can produce new, complex biological information. The application of information theory to biology often assumes the existence of complex information without explaining its naturalistic origin.

Christoph Adami's "The Evolution of Biological Information" presents a comprehensive case for the adequacy of evolutionary theory in explaining the origin and increase of biological complexity, integrating information theory with evidence from experimental and digital evolution. As the refutations above indicate, however, that case rests on models whose parameters are researcher-specified and on adaptive reuse of pre-existing genetic information, and it stops short of demonstrating that unguided processes can generate novel complex biological information.

