Claim: Information is not immaterial
Reply: According to Norbert Wiener, the founder of cybernetics and a pioneer of information theory, information cannot be physical in nature:
‘Information is information, neither matter nor energy. No materialism that fails to take account of this can survive the present day.’
Information scientist Werner Strombach writes:
Semantic information has an appearance of order at the level of reflective consciousness.
Philosophy and Technology II: Information Technology and Computers in Theory and Practice, page 76:
Information involves a sender, a receiver, a carrier, and content. To become information there must also be a receiver to understand the signal as a sign, to decode its content, and to react to it. N. Wiener has pointed out only what information is not: that it is neither matter nor energy. Klaus defines the term more explicitly by calling information a third essential aspect of matter, but for G. Gunther it is a third aspect of the world in a metaphysical sense. world 1 is the physical world; world 2 is mind-immanent (subjective); world 3 is the world of objective intelligibility, of ideas (in an objective sense), of theories and their logical relations, the world of arguments, the products of the human mind as recorded in languages, the arts, sciences, and technologies. The rise of world 3 out of worlds 1 and 2 is brought about by humanity. Since information now measures the growth of knowledge produced by an event, it also measures an "amount of form." This connection between information and "being able to know" leads to the corollary that information is present only when something is being understood.
My comment: How does this kind of understanding apply to biology? Foresight, or understanding, is required to know beforehand that the assignment of a triplet codon of the genetic code (for example, UCU means serine) permits the formation of a genetic sequence. That sequence yields an amino acid sequence which, once fully polymerized, folds correctly and eventually produces a functional protein with machine-like operational capabilities.
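The codon-to-amino-acid assignment just described can be sketched in ordinary programming terms. The following is a minimal illustration only, not a model of the cell's actual machinery: the tiny table below covers just a handful of the 64 real codons.

```python
# Minimal sketch: the genetic code treated as a lookup table (a cipher)
# mapping RNA triplet codons to amino acids. Only a few of the 64 real
# codons are listed, purely for illustration.
CODON_TABLE = {
    "AUG": "Met",   # start codon, codes for methionine
    "UCU": "Ser",   # the serine example from the text
    "UUU": "Phe",
    "GGC": "Gly",
    "UAA": "Stop",  # stop codon: terminates translation
}

def translate(mrna: str) -> list[str]:
    """Decode an mRNA string three bases at a time until a stop codon."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "Stop":
            break
        peptide.append(amino_acid)
    return peptide

print(translate("AUGUCUUUUUAA"))  # ['Met', 'Ser', 'Phe']
```

The point of the sketch is only that a codon-to-amino-acid assignment is formally a mapping between two symbol sets, which is what "code" means in the technical sense.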
The concept of order is, on the one hand, closely related to the concepts of wholeness and system and, on the other, to that of the regularity of events. But not every order is information. There must also be
(a) a differential in information similar to the energy differential which causes motion or work;
(b) the intelligibility of the message to a receiving system; and
(c) the generation of activity in the receiving system.
All three of these supplementary characteristics are attributable to human beings. Human beings investigate the unknown, decode the code of nature, and act in accordance with received information. Information consists in the representation of the order of reality which becomes manifest in human beings - i.e., in logical thinking, conceptual understanding, ethical evaluation, and significant or useful human action.
Materialists have long struggled with the fact that DNA contains information encoded using the genetic code. The usual attack has been to deny that DNA contains a true code, claiming that it is a code only in an allegorical sense. That is untrue: the assignment of the triplet-codon "alphabet" to the amino-acid "alphabet" is de facto a code, or a cipher. In the ribosome, a translation literally takes place.
So the objection that information is not immaterial but physical is another canard. The claim is not made often, but nonetheless, as seen in the video, it comes up. So, is information material or non-material?
To give an answer, I will first address another commonly made assertion.
Claim: Of course, creationists conflate information with ascribed meaning deliberately, because they seek to expound the view that information is a magic entity, and therefore requires an invisible magic man in order to come into existence.
Reply: The claimant conflates DE-scribing information with PRE-scribing information. When we see an object, we can DE-scribe it: its constitution, its size, its form, its smell. In short, we give an informational description of what is.
PREscriptive information, on the other hand, is information that can be transmitted vocally or in written form. It is the kind of information that specifies how things have to be. When a machine designer makes a blueprint, he gives instructions on the exact dimensions, sizes, and materials, and on how the machine has to be built. That is prescriptive information: instructional, complex, codified information. And that is PRECISELY what we see in the cell.
Information can be produced by a person (prescriptive), or it can be discovered through observation, conversation, experiments, books, etc. (descriptive).
The field of information systems, with its role in living systems, is constantly evolving (Corning and Kline 1998). Shannon and Weaver (1949) defined information as the capacity to reduce statistical uncertainty in the communication of messages between a sender and a receiver; however, this definition bears no relationship to natural systems, such as living organisms, that are “informed thermodynamic systems” (Wicken 1987). Later information theorists introduced structural or functional information to account for the self-organizing capabilities of living systems, and instructional information, which is a physical array (Brooks and Wiley 1988). However, linkages with the field of semiotics established a much more compatible approach to biological information (Salthe 1998). Within this trend, control information is defined as the capacity to control the acquisition, disposition, and utilization of matter, energy, and information flows in purposive (cybernetic) processes.
Information and the Nature of Reality, edited by Paul Davies
When the foundation for information theory was laid down by Shannon, he purposely left out of the account any reference to what the information means, and dwelt solely on the transmission aspects. His theory cannot, on its own, explain the semantics and communication of higher-order entities. At most, one could say that Shannon focused on the syntactic features of an information potential.
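Shannon's purely syntactic measure can be made concrete with a short sketch. The entropy computed below depends only on symbol frequencies, so a meaningful sentence and a scrambled anagram of it score identically; the two example strings are my own toy inputs, chosen only to make that point.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol; a function of symbol
    frequencies alone, blind to meaning."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "send help now"
scrambled = "dnes plhe own"  # same letters rearranged, meaning destroyed

# Both strings have identical symbol statistics, hence identical entropy.
print(math.isclose(shannon_entropy(meaningful), shannon_entropy(scrambled)))  # True
```

This is exactly the sense in which Shannon's theory measures an information potential rather than meaning: any permutation of the symbols leaves the measure unchanged.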
Let us consider the relationship between semantic information and complexity in more detail. Information is always related to an entity that receives and evaluates the information. This, in turn, means that evaluation presupposes some other information that underlies the process of registration and processing of the incoming information. But how much information is needed in order to understand, in the foregoing sense, an item of incoming information? This question expresses the quantitative version of the hermeneutic thesis, according to which a person can only understand some piece of information when he or she has already understood some other information. At first sight, it would seem impossible to provide any kind of answer to this question, since it involves the concept of understanding, which, as we have seen, is already difficult to understand by itself, let alone to quantify. Surprisingly, however, an answer can be given, at least if we restrict ourselves to the minimal conditions for understanding. To these belongs first of all the sheer registration by the receiver of the information to be understood. If the information concerned conveys meaning – that is, information of maximum complexity – then the receiver must obviously record its entire symbol sequence before the process of understanding can begin. Thus, even the act of recording involves information of the same degree of (algorithmic) complexity as that of the symbol sequence that is to be understood. This surprising result is related to the fact that information conveying meaning cannot be compressed without change in, or even loss of, its meaning. It is true that the contents of a message can be shortened into a telegram style or a tabloid headline; however, this always entails some loss of information. This is the case for any meaningful information: be it a great epic poem or simply the day’s weather report.
Viewed technically, this means that no algorithms – that is, computer programs – exist that can extrapolate arbitrarily chosen parts of the message and thus generate the rest of the message. But if there are no meaning-generating algorithms, then no information can arise de novo. Therefore, to understand a piece of information of a certain complexity, one always requires background information that is at least of the same complexity. This is the sought-after answer to the question of how much information is needed to understand some other information.
Ultimately, it implies that there are no “informational perpetual-motion machines” that can generate meaningful information out of nothing (Küppers, 1996). This result is the consequence of a rigorous relativization of the concept of information. It is a continuation of the development that characterized the progress of physics in the last century: the path from the absolute to the relative. This began with the abandoning of basic concepts that had been understood in an absolute sense – ideas such as “space,” “time,” and “object” – and has since led to well-known and far-reaching consequences for the foundations of physics. Whether the thorough-going relativization of the concept of information will one day lead to a comparable revolution in biological thinking cannot at present be said. This is largely due to the fact that the results up to now have been derived with respect to the semantic dimension of human language, and it is not yet clear to what extent they are applicable to the “language of genes.” For this reason, questions such as whether evolution is a sort of perpetual-motion machine must for the present remain open. At least it is certain that we must take leave of the idea of being able, one day, to construct intelligent machines that spontaneously generate meaningful information de novo and continually raise its complexity. If information always refers to other information, can then information in a genuine sense ever be generated? Or are the processes by which it arises in nature or in society nothing more than processes of transformation: that is, translation and re-evaluation of information, admittedly in an information space of gigantic dimensions, so that the result always seems to be new and unique? Questions such as these take us to the frontline of fundamental research, where question after question arises, and where we have a wealth of opportunities for speculation but no real answers.
If an intelligent creator is not excluded a priori, real answers emerge.
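The incompressibility argument in the quoted passage can be loosely illustrated with an off-the-shelf compressor. Compressed size under zlib is only a crude, computable upper bound on algorithmic (Kolmogorov) complexity, and the strings below are my own toy examples, but the contrast is visible: mechanical repetition compresses dramatically, while ordinary prose barely shrinks.

```python
import zlib

def compressed_size(text: str) -> int:
    """Bytes after zlib compression: a crude upper bound on the
    text's algorithmic (Kolmogorov) complexity."""
    return len(zlib.compress(text.encode("utf-8"), level=9))

repetitive = "ab" * 500  # 1000 characters, generated by a tiny rule
prose = (
    "Information is always related to an entity that receives and "
    "evaluates it, and evaluation presupposes prior information."
)

# Repetition shrinks to a small fraction of its length; prose stays
# close to its original size.
print(compressed_size(repetitive) / len(repetitive))  # much smaller than 1
print(compressed_size(prose) / len(prose))            # much closer to 1
```

The design point is the same one the passage makes: a highly ordered sequence can be regenerated by a short rule, whereas a meaning-bearing sequence resists lossless shortening.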
The world of abstract structures
Finally, I should like to return briefly to the question with which we began: Are the ideas of “information,” “communication,” and “language” applicable to the world of material structures? We saw how difficult it is to decide this on a philosophical basis. But it may also be the case that the question is wrongly put. There does indeed seem to be a surprising solution on the way: one prompted by current scientific developments. In the last few decades, at the border between the natural sciences and the humanities, a new scientific domain has been emerging that has been termed “structural sciences” (Küppers, 2000b). Alongside information theory, it encompasses important disciplines such as cybernetics, game theory, system theory, complexity theory, network theory, synergetics, and semiotics, to mention but a few. The object of the structural sciences is the way in which reality is structured – expressed, investigated, and described in an abstract form. This is done irrespective of whether these structures occur in a natural or an artificial, a living or a non-living, system. Among these, “information,” “communication,” and “language” can be treated within the structural sciences as abstract structures, without the question of their actual nature being raised. By considering reality only in terms of its abstract structures, without making any distinction between objects of “nature” and “culture,” the structural sciences build a bridge between the natural sciences and the humanities and thus have major significance for the unity of science (Küppers, 2000b).
In philosophy, the structural view of the world is not new. Within the frame of French structuralism, Gilles Deleuze took the linguistic metaphor to its limit when he said that “There are no structures that are not linguistic ones … and objects themselves only have structure in that they conduct a silent discourse, which is the language of signs” (Deleuze, 2002, p. 239). Seen from this perspective, Gadamer’s dictum “Being that can be understood is language” (Gadamer, 1965, p. 450) takes on a radically new meaning: “Being” can only be understood when it already has a linguistic structure. Pursuing this corollary, the philosopher Hans Blumenberg (2000), in a broad review of modern cultural history, has shown that – and how – the linguistic metaphor has made possible the “readability” (that is, the understanding) of the world. However, the relativity of all understanding has of necessity meant that the material “read” was reinterpreted over and over again, and that the course of time has led to an ever more accurate appreciation of which “readings” are wrong. In this way, we have approached, step by step, an increasingly discriminating understanding of the reality surrounding us.
1. Without information the inflow of energy would not lead to self-organization. Information in this sense is more than information in the Shannon and Weaver (1949) sense; it is functional and can be thought of as information in both an “instructional” and a “control” sense, as it requires information that creates complex structures — for example, enzymatic proteins — and metabolic pathways that productively channel the flow of energy both within an organism and between the latter and its environment.
2. Blueprints, instructional information, and master plans, which permit the autonomous self-organization and control of complex machines and factories, are always traced back to an intelligent source that made them for purposeful, specific goals.
3. The blueprint and instructional information stored in DNA, which direct the making and control of biological cells and organisms, are therefore best explained by intelligent design.
1. Information and the Nature of Reality, Paul Davies and N. H. Gregersen (eds.)