What exactly is information? And why should information not be confused with matter-energy?
When I see the leaves falling around me, I know winter is approaching. My brain (matter-energy) uses collateral energy in the process of thinking, but the matter-energy is not part of the information in my mind. Information is the stuff of mind; matter-energy (including the brain) is physical. Information is never physical, but it can be, and usually is, represented in some physical form.
The philosophy of information is defined as the area of research that studies conceptual (mental) issues arising at the intersection of computer science, information technology, systems theory, cybernetics, and philosophy.
According to Luciano Floridi (1964- ), four kinds of mutually compatible phenomena are commonly referred to as ‘information’, namely:
Information about something (e.g. a train timetable)
Information as something (e.g. DNA, or fingerprints)
Information for something (e.g. algorithms or instructions)
Information in something (e.g. a pattern or a constraint)
Now, the word ‘information’ is commonly used so metaphorically, or so abstractly, that its meaning is totally unclear.
The crucial point is that information, unlike matter-energy (matter and ‘mobile matter’), is a function, a construct, of the mind (the observer). Information is a mental construct. Information is programs (‘automated’ algorithms) plus data. Programs are not data, but can be treated like data.
The same message (information) may have different meanings for different people. Although information requires the perception of a difference (a signal, a message), external to the mind the difference will require a matter-energy carrier (e.g. a page in a book, electrical circuits in a computer, or sound waves in air – collateral matter-energy).
In addition, awareness and/or cognition require a nervous system (formed out of matter-energy). You can send information over the Internet, by fax, or by heliograph – one cannot do the same with matter.
When you fax, or copy, a document (matter-energy), only the information in the document is transferred, or copied, and not the physical document itself. As Gregory Bateson (1904-1980) has said in Mind and Nature, “Thought can be about pigs or coconuts, but there are no pigs or coconuts in the brain; and in the mind, there are no neurons, only ideas of pigs and coconuts.” [And in mind there are only ideas about neurons, I might add.]
Where then does information come from? Information is neither matter, nor energy (‘matter in motion’) – information has no mass, but it sometimes occupies ‘space’ (memory). Information on a computer hard disk occupies ‘memory space’, and on a DVD, information is minute dents in the disk. I do not know about information taking up space in the brain, or having mass.
For our purposes, matter is anything that has both mass and volume (it takes up space) and is made up of protons, neutrons, and electrons. As information in the universe increases, the matter-energy of the universe will stay unchanged and unaffected.
In a certain sense information is like energy. Energy is not something with substance (ethereal or otherwise), but like mass, energy is a quality of matter – energy is matter in motion, energy is (‘relative’) movement – the motion of matter relative to matter. Like energy, information has no mass, but unlike energy, information is not a quality (or characteristic) of matter. Information is sui generis. Information is associated with mind.
Like various types of matter (e.g. fresh foods, medicines, et cetera), some types of information (e.g. news) are also perishable. However, you can still use old information without it killing you, or making you ill. Nevertheless, information can and does kill.
Information systems take data (sensations, facts, figures, clues, evidence, ideas, pictures, diagrams, numbers, symbols, letters, codes, et cetera) and process the (raw) data into useful information (‘cooked’ data) for creativeness, decision-making, entertainment, and planning and control purposes.
Sensations, facts, figures, clues, evidence, ideas, pictures, diagrams, numbers, symbols, letters, codes, are all mental constructs and are useless (or non-existent) without mind to make sense of them. There is no information (or data) without mind. Information is patterns created in, or gleaned from, matter-energy by mind.
Although matter-energy has been the subject of scientific investigation for several hundred years, a ‘scientific’ conception of information is relatively new.
A variety of definitions of information have been proposed. The American electrical engineer and mathematician, Claude Shannon (1916-2001), defined information as “a reduction of uncertainty”. Bateson defined information as “that which changes us”, or “the difference that makes a difference”. [This is all true.]
However, although true, these definitions are very abstract. Although it seems a relatively simple concept, information has proved a tough notion to pin down – to define.
Information as a concept bears a diversity of meanings, from everyday usage to technical settings. Generally speaking, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, and so on.
Information is a collection of related data; knowledge about a topic; data that have been processed into a format that is understandable by its intended audience; facts or figures ready for communication or use; knowledge communicated or received concerning a particular fact or circumstance; news; et cetera.
Information is data that has been processed into a form that is meaningful to the recipient and is of real or perceived value in current or prospective decisions. It is data that have been processed and presented in a form suitable for human interpretation, often with the purpose of revealing trends or patterns.
A crucial point is that information, unlike matter-energy, is a function of the observer (mind). For example, the same message may have different meanings for different people and in different contexts. Although information requires the perception of a difference (data), the difference will require a matter-energy carrier (e.g., a page in a book, electrical circuits in a computer, or sound waves in air – collateral matter-energy).
Say you have a document (matter) with a mass of exactly five (5) kilograms and the document contains unique, irreplaceable information. Now, say you burn this document (matter-energy); you will find that the combined mass of the ashes, smoke, and combustion gases is still exactly five kilograms, but your irreplaceable information is now lost forever.
The measure (notion) ‘five kilograms’ is obviously also information and it does not exist in the physical Universe; it exists only in mind. Paper and ink have mass, volume, and heat content, but information does not have mass and does not necessarily occupy additional space; it also has no heat content.
Although the document was incinerated, the ‘amount’ of matter (i.e. the mass, not the volume) is unchanged – only the volume has changed. A vast ‘amount’ of information is lost (as in the case of the legendary Royal Library of Alexandria), but we will never detect the huge loss of information by measuring the mass (or volume) of the ashes. The same would have happened if we had shredded the document.
The amount of information in a document will also have no influence on the heat content (measured in joules) of the document – although the paper and ink will have heat content. That is, information, unlike matter (paper and ink), has no mass or heat content in itself. Sensations, facts, figures, clues, evidence, ideas, pictures, diagrams, numbers, symbols, letters, codes, et cetera have no mass or heat content, but can take up space and have size in some kinds of physical memory.
The physical Universe is made up of matter-energy. Matter is made up of atoms and molecules (groupings of atoms) and energy is the atoms and molecules in motion – either bumping into each other or vibrating back and forth. The motion of atoms and molecules creates forms of energy called heat or thermal energy, sound, and electromagnetism, which is present in all matter. Even in the coldest voids of space, matter still has a very small, but still measurable amount of heat energy and electromagnetism (internal motion).
Mass is a fundamental concept in physics and other sciences, a basic property of matter, roughly corresponding to the intuitive idea of ‘how much matter there is in an object’ – the quantity of matter in an object. It is also the property of an object that is a measure of its inertia (a measure of an object’s resistance to acceleration), the amount of matter it contains, and its ‘influence’ in a gravitational field. On the earth’s surface, an object’s mass (measured in, for example, kilograms) is different from, but proportional to, its weight (in this case, 5 kilograms x 9.8 metres per second per second = 49.0 newtons). Weight is a force, a vector; mass is not.
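The mass-weight distinction above can be checked numerically. A minimal sketch in Python (the helper name `weight_newtons` is my own, and the lunar value of g is an illustrative round figure):

```python
# Weight is a force: W = m * g. Mass is the same everywhere;
# weight depends on the local gravitational acceleration g.
def weight_newtons(mass_kg, g=9.8):
    """Return the weight (in newtons) of a given mass (in kilograms)."""
    return mass_kg * g

# The document's example: a 5 kg mass on the Earth's surface (about 49 N).
print(weight_newtons(5.0))

# The same 5 kg mass on the Moon (g is roughly 1.62 m/s^2) weighs far
# less, although its mass is unchanged.
print(weight_newtons(5.0, g=1.62))
```

The same object thus has one mass but many possible weights, which is exactly why the two concepts must not be confused.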
In physics and other sciences, energy (from the Greek energos, ‘active, working’) is a scalar, physical, thermodynamic quantity that is a property of objects and systems in the physical Universe. The SI unit of energy is the joule; the older CGS unit is the erg.
Energy is often ‘defined’ as the ability to do work (E = F.s = m.a.s = mass [m] x acceleration [a] x displacement [s]), i.e. the impetus behind all motion and all activity. Energy is a fundamental aspect of matter in motion – force (F) times displacement (s). Energy is characterised by vibration and the impulse to move. People get energy from food. Your toaster and your washing machine get their energy directly from electricity, which can come from coal, wind, or water. One type of energy can be converted into another, e.g. chemical energy → electrical energy → mechanical energy, et cetera. Work is possible with, or without, information, but not without energy.
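The work-energy relation E = F·s = m·a·s can be illustrated directly; a minimal sketch (the function name `work_joules` and the numbers are my own):

```python
# E = F * s, with F = m * a (Newton's second law).
def work_joules(mass_kg, acceleration, displacement):
    """Energy (in joules) transferred when a force accelerates a mass
    over a given displacement."""
    force = mass_kg * acceleration  # F = m * a  (newtons)
    return force * displacement     # E = F * s  (joules)

# Accelerating a 2 kg mass at 3 m/s^2 over 4 m transfers 24 joules.
print(work_joules(2.0, 3.0, 4.0))  # → 24.0
```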
Again then, where does information come from? Information is neither matter (it has no mass), nor energy (it has no heat content, temperature, motion, or whatever). Moreover, as the American philosopher and mathematician, William Dembski (1960- ), correctly inferred, “Information is sui generis. Only information begets information.” This is Dembski’s (in Intelligent Design) Law of the Conservation of Information. Information is the product of mind. The brain uses energy; the mind uses information.
A clue, fact, or figure, in isolation has no real meaning at all – it needs context (a conspicuous pattern). If your body temperature is 40 °C, you might feel very ill and feverish, but if you also know that the average body temperature of humans is 37 °C, you can now be certain that you have a serious fever and help is needed.
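The fever example shows how context turns a bare datum into information; a small sketch (the function name and thresholds are illustrative assumptions, not medical advice):

```python
# A reading of 40 on its own is just a datum; comparing it with the
# 37 °C human average (context) turns it into information.
NORMAL_BODY_TEMP_C = 37.0

def interpret_temperature(reading_c):
    """Give a bare temperature reading meaning by comparing it with the norm."""
    if reading_c >= NORMAL_BODY_TEMP_C + 2.0:
        return "serious fever - seek help"
    if reading_c > NORMAL_BODY_TEMP_C + 0.5:
        return "mild fever"
    return "normal"

print(interpret_temperature(40.0))  # → serious fever - seek help
```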
Property, commodities, products (goods and services), et cetera, are all physical entities and they all have economic value (= quantity x price). However, price is strictly a mental creation (as are the concepts of quantity and value) and has no physical existence – it is information.
At one time, gold was used as money (an abstraction of economic value, i.e. information). Later, money was notes and coins representing a gold standard, and later still it became purely fiat (i.e. based on faith, with no gold backing). Today money is increasingly an electronic transaction that occurs only in the realms of cyberspace – firmly information only. A value (information) is attached to an hour, week, or month of your time (also a mental construct, i.e. information) – time is money, and information is king!
Information is data that are related to each other in a very specific way to make sense to some decision-maker (mind). Thus, you need at least two pieces of data to create one piece of information. As Floridi has shown, information is a term with many meanings depending on context, but it is as a rule closely related to such concepts as meaning, knowledge, instruction, communication, representation, and mental stimulus.
Information is organised data (raw facts and figures presented, or arranged, in formation) in a specific context. Information is a non-physical, immaterial, mental entity completely unrelated to matter-energy.
Information is noticeable patterns in matter-energy. For our purposes, a pattern is any condition in which something moves from one state to another in a pre-set, or predetermined, manner. Patterns are by definition predictable. Patterns are the exact opposite of chaos – chaos imparts no information. Chaos is the distinctive lack of information. There are no discernible patterns in chaos. Chaos is truly random. The mind always tries to see patterns, to reduce chaos and make sense of our world.
Consider, for example, Phillip Johnson’s (1940- ) so-called basic points regarding information in his book Defeating Darwinism by Opening Minds:
·“First, life consists not just of matter (chemicals) but of matter [matter-energy] and information.
·“Second, information is not reducible to matter, but it is a different kind of ‘stuff’ altogether. A theory of life thus has to explain not just the origin of the matter but also the independent origin of the information.
·“Third, complex, specified informationof the kind found in a book or a biological cell cannot be produced either by chance or at the direction of physical and chemical laws.”
Things like zeroes (0) and ones (1) do not exist, except in mind. What you see here are only symbols of the concepts, not the ‘things’ themselves – and they have no mass or energy, but they can carry, and store, information!
Information is in essence only contextual patterns, or maps, usually enclosed within matter-energy, e.g. 101101 is one hundred and one thousand, one hundred and one, or 101101 (in binary mode) = 45 (in decimal mode).
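The dual reading of 101101 can be demonstrated in a couple of lines; a minimal sketch:

```python
# The same six symbols carry different information depending on the
# convention (context) used to read them.
digits = "101101"

print(int(digits, 10))  # read as decimal: 101101
print(int(digits, 2))   # read as binary:  45
```

The symbols on the page are identical in both cases; only the context supplied by the reader (the chosen number base) changes, and with it the information.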
Een, eins, un, mon, ‘1’, I, √1, 1ⁿ, 1×1, a/a, a⁰, sin²(a) + cos²(a), et cetera, are all symbols for the same concept, or at least they indicate the same thing – one. Nonetheless, although ‘1’, √1, and 1² are all exactly equal to one, there is a subtle mathematical difference (data) between them for the initiated – context again!
The puzzle we must now confront is this: Where and how does the germinal information arise? There is no hint of it in the laws of nature (as we understand them) that govern the interactions among the basic particles that compose all matter-energy. The information just appears as a given, with no causal agent evident, as if it were an intrinsic facet of nature. This is the real mystery of all life and it has puzzled humans for as long as we can remember. Still we have no ‘scientific’ answer as yet.
The physicist and the first American involved in the theoretical development of the atomic bomb, John Archibald Wheeler (1911-2008), likened what underlies all existence to an idea, the ‘bit’ (binary digit – 0 or 1) of information that gives rise to the ‘it’, the substance of matter. Wheeler is also known for having coined the terms “black hole” and “wormhole” and the phrase “it from bit”.
Matter-energy (body) and mind (information, data) are somehow linked, but they are not the same. Information is neither heat nor electromagnetism nor mass (plasma, gas, liquid, or solid). Information also has no energy, mass, or volume.
DNA (DeoxyriboNucleic Acid) and RNA (RiboNucleic Acid) are matter-energy (part of body); they are somewhat like the program strips (paper and magnetic tapes) that the old computers used to employ to store information. DNA and RNA are not information, but like paper tapes in the old computers they ‘store’ information.
Punched cards (the predecessor of paper tapes) were originally developed in 1804 as an aid to textile production by a Frenchman, Joseph Marie Jacquard (1752-1834). These old computers used binary digits to do their stuff.
Binary numbers (ones and zeros, yes and no, or on and off) are used to store data and to describe the steps that need to be performed to solve a particular problem, or carry out a certain task, with a digital computer. George Boole (1815-1864) is the inventor of Boolean logic, which is the basis of modern digital computer logic. Living bodies use four bases in different patterns to store information, namely: adenine (A), cytosine (C), guanine (G) and thymine (T).
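Because the four bases are distinguishable symbols, a base sequence can be relabelled as a bit string; a sketch (the particular two-bit mapping below is an arbitrary assumption of mine, not the cell’s own encoding):

```python
# Four distinguishable symbols need two binary digits each; the
# mapping chosen here is arbitrary.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def dna_to_bits(sequence):
    """Relabel a DNA base sequence as a string of binary digits."""
    return "".join(BASE_TO_BITS[base] for base in sequence.upper())

print(dna_to_bits("ACGT"))  # → 00011011
```

The point is that the pattern, not the carrier, is what matters: the same information can ride on bases, bits, or ink.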
Most computers in the 1950s were designed for a particular purpose or a limited range of purposes – much like the calculators and cellphones of today. Alan Mathison Turing (1912-1954) envisioned a computer that could do anything (a ‘programmable computer’), something that we take for granted today. The method of instructing the computer was very important in Turing’s concept. This concept was revolutionary for the time.
The ‘Turing Machine’ would read each of the steps and perform them in sequence, resulting in the proper answer. As a mathematician, he applied the concept of the algorithm to digital computers – he was the first person to realise that programs and information can be handled in exactly the same manner in computers, i.e. both programs and information are software.
His research into the relationships between machines and nature created the field of artificial intelligence. His intelligence and foresight made him one of the first to step into the information age.
Turing essentially described a ‘machine’ that knew a few simple instructions – today these basic instructions are saved on one, or more, read-only memory (ROM) chips in personal computers (PCs). Making the computer perform a particular task was simply a matter of breaking the job down into a series of these simple instructions (algorithms). This is identical to the process programmers go through today. Turing believed that an algorithm could be developed for almost any problem. The hard part was determining what the simple steps were to be and how to break down the larger problems.
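The idea of a table of simple instructions executed in sequence can be sketched as a toy Turing-style machine (this is an illustrative simplification of Turing’s formal model; all names and the bit-inverting rule table are my own):

```python
# Rules map (state, symbol) -> (symbol to write, head move, next state).
def run_machine(tape, rules, state="start"):
    """Run a toy Turing-style machine until the head falls off the tape."""
    tape, pos = list(tape), 0
    while 0 <= pos < len(tape):
        write, move, state = rules[(state, tape[pos])]
        tape[pos] = write                    # write the new symbol
        pos += 1 if move == "R" else -1      # move the head one cell
    return "".join(tape)

# A machine that inverts a string of binary digits, one cell at a time.
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_machine("101101", INVERT))  # → 010010
```

Note that the rule table (the ‘program’) is itself just data passed to the machine – exactly the insight attributed to Turing above.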
Dr Martin Cooper (1928- ), a former general manager for the systems division at Motorola, is considered the inventor of the first portable handset and the first person to make a call on a portable cellphone in April 1973. The first call he made was to his rival, Joel Engel (1936- ), Bell Labs’ head of research.
AT&T's research arm, Bell Laboratories, introduced the idea of cellular communications in 1947. But Motorola and Bell Labs in the sixties and early seventies were in a race to incorporate the technology into portable devices.
If we consider all the possible carriers of information, it is clear that the relationship between carrier (matter-energy) and signal (information, data) is not fused and definite. The relationship depends on the material in which a pattern appears. That is, a pattern or set of differences can be observed at the atomic level (where Bremermann’s limit applies), in molecules (DNA, RNA), cells (neurons), organs (the brain), groups (norms), and society (culture).
Both Leo Szilard (1898-1964) and Hans-Joachim Bremermann (1926-1996) also used the term ‘information’. However, they both realised that because of the complexities introduced by having to specify one or more observers, the term ‘information’ is not an elementary concept. Information is dependent on simpler concepts, like for example difference (data).
Difference denotes the elementary building block of data, signals, or information. Therefore, when dealing with physical foundations, Stuart A Umpleby (1944- ) believes it is preferable to speak in terms of matter, energy, and difference, rather than information. [That is: Difference → Data → Information.]
According to Umpleby, ‘difference’ is a physical entity that can be noted by an observer. Drawing a ‘distinction’ is a purposeful act that creates two categories. Although the difference is a physical entity, the distinction drawn from it is a mental construct and is not physical. [That is: Difference (physical) → Data (mental) → Information (mental).]
Scientists today understand phenomena related to matter-energy more thoroughly than phenomena related to information. Perhaps reflecting on the physical relationships among matter-energy and information can help natural scientists and social scientists understand better the nature of their disciplines and the world in general.
Efforts to apply the methods of the natural sciences to social systems have led some people to conclude that matter and energy relationships are the appropriate subjects of attention for social scientists. However, in social systems, distinctions (difference, data, and information) are essential.
Bateson (in Steps to an Ecology of Mind) made this point as follows, “… my colleagues in the behavioural [social] sciences have tried to build the bridge to the wrong half of the ancient dichotomy between form [mind/information] and substance [body/matter-energy]. The conservative laws for energy and matter concern substance rather than form.
“But mental process, ideas, communication, organization, differentiation, pattern, and so on, are matters of form rather than substance. Within the body of fundamentals, that half which deals with form have been dramatically enriched in the last thirty years by the discoveries of cybernetics and systems theory.”
However, most physical scientists have not realised this as yet either; they are still stuck in their archaic, material paradigms and are therefore still bamboozled by the ‘difference’ between information/data and matter-energy. Once scientists have sorted out these differences between matter-energy and information (mind and body), they will be in a much better position to make the next paradigm leap in understanding the Universe and Nature.