    Information and Quantum Physics: The Universe as a hologram

    The exploration of quantum information challenges objective reality, positing the universe as a hologram. This piece examines how informational algorithms drive everything from the emergence of physical laws and time to the ultimate nature of consciousness as an emergent property.

    Quantum reality and information

    Q. How does Anton Zeilinger’s assertion that “the distinction between reality and our knowledge of reality, between reality and information, cannot be made” reshape the classical notion of objective reality in light of quantum information theory?

    Anton Zeilinger, who shared the 2022 Nobel Prize in Physics for work establishing the foundations of Quantum Mechanics (QM), asserts that QM forms the fundamental basis of reality, effectively demonstrating that Einstein was incorrect – that “God plays dice.” His research, conducted where Ludwig Boltzmann developed the concept of entropy (a concept later connected to information theory), laid the groundwork for quantum information science.

    Zeilinger’s most famous experiments involved entangled photons. When the polarisation of one photon was measured (e.g., on La Palma), its entangled partner (e.g., 143 kilometres away on Tenerife) instantly exhibited the correlated polarisation. These correlations violate Bell inequalities, ruling out local hidden-variable explanations; yet because no usable signal travels faster than light, Einstein’s theory of relativity is not actually contradicted. The phenomenon, which Einstein famously called “spooky action at a distance,” demonstrates that entangled particles share correlations in a manner that, in some sense, ignores spacetime.

    Zeilinger’s major scientific contributions – including quantum teleportation, long-distance quantum communication, and quantum cryptography – moved quantum entanglement from a “weird thought experiment” into experimental reality.

    This work leads Zeilinger to conclude that realism does not exist. While classical physics assumes the world exists independent of observers, QM strongly implies that a fixed reality cannot be meaningfully described unless it is observed. As Zeilinger states: “The only true/correct description of reality is quantum mechanics and information.” He argues that the strangeness of quantum mechanics actually “throws a veil over a deeper, so far hidden reality,” pushing Einstein’s local-realistic universe “back to a small nook of the universe.”

    This microscopic reality, governed by quantum information, stands in sharp contrast to our macroscopic experience. For instance, the human body is an organised complexity of approximately 37 trillion cells, with about three hundred billion renewed daily and trillions of biochemical interactions occurring every second.

    Microscopically, we are a tremendous volcano of activity, yet macroscopically, we appear as a consistent whole. The only way to make sense of this reality – where our approximately 10²⁷ atoms “know” exactly what to do – is through hidden webs of information and their encrypted instructions. This highlights the huge difference between an information-driven microscopic reality and the consistent macroscopic reality our brain assembles and experiences.

    Q. If the universe functions as a “cosmic quantum computer,” as suggested in Virtuality of the Real, what implications does this have for our understanding of physical laws as informational algorithms rather than deterministic mechanics?

    The view of the universe as a “cosmic quantum computer” implies that physical laws are not fixed, deterministic rules, but informational algorithms – flexible, emergent properties of a self-organising system.

    Physicist Paul Dirac suggested this idea as early as the 1930s, stating that the physical laws of the universe were not immutable truths “imprinted on the universe at birth.” Dirac noted: “At the beginning of time, the laws of nature were probably very different from what they are now. Thus, we should consider the laws of nature as continually changing with the epoch.”

    This idea of mutable laws is supported by the study of the universe’s origin:

    • The universe’s early state, a fraction of a second after the Big Bang, was incredibly compact – about 40 sextillionths (40 × 10⁻²¹) of a nanometre (10⁻⁹ m) in radius, a scale where classical laws break down.
    • Albert Einstein initially rejected the idea of an evolving universe, later calling his introduction of the cosmological constant – added to maintain a static model – his “biggest blunder”, demonstrating the historical resistance to non-fixed laws.
    • Stephen Hawking initially explored a multiverse cosmology (estimating 10⁵⁰⁰ universes with different laws) as a “bottom-up” approach to explain the existence of our life-friendly laws.

    Hawking later shifted, however, and, with Thomas Hertog, championed a top-down approach using quantum cosmology. By examining the very beginning of the universe, this approach reveals that time, space, and natural laws were not originally present. Our current laws and constants are understood as having emerged from quantum fluctuations and symmetry breaks in the earliest moments.

    The top-down approach effectively inverts the traditional hierarchy in physics: it rejects the idea of the universe as a machine ruled by fixed, a priori laws. Instead, it emphasises a vision of the universe as a self-organising entity in which emergent patterns appear, and the most common of these patterns are what we term the laws of physics.

    The key implication is that we can consider our physical laws as the informational algorithms or software that the universe runs on. Their specific configuration allows for complex information streams – including our own discussions and cognitive interactions – to shape our experienced reality.

    Entropy, information, and life

    Q. How does the concept of infotropy – the coupling of entropy and information – redefine the thermodynamic basis of life, and could this framework help explain the emergence of biological complexity?

    The concept of infotropy – the necessary coupling of information and entropy – redefines the thermodynamic basis of life by positing that life is an emergent property driven by the storage and manipulation of information within highly structured systems.

    Entropy, as ingeniously formulated by Ludwig Boltzmann, represents the microscopic complexity (or disorder) of a system. Boltzmann’s insight is that entropy is proportional to the logarithm of the number of microscopic arrangements of a system’s components that leave its macroscopic properties unchanged. This ties temperature directly to entropy, as thermal radiation arises from the vibration of an object’s internal constituents.
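    Boltzmann’s relation can be made concrete with a toy model (not from the article): a system of N two-state units, of which n are “excited”; any of the C(N, n) arrangements looks identical macroscopically. A minimal sketch in Python:

```python
import math

# Boltzmann's relation S = k_B * ln(W): entropy counts the microscopic
# arrangements W compatible with a single macroscopic state.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(n_units: int, n_excited: int) -> float:
    """Entropy in J/K of a macrostate realised by C(N, n) microstates."""
    w = math.comb(n_units, n_excited)
    return K_B * math.log(w)

# Doubling the system size roughly doubles the entropy (extensivity),
# which is exactly why the logarithm appears in Boltzmann's formula.
s1 = boltzmann_entropy(100, 50)
s2 = boltzmann_entropy(200, 100)
print(s1, s2, s2 / s1)  # ratio close to 2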

    This measure of microscopic arrangement is closely related to information. A higher entropy means that more information can be stored in the microscopic details of a system.

    • Black holes demonstrate this concept on a cosmic scale. Building on Jacob Bekenstein’s work, Stephen Hawking proved that black holes have entropy and a temperature; the formula for Hawking radiation (inscribed on his tombstone) shows that astrophysical black holes are incredibly cold – below 0.0000001 Kelvin, far colder than the 2.7 Kelvin of the Cosmic Microwave Background.
    • Black holes are considered the most efficient hard drives in the universe. The estimated 10⁸⁰ gigabytes of information contained in the Milky Way’s supermassive black hole, Sagittarius A*, dwarfs the data on all of Google’s servers – data that could itself easily fit into a black hole no bigger than a proton.
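    How cold is a supermassive black hole? The standard Hawking temperature formula, T = ħc³ / (8πGMk_B), can be evaluated for Sagittarius A*, assuming a mass of about 4.3 million solar masses (a commonly quoted figure, not stated in the article). A sketch:

```python
import math

# Hawking temperature T = hbar * c**3 / (8 * pi * G * M * k_B):
# the heavier the black hole, the colder it is.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23       # Boltzmann constant, J/K
M_SUN = 1.989e30         # solar mass, kg

def hawking_temperature(mass_kg: float) -> float:
    """Black hole temperature in Kelvin."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

# Assumed mass for Sagittarius A*: ~4.3 million solar masses.
t_sgr_a = hawking_temperature(4.3e6 * M_SUN)
print(t_sgr_a)  # on the order of 1e-14 K, vastly colder than the 2.7 K CMB
```

    The result, around 10⁻¹⁴ K, shows why such black holes currently absorb more radiation from the Cosmic Microwave Background than they emit.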

    The ultimate black hole is the Big Bang. Understanding the complexity and informational role of black holes may lead to a revolution in understanding information’s role in the universe, gravity, and quantum mechanics, potentially revealing our very origin – the universe’s way to think about itself.

    The formal link between entropy and information was established in the 1940s by Claude Shannon. Working at Bell Laboratories, Shannon introduced the concept of Shannon Entropy as a measure of information content. The number of microstates counted by the Boltzmann entropy reflects the amount of Shannon information required to specify any particular arrangement of constituents.
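    Shannon entropy is straightforward to compute from symbol frequencies; a minimal sketch (the DNA-style alphabet is chosen to connect with the next section):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniformly used four-letter alphabet (like DNA's A, C, G, T) carries
# exactly 2 bits per symbol; a repetitive, predictable message carries less.
print(shannon_entropy("ACGT" * 100))   # 2.0 bits per symbol
print(shannon_entropy("AAAAAAAACG"))   # lower: the message is predictable
```

    The maximum of 2 bits per symbol for a four-letter alphabet is why DNA is often described as storing 2 bits per nucleobase.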

    This specific arrangement of constituents is the essence of life. All life requires highly specific arrangements, and this information is stored and processed efficiently.

    • DNA stores life’s instructional code using four nucleobases: Adenine (A), Guanine (G), Cytosine (C), and Thymine (T) (with Uracil (U) replacing T in RNA). This code emphasises reproduction and adaptation.
    • The complexity extends beyond DNA/RNA to the 20 different amino acids that form proteins. Given that a moderate protein consists of about 200 amino acids, the number of mathematically possible proteins is vast – on the order of 10²⁶⁰.
    • With the estimated number of different proteins in Earth life being around one trillion (10¹²), this immense mathematical possibility reflects an ocean of creative potential for biological complexity, suggesting that the evolution of Earth life is only at the beginning of exploring the informational possibilities available.
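    The combinatorics above are easy to check directly – 20 amino acids in chains of roughly 200 residues give 20²⁰⁰ possible sequences:

```python
import math

# The article's estimate: 20 amino acids, a moderate protein of ~200
# residues, hence 20**200 possible sequences.
n_sequences = 20 ** 200
magnitude = math.floor(200 * math.log10(20))
print(magnitude)  # 260, i.e. on the order of 10**260 sequences

# The ~10**12 proteins known in Earth life explore only a vanishing
# fraction of this sequence space.
fraction_explored = 10**12 / n_sequences
print(fraction_explored)  # effectively zero
```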

    Life, therefore, can be viewed as matter + information, which successfully organises complexity and counters local entropy by utilising and storing vast amounts of specific information.

    Q. Given that consciousness may have evolved from primitive biochemical information transfer mechanisms, to what extent can it be described as an emergent property of informational complexity rather than a purely biological phenomenon?

    Consciousness can be described extensively as an emergent property of informational complexity, rather than purely biological, because its characteristics – especially its extreme energy efficiency and ability to interpret data – appear to be a highly evolved solution to a fundamental informational problem faced by both living and artificial systems.

    The contrast between digital and biological information processing highlights the efficiency of life:

    • Digital Systems: Our computers use the simplest signals – bits (yes/no) – and must constantly erase information as it moves between logic gates. Thermodynamics dictates that this erased information is dissipated as randomness, or thermal noise, increasing the computer’s entropy and thus its temperature.
      • This informational cost is why modern data centres consume tremendous amounts of energy, largely for cooling.
      • The Landauer limit, calculated in 1961, defines the theoretical minimum energy needed to erase one bit of information. If current computers could achieve this limit, they would run on milliwatts of electricity – thousands of times less than a light bulb. Instead, they consume the equivalent of millions of light bulbs, representing a tremendous waste of energy compared to what nature has achieved.
    • Biological Systems: The information recording, storing, and processing in our brains is approximately six million times more energy-efficient than our best and most efficient computers. Furthermore, our biological “supercomputers” are capable of a task that current Artificial General Intelligence (AGI) and large language models (like ChatGPT) struggle with: the interpretation of information and assigning it meaning, which is necessary to interact successfully with the environment.
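    The Landauer limit itself is a one-line formula, E = k_B · T · ln 2 per erased bit. A sketch evaluating it at room temperature, with a hypothetical erase rate of 10¹⁸ bits per second (an assumed figure chosen for illustration, not from the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy_per_bit(temp_kelvin: float) -> float:
    """Theoretical minimum energy (J) to erase one bit: E = k_B * T * ln(2)."""
    return K_B * temp_kelvin * math.log(2)

e_bit = landauer_energy_per_bit(300.0)  # room temperature, ~3e-21 J per bit

# Hypothetical machine erasing 10**18 bits per second right at the limit:
power_watts = e_bit * 1e18
print(e_bit, power_watts)  # power lands in the milliwatt range
```

    Even at a billion billion erasures per second, the theoretical floor is milliwatts – which is what makes the megawatt consumption of real data centres look so wasteful by comparison.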

    This immense efficiency and interpretive capacity suggest that consciousness itself is an emergent phenomenon resulting from biological informational complexity. Even the fundamental complexity of a simple virus or bacterium involves energy-efficient computation. Consciousness, therefore, can be viewed as the ultimate emergent solution – an energy-efficient mechanism necessary for a living system to have a notion or awareness of its surroundings, interpret that information, and interact with the environment for survival. The phenomenon’s basis appears to be information processing itself, making the principles universally applicable even though its most complex known form is currently biological.

    The holographic mind and examining time

    Q. If perception itself is the result of neural interpretation of quantum information – where colour, sound, and touch are mental constructs – can human consciousness be viewed as a localised hologram of universal information?

    Yes, if perception is the neural interpretation of quantum information, human consciousness can be viewed as a localised hologram of universal information, operating within a universe that may itself be holographic. This concept is rooted in the physics of black hole entropy and the resulting Holographic Principle.

    The idea that information is stored on a surface, not in a volume, originated with the study of black holes:

    • Black Hole Entropy: In 1972, Jacob Bekenstein postulated that a black hole’s entropy is proportional to the surface area of its event horizon. Stephen Hawking’s subsequent discovery of Hawking radiation proved Bekenstein correct. This insight revealed black holes to be the most complex structures in nature, containing an enormous amount of information that Albert Einstein’s General Relativity had completely ignored.
    • The Paradox: The surprising fact that entropy is related to surface area and not volume contradicted common-sense expectations. For example, a library’s information content is measured by its volume, not just its exterior walls.
    • Planck Scale Information: To calculate a black hole’s information content, one divides the event horizon’s surface area into Planck areas (≈10⁻⁶⁶ cm²); the storage capacity corresponds to one-quarter of the surface area measured in these units. Each Planck cell can host one bit of quantum information, often pictured as two entangled particles, potentially answering a yes-or-no question about the black hole’s evolution.
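    This bit-counting recipe can be sketched numerically: the horizon area of a Schwarzschild black hole is A = 4πr_s² with r_s = 2GM/c², and the capacity is one bit per four Planck areas. A one-solar-mass black hole is used below purely as an illustrative choice:

```python
import math

# Holographic bit count: N = A / (4 * l_p**2), with horizon area
# A = 4 * pi * r_s**2 and Schwarzschild radius r_s = 2 * G * M / c**2.
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8         # speed of light, m/s
L_PLANCK = 1.616255e-35  # Planck length, m
M_SUN = 1.989e30         # solar mass, kg

def horizon_bits(mass_kg: float) -> float:
    r_s = 2 * G * mass_kg / C**2     # Schwarzschild radius
    area = 4 * math.pi * r_s**2      # horizon surface area
    return area / (4 * L_PLANCK**2)  # one bit per four Planck areas

n_bits = horizon_bits(M_SUN)
print(math.floor(math.log10(n_bits)))  # 77: about 10**77 bits for one solar mass
```

    Because the radius grows linearly with mass, the bit count grows with the square of the mass – doubling a black hole’s mass quadruples its storage capacity, all of it accounted for by the two-dimensional horizon.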

    This structure was the first glimpse of holography in physics: the black hole’s storage capacity is determined by the size of its two-dimensional horizon, making the black hole appear as a hologram – like looking at the peel of a lemon that somehow contains the information of the whole fruit inside.

    The fundamental principle here is that information is conserved; the total amount of information in a system, like the universe, is always maintained, though its form may change (just as energy is conserved).

    Building on this, physicists Gerard ‘t Hooft and Leonard Susskind postulated the Holographic Principle: the universe, or at least a region of it, can be described as a hologram where all the information encoded in its volume is actually stored on a lower-dimensional boundary, such as a quarter of its de Sitter horizon. This is comparable to viewing an optical hologram, where two-dimensional data (like laser light passing through stripes on a film) creates a three-dimensional image that appears real.

    Since information is a basic constituent of the universe that provides instructions to biological systems on how to organise and construct, the complexity of life (including its inherent processing and interpretation capacities) creates mental states. These mental states – colour, sound, touch, and all other subjective experiences necessary for survival and evolution – can be seen as a local, temporary, and highly complex projection. In this analogy, the biological complexity of life and consciousness represents the laser light in our familiar optical holograms, which translates the universal information code into our three-dimensional, perceived reality.
