
Information-Triggered Collapse (ITC): An Information-Theoretic Approach to Wavefunction Reduction

Author’s Note: This paper is a personal thought experiment — a formalization of an intuition about the role of information in quantum collapse that I’ve carried with me for years. I first drafted it around 2022 and have refined it since, with substantial assistance from AI in both research and writing. I make no claim that this advances the frontier of physics; the ideas here are speculative, the Born rule derivation is heuristic at best, and the threshold I_c remains conveniently unspecified. For these reasons, I’ve chosen not to submit it to arXiv or to peer-reviewed journals. I’m publishing it here on PostQuantum.com simply to preserve it — and, if I’m being honest, so that search engines and the Wayback Machine have a timestamped record of it. Just in case, a few decades from now, someone demonstrates that wavefunction collapse really is an informational phase transition, I’d like to be able to say I wrote it down first. Consider this an open notebook entry, not a formal contribution to the literature.

(Updated January 2026 to incorporate new papers.)


Abstract

We propose a new theoretical framework, Information-Triggered Collapse (ITC), which suggests that quantum wavefunction collapse occurs when the information content or complexity of a quantum system and its environment reaches a critical threshold. This idea is motivated by the growing recognition of information as a fundamental physical quantity, as seen in concepts like Wheeler’s “it from bit” [Wheeler, 1990], Bekenstein’s bound [Bekenstein, 1981], and Quantum Darwinism [Zurek, 2009]. In this paper, we first review the intellectual backdrop that supports an information-based approach to the measurement problem, including key contributions from Landauer, Zurek, Grinbaum, and others. We then outline the formal postulates of ITC, defining an intrinsic informational threshold for quantum state reduction and offering a heuristic motivation for the Born rule from algorithmic information theory. While ITC is inspired by established physics, its core hypothesis—that collapse is triggered by a finite information threshold—remains untested and requires empirical verification. This work aims to stimulate further research into the role of information in quantum mechanics and to explore whether ITC can offer a viable path toward reconciling quantum theory with our classical experience of reality. If confirmed, ITC could provide new insights into the quantum-to-classical transition and have implications for quantum computing and the foundations of physics.

Introduction

The measurement problem of quantum mechanics – how definite outcomes emerge from the continuum of quantum possibilities – remains one of the most persistent foundational challenges. Standard quantum theory postulates that a measurement causes the wavefunction to collapse probabilistically into an eigenstate, but offers no deeper dynamical explanation for this process. A growing theme in modern quantum foundations is the idea that information may be the key to bridging this gap between quantum potentiality and classical reality. John A. Wheeler’s famous aphorism “it from bit” encapsulates the notion that every physical “it” (entity) fundamentally arises from binary information bits – yes/no questions asked of nature [Wheeler, 1990]. In Wheeler’s participatory universe picture, reality is not wholly objective but is co-defined by the information content of observations: “all things physical are information-theoretic in origin”. This provocative viewpoint suggests that quantum state collapse might itself be an information-driven phenomenon.

Subsequent developments in physics have underscored profound links between information and physical law. In black hole thermodynamics, for example, Bekenstein’s bound limits the maximum entropy (information) that can be contained within a finite region of space with finite energy [Bekenstein, 1981]. Likewise, Landauer’s principle in thermodynamics establishes that the erasure of a single bit of information dissipates a minimum energy of $$E_{\text{min}} = k_B T \ln 2$$ as heat [Landauer, 1961]. Such results reinforce the sense that information is a physical quantity with real energetic and structural consequences. At the quantum-classical interface, Wojciech Zurek’s theory of Quantum Darwinism has further sharpened the role of information: the emergence of classical reality is explained as a selection of stable “pointer states” via the proliferation of redundant information about the system throughout its environment [Zurek, 2009]. In essence, environment-induced decoherence coupled with the spread of information (copies of the system’s state imprinted in many fragments of the environment) leads to an objective classical outcome that can be observed by many without further disturbance. This is a naturalistic, information-centric account of effective wavefunction collapse – one that approaches the measurement problem from an evolutionary perspective (states that broadcast information well “survive” to classical objectivity).

These developments motivate the central hypothesis of this paper: that information itself could be the trigger for quantum collapse. We propose that there exists an intrinsic threshold of information (or complexity) such that when a quantum system’s state (including its entanglement with an environment) exceeds this threshold, the superposition can no longer sustain itself and collapses to a definite outcome. The collapse is thus an information-triggered phase transition in the quantum dynamics, rather than an ad hoc external projection. Crucially, this hypothesis could reconcile why everyday macroscopic systems (which rapidly generate large amounts of information via myriad interactions) appear to obey classical physics, while isolated microscopic systems can maintain quantum coherence.

In what follows, we develop the theory of Information-Triggered Collapse (ITC) in a self-contained but conceptually accessible manner. In Section 2, we review prior work that lays the groundwork: we discuss Wheeler’s “it from bit” idea, physical information limits by Bekenstein and Landauer, the Quantum Darwinism paradigm, Grinbaum’s proposals connecting quantum objectivity to algorithmic complexity, Ghidan’s Deterministic Photon Interaction Model (an explicitly information-based collapse proposal), and the informational viewpoints of QBism and Relational Quantum Mechanics. Section 3 introduces the formal postulates of ITC and describes the collapse mechanism as an informational threshold effect, including a heuristic motivation of the Born probability rule from principles of algorithmic information. In Section 4, we delineate experimental predictions of ITC and suggest how it might be falsified or constrained by current and future experiments – for example, through tests on macroscopic superpositions or controlled information flow in interference setups. Section 5 provides a discussion of the implications of ITC for quantum foundations (how it addresses the measurement problem and how it relates to existing interpretations) and for quantum computing (imposing potential fundamental limits on coherent quantum information processing). Finally, Section 6 summarizes our conclusions and highlights open questions for future research. Throughout, our aim is to maintain a clear and rigorous theoretical development, while emphasizing intuitive information-theoretic insights that could unite quantum mechanics with a deeper “it-from-bit” paradigm.

Prior Work and Theoretical Background

Wheeler’s “It from Bit”

The seed of viewing quantum mechanics through an informational lens can be traced to John Archibald Wheeler’s “it from bit” doctrine [Wheeler, 1990]. Wheeler suggested that physical reality at its core is comprised of information, writing that “every it – every particle, every field of force, even the spacetime continuum itself – derives its function, its meaning, its very existence entirely – even if in some contexts indirectly – from the apparatus-elicited answers to yes-or-no questions, binary choices, bits”. In this participatory view, an observer’s posed yes/no query to nature (a bit) is what brings about an observable reality (“it”). Reality thus arises from a network of binary observations, and information is elevated to a fundamental ontological status. By this reasoning, the indeterminacy of quantum phenomena might be understood not as randomness per se, but as reflecting an underlying informational substrate – the universe responds to questions (measurements) in a way that creates classical facts out of quantum possibilities. Wheeler’s idea, though philosophical, set the stage for later theories tying quantum mechanics to information theory. It implies that wavefunction collapse could be related to the acquisition of definite information (a specific yes/no answer) from a quantum system, rather than being a mysterious sui generis physical process. In the context of ITC, Wheeler’s insight encourages us to ask: could the act of acquiring a certain amount of information cause the quantum state to “firm up” into a single reality?

Bekenstein’s Bound on Information Content

If information is fundamental, one must consider how much information a given physical system can contain. The Bekenstein bound provides just such a limit: it states that the maximum entropy $$S$$ (informational content, in bits via $$H = \frac{S}{k_B \ln 2}$$) of a system with energy $$E$$ confined to a sphere of radius $$R$$ is finite, given by

$$$S \le \frac{2\pi k_B R E}{\hbar c}~, \tag{1}$$$

where $$k_B$$ is Boltzmann’s constant [Bekenstein, 1981]. In other words, there is an absolute upper bound on the number of bits that can be packed into a region of space of given size and total energy. Originally derived in the context of black hole thermodynamics (where a black hole saturates the bound, with entropy proportional to horizon area), the Bekenstein bound is widely interpreted as implying that the information storage capacity of any physical system is finite. Several caveats apply, however. In quantum field theory, naive ‘entropy in a region’ definitions are UV-divergent; careful formulations use vacuum-subtracted entropies and relate the bound to positivity of relative entropy (and associated modular Hamiltonians) [Casini, 2008]. More generally, the validity and operational meaning of Bekenstein-type bounds depend sensitively on how one defines the system, its entropy, and the reference vacuum. This matters if one wants to use entropy bounds as a dynamical input rather than as a heuristic motivation; in this work the bound is therefore used only as a heuristic indication that some notion of information capacity is finite, not as a rigorously derived dynamical law. It does not, by itself, imply that coherent superpositions ‘cannot exist’; rather, it motivates the idea that any collapse criterion framed in terms of information must specify which operational information is being counted (e.g., entropy of accessible records in an environment, or vacuum-subtracted entanglement measures in a region) and under what physical assumptions the bound applies. For example, a macroscopic Schrödinger cat state (enormous superposition complexity) might involve an entropy exceeding what the system can physically hold. In an ITC perspective, one might suspect that if a superposed state tried to carry more information (through ever more finely distinguished branches or outcomes) than allowed by Bekenstein’s bound, it would collapse or decohere, effectively “pruning” the overgrown state to respect the limit. Bekenstein’s bound thus provides inspiration for an informational collapse criterion: the wavefunction may self-reduce when its Shannon entropy (or algorithmic information) reaches a maximal value determined by physical parameters of the system.
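
To give a sense of the scales involved, the short script below evaluates Eq. (1) numerically and converts the result to bits. This is purely an illustration: the example systems, radii, and energies are arbitrary choices and play no role in the ITC framework.

```python
import numpy as np

# Physical constants (SI)
hbar = 1.054_571_817e-34   # J s
c    = 2.997_924_58e8      # m / s
k_B  = 1.380_649e-23       # J / K

def bekenstein_bits(radius_m, energy_J):
    """Upper bound on information content from Eq. (1), S <= 2*pi*k_B*R*E/(hbar*c),
    converted to bits via H = S / (k_B ln 2)."""
    S_max = 2 * np.pi * k_B * radius_m * energy_J / (hbar * c)
    return S_max / (k_B * np.log(2))

# Illustrative, order-of-magnitude examples only
examples = {
    "hydrogen atom (R ~ 0.5 A, E ~ m_p c^2)": (0.53e-10, 1.67262e-27 * c**2),
    "1 kg object of radius 10 cm (E = m c^2)": (0.10, 1.0 * c**2),
}
for label, (R, E) in examples.items():
    print(f"{label}: ~{bekenstein_bits(R, E):.2e} bits")
```

The numbers mainly illustrate how rapidly the bound grows with size and energy, from megabits for an atom to astronomically many bits for everyday objects.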

Landauer’s Principle and Information Cost

Closely related to the Bekenstein bound in spirit is Landauer’s principle, which connects information theory with thermodynamics. Rolf Landauer argued in 1961 that the erasure of information is an irreversible operation accompanied by an increase in entropy (and thus heat dissipation) in the environment [Landauer, 1961]. In its most famous form, Landauer’s principle states that erasing one bit of information costs a minimum energy

$$$E_{\min} = k_B T \ln 2,$$$

dissipated as heat at temperature $$T$$. Equivalently, resetting a memory register (0/1) to a standard state increases the entropy of the environment by at least $$\Delta S = k_B \ln 2$$. This principle has been experimentally verified to high precision in recent years: Bérut et al. performed the first direct test using a colloidal particle in an optical trap [Bérut et al., Nature, 2012]; Jun, Gavrilov and Bechhoefer achieved higher precision using a feedback trap [Jun et al., PRL, 2014]; Hong et al. tested it in nanomagnets [Hong et al., Science Advances, 2016]; and Yan et al. confirmed it in a quantum system (trapped ion) [Yan et al., PRL, 2018].
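
For concreteness, the snippet below evaluates the Landauer limit $$k_B T \ln 2$$ at a few representative temperatures. The chosen temperatures (room temperature, liquid helium, a dilution refrigerator) are illustrative and not tied to any of the experiments cited above.

```python
import numpy as np

k_B = 1.380_649e-23          # J / K
eV  = 1.602_176_634e-19      # J per electronvolt

def landauer_energy_J(T_kelvin):
    """Minimum heat dissipated by erasing one bit at temperature T: k_B * T * ln 2."""
    return k_B * T_kelvin * np.log(2)

for label, T in [("room temperature", 300.0),
                 ("liquid helium", 4.2),
                 ("dilution refrigerator", 0.02)]:
    E = landauer_energy_J(T)
    print(f"{label:>22} ({T:g} K): {E:.2e} J = {E / eV:.2e} eV per erased bit")
```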

In the quantum realm, Landauer’s insight implies that measurement (which can be seen as writing information to a memory or environment) must have physical effects. Landauer’s principle bounds the minimal work cost of logically irreversible operations—most notably the erasure/reset of a memory register—rather than implying that every act of measurement or correlation-formation must dissipate $$k_B T \ln 2$$ as heat. Measurement and recording can, in principle, be implemented reversibly; the unavoidable thermodynamic cost typically enters when one resets or compresses the device’s memory (or otherwise exports entropy to stabilize a classical record). More importantly, Landauer’s principle suggests a directionality: processes that increase entropy (like random thermal fluctuations) can destroy information (e.g. decoherence that disperses which-path knowledge into environment), whereas processes that gain information (like measurement) must pay an energy/entropy price. In the context of collapse, one might conjecture that as a quantum superposition spreads and entangles with an environment, producing more information in that environment (more entropy), at some point the cost of further information proliferation triggers a transition. This extrapolates Landauer’s erasure bound well beyond its proven domain: no existing theorem links information gain or mere entanglement directly to collapse, so ITC treats this as a speculative physical analogy rather than a consequence of Landauer’s principle. The ITC theory leverages this idea by positing a threshold beyond which the thermodynamic “cost” of maintaining coherent superposition (without irreversible entropy increase) cannot be sustained. Collapse then ensues, accompanied by the dissipation of the system’s quantum uncertainty as thermodynamic entropy in the environment. Landauer’s principle thus reinforces the intuitive picture that information gain (e.g. definitively distinguishing between branches of a superposition) is accompanied by an irreversible physical change – which we identify with wavefunction collapse.

Quantum Darwinism and Redundant Information

Quantum Darwinism (QD), developed by W. H. Zurek and colleagues [Zurek, Nature Physics, 2009], is a framework that describes how the classical world we perceive arises from the quantum substrate via the selective proliferation of information. In Quantum Darwinism, the environment plays the role of a communication channel, or a “witness,” that carries imprints of the quantum system’s state. Many fragments of the environment (for instance, the photons scattered from an object) redundantly encode the state of the system. Only certain preferred states – the pointer states – survive this imprinting: they are robust against decoherence and can be copied many times over into the environment. These states are “fit enough” (in an evolutionary sense) to become objective reality, since multiple observers can recover the same information by measuring different environment fragments, without perturbing the underlying system. QD thus explains effective collapse as a result of decoherence plus amplification of information: the wavefunction appears to collapse to a single state because the environment very quickly encodes one particular outcome in many independent copies, drowning out any trace of the alternative outcomes.

Notably, QD provides a framework for deriving the Born rule as well. Zurek has argued, using an “environment-assisted invariance” principle (envariance), that when many environment fragments redundantly record a system, the probabilities of outcomes (as judged by an observer sampling those records) follow the usual $$|\psi|^2$$ weighting [Zurek, PRL, 2003] [Zurek, Phys. Rev. A, 2005]. This derivation remains debated—Schlosshauer and Fine have identified implicit assumptions that some view as equivalent to the Born rule itself [Schlosshauer & Fine, Found. Phys., 2005]—but envariance nonetheless provides an important information-theoretic route to quantum probabilities. In essence, the states that are recorded with frequency proportional to $$|\psi|^2$$ are the ones that remain consistent across different environmental witnesses, and thus are stably classical. This Darwinian perspective aligns with an information-based collapse: the “collapse” is not a singular mystical event, but rather the natural result of information about the system disseminating into the environment. The ITC theory is compatible with this view, but goes a step further in positing a concrete criterion for when the quantum-to-classical transition happens: when the informational redundancy or acquired information exceeds a threshold. In other words, QD describes how environment information causes effective collapse; ITC aims to pinpoint when that transition becomes irreversible. Indeed, we might expect that once an environment has recorded on the order of a few bits of information about a quantum system (the threshold depending on system parameters), the superposition can no longer be maintained. This resonates with the QD idea that objective classical reality emerges when enough information has spread that no quantum interference between alternatives can be sustained. Experiments have begun to support QD: Unden et al. demonstrated that information about a nitrogen-vacancy center’s electron spin becomes redundantly encoded in nearby 13C nuclear spins, consistent with QD’s prediction that classical objectivity arises via many redundant environmental records [Unden et al., PRL, 2019]. Additional tests include photonic demonstrations [Ciampini et al., Nature Physics, 2018] and superconducting circuits [Zhu et al., Science Advances, 2025].
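
The cited experiments probe exactly this redundancy structure. As a minimal numerical sketch of the idea (our own toy model, not a reproduction of any cited experiment), the script below entangles one system qubit with $$N$$ environment qubits that each imperfectly record its pointer state, and computes the quantum mutual information $$I(S{:}F)$$ for environment fragments $$F$$ of increasing size; the branch weight, record-overlap angle, and environment size are arbitrary choices.

```python
import numpy as np

def entropy(rho, tol=1e-12):
    """Von Neumann entropy in bits."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > tol]
    return float(-(vals * np.log2(vals)).sum())

def reduced(psi_tensor, keep, n_qubits):
    """Reduced density matrix of the qubits in `keep`, tracing out the rest."""
    traced = [q for q in range(n_qubits) if q not in keep]
    rho = np.tensordot(psi_tensor, psi_tensor.conj(), axes=(traced, traced))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

# Illustrative parameters
N_env = 6
p0 = 0.7                          # Born weight of branch 0
theta = np.pi / 3                 # overlap angle of the two environment record states
c0, c1 = np.sqrt(p0), np.sqrt(1 - p0)

e0 = np.array([1.0, 0.0])                         # record of branch 0
e1 = np.array([np.cos(theta), np.sin(theta)])     # imperfect record of branch 1

def branch(sys_state, env_state):
    """System qubit followed by N_env identical environment records."""
    out = sys_state
    for _ in range(N_env):
        out = np.kron(out, env_state)
    return out

psi = c0 * branch(np.array([1.0, 0.0]), e0) + c1 * branch(np.array([0.0, 1.0]), e1)
psi = psi / np.linalg.norm(psi)
n_qubits = N_env + 1
psi_t = psi.reshape((2,) * n_qubits)

rho_S = reduced(psi_t, [0], n_qubits)
S_S = entropy(rho_S)
print(f"S(rho_S) = {S_S:.3f} bits (system entropy after decoherence by the records)")

for k in range(1, N_env + 1):
    frag = list(range(1, 1 + k))                  # first k environment qubits
    rho_F = reduced(psi_t, frag, n_qubits)
    rho_SF = reduced(psi_t, [0] + frag, n_qubits)
    mi = S_S + entropy(rho_F) - entropy(rho_SF)
    print(f"fragment of {k} qubit(s): I(S:F) = {mi:.3f} bits")
```

The output shows the characteristic Quantum Darwinism plateau: small fragments already carry nearly the full classical information $$S(\rho_S)$$ about the system, while only the complete (pure) environment carries the full quantum mutual information $$2S(\rho_S)$$.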

Grinbaum’s Algorithmic Complexity Criterion

Alexei Grinbaum has proposed a related perspective that explicitly invokes algorithmic complexity in defining when a quantum system’s behavior becomes effectively classical [Grinbaum, 2013]. In his model, different observers may not initially agree on what constitutes the “system” (since observers can partition the world in various ways), but he introduces a condition based on Kolmogorov complexity that yields an observer-independent element of reality. In particular, Grinbaum suggests that any physical system that is to be considered an objective element of reality (in the EPR sense) must have a description that is simple enough (in terms of algorithmic information) to be identical for all observers who are not too dissimilar. If an object’s description by one observer can be drastically different (in algorithmic complexity) from another observer’s description, then that object cannot be said to “exist” in a universal sense. To formalize this, he defines observers as system identification algorithms and imposes a condition on the growth of Kolmogorov complexity of a system’s description across a family of observers. In effect, there is a complexity threshold beyond which different observers would diverge on what the system is, so such high-complexity quantum superpositions never achieve consensus reality and are not experienced as objective.

Grinbaum’s approach yields a criterion for classicality: when the algorithmic information required to describe the system is small enough (or grows only slowly with observations), the system behaves classically for all practical observers. Conversely, a state of exceedingly high complexity (e.g. a huge entangled state) would fail the criterion – unless some mechanism (such as collapse) intervenes to simplify the state. Notably, Grinbaum also entertained the universal observer hypothesis (tracing the idea back to Everett, 1957, and providing his own information-theoretic formalization): that even a very small system (a molecule, a photon, etc.) with memory could serve as a “quantum observer” if it meets the information-theoretic requirements. This ties in with Relational QM ideas (discussed below) that observation is about interaction and correlation, not human consciousness. In summary, Grinbaum’s work suggests that quantum mechanics itself might impose a criterion on the algorithmic information content of physical states, and that beyond a certain complexity, states cannot be meaningfully realized as objective. This is conceptually very close to ITC’s conjecture that there may be a maximal sustainable complexity for a superposition. Strictly speaking, Kolmogorov complexity is uncomputable and depends on the chosen description language and basis; here it is used as a conceptual proxy for “algorithmic complexity” rather than as an operationally measurable quantity. The collapse in ITC can be viewed as a dynamical enforcement of a complexity criterion – when the Kolmogorov complexity $$K(\psi)$$ of the state $$|\psi\rangle$$ exceeds some critical value $$K_c$$, the state reduces to a simpler (lower $$K$$) mixture or eigenstate, ensuring that different observers can still consistently identify the outcome. In Section 3 we will make this notion more concrete and connect it to the Born rule quantitatively.
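
Since Kolmogorov complexity is uncomputable, any operational discussion must rely on proxies. The toy script below (a crude illustration, not part of Grinbaum’s formalism) uses a general-purpose compressor as an upper-bound proxy for description length: it compares the compressed size of a fixed-precision description of a maximally regular $$n$$-qubit product state with that of a Haar-random state of the same dimension. The discretization precision and the choice of zlib are arbitrary; the point is only that regular and generic states differ sharply under any such proxy.

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)

def compressed_bits(state, decimals=4):
    """Crude upper-bound proxy for algorithmic complexity: size (bits) of the
    zlib-compressed, fixed-precision text description of the amplitudes."""
    text = ",".join(f"{a.real:+.{decimals}f}{a.imag:+.{decimals}f}j" for a in state)
    return 8 * len(zlib.compress(text.encode(), level=9))

n = 10                                      # number of qubits
dim = 2 ** n

# Highly regular state: all 2^n amplitudes equal (a product of |+> states)
product = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Haar-random state: 2^n independent complex amplitudes
z = rng.normal(size=dim) + 1j * rng.normal(size=dim)
random_state = z / np.linalg.norm(z)

print(f"regular {n}-qubit product state : {compressed_bits(product):6d} bits (proxy)")
print(f"Haar-random {n}-qubit state     : {compressed_bits(random_state):6d} bits (proxy)")
```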

Ghidan’s Deterministic Photon Interaction Model (DPIM)

A recent speculative proposal by Florin Ghidan, the Deterministic Photon Interaction Model (DPIM), introduces an explicitly information- and entropy-based mechanism for wavefunction collapse. Note: DPIM has not been published in peer-reviewed journals; it is disseminated primarily via online articles and presentations, and its experimental claims have not been independently replicated. We include it here not as established physics but as an illustrative example of how informational and entropic considerations can be woven into a collapse mechanism. Ghidan’s DPIM posits that quantum particles (like photons) do not passively exist in superposition, but rather actively sense their environment’s entropy landscape and make deterministic choices about their path so as to minimize entropy production. In this model, collapse is not random but a process driven by entropy gradients and informational flows in spacetime. For example, in a double-slit experiment, a photon in DPIM does not truly go through both slits as a superposition. Instead, it “evaluates” the entropy and information structure of both possible paths and chooses one path at the detection screen based on a collapse criterion: it will collapse at a location that minimizes the local entropy gradient and aligns with the dominant information flow in the setup. Over many photons, an interference-like pattern can emerge, but the statistics are determined by these deterministic rules rather than fundamental randomness. If a which-slit detector is introduced, it alters the entropy landscape (breaking the symmetry between the paths) and the photon then consistently chooses one slit’s path, eliminating interference. DPIM thereby attributes the apparent randomness of quantum outcomes to complex but deterministic interactions with an “entropy field” permeating spacetime, effectively a hidden variable theory where the hidden variable is associated with local entropy conditions.

The relevance of DPIM to our work is that it treats wavefunction collapse as a physical process prompted by informational and entropic conditions, rather than a primitive stochastic jump. DPIM claims concrete predictions: for instance, it suggests that increasing the environment’s entropy around one path of an interferometer will bias the collapse outcomes. Ghidan reports preliminary experiments in which a heated plasma arc placed near one slit caused a shift in the fringe pattern by a few millimeters – consistent with photons favoring the lower-entropy path. Interpreting such fringe shifts as evidence for an entropy-driven collapse field requires ruling out standard optical effects such as thermal lensing, density-induced refractive-index gradients, and mechanical drift; this remains an open and critical question for DPIM’s experimental claims. These claims have not yet been independently replicated or subjected to peer review. While DPIM’s specifics (e.g. deterministic mechanics and lack of true superposition) differ from standard QM, it exemplifies an approach where information and entropy flows govern quantum behavior. ITC shares the spirit that thermodynamic and informational factors trigger collapse, though we stop short of claiming determinism or a full replacement of quantum probability. Instead, ITC retains the probabilistic Born rule but seeks to motivate it from complexity considerations (see Section 3.3). We also remain agnostic on underlying mechanism – DPIM invokes a literal “collapse field” $$\lambda(x,t)$$ in the environment, whereas ITC can be formulated within the existing quantum dynamics by adding an information-triggered stochastic term. Nevertheless, DPIM illustrates that one can construct a framework where entropy and information gradients drive collapse, reinforcing the plausibility of exploring the ITC approach.

The Diósi–Penrose Gravitational Collapse Model

Another instructive comparison for ITC is the Diósi–Penrose (DP) objective collapse model, which posits that superpositions collapse when the gravitational self-energy difference between the superposed states reaches a critical threshold [Diósi, 1989] [Penrose, 1996]. Specifically, Penrose argues that superpositions of two mass distributions differing by gravitational self-energy $$\Delta E_G$$ will collapse on a timescale $$\tau \sim \hbar / \Delta E_G$$. Like ITC, the DP model introduces a physical threshold—in its case gravitational rather than informational—beyond which superpositions become unsustainable. The DP model has been subjected to experimental tests, most notably through optomechanical proposals and spontaneous radiation searches (analogous to those constraining GRW/CSL). One could speculate that in a regime where gravitational self-energy and information content are correlated (e.g., more massive objects encode more spatial information in their superposition), the DP and ITC thresholds may be related—a connection worth exploring in future work.
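
For orientation, the snippet below evaluates the Penrose timescale $$\tau \sim \hbar / \Delta E_G$$ using the crude order-of-magnitude estimate $$\Delta E_G \sim G m^2 / R$$ for a body of mass $$m$$ and size $$R$$ superposed over a distance comparable to its own size. The masses and radii are hypothetical, and detailed DP analyses use more careful expressions for $$\Delta E_G$$.

```python
import numpy as np

hbar = 1.054_571_817e-34      # J s
G    = 6.674_30e-11           # m^3 kg^-1 s^-2

def dp_collapse_time(mass_kg, radius_m):
    """Order-of-magnitude Penrose timescale tau ~ hbar / Delta_E_G,
    with Delta_E_G ~ G m^2 / R (separation comparable to the size R)."""
    delta_E_G = G * mass_kg**2 / radius_m
    return hbar / delta_E_G

for label, m, R in [("electron over 1 nm", 9.11e-31, 1e-9),
                    ("100 nm silica nanosphere", 1e-18, 5e-8),
                    ("1 microgram dust grain", 1e-9, 1e-5)]:
    print(f"{label:>26}: tau ~ {dp_collapse_time(m, R):.2e} s")
```

With these illustrative numbers the timescale ranges from effectively never (an electron) through hours (a nanosphere-scale mass) down to far below a second (a microgram grain), which is what makes the DP model testable in principle with levitated optomechanical systems.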

QBism and Relational Quantum Mechanics

Not all information-based interpretations of quantum mechanics posit objective collapse; some instead reconceptualize what the quantum state is. Two notable viewpoints are QBism (Quantum Bayesianism) and Relational Quantum Mechanics (RQM), which, while distinct, both emphasize the role of the observer’s information. The brief sketches of QBism and RQM here follow the standard expositions by Fuchs et al. and Rovelli and are intended only to set conceptual context rather than to introduce novel claims.

QBism, championed by C. A. Fuchs, N. D. Mermin, and R. Schack [Fuchs, Mermin & Schack, Am. J. Phys., 2014], interprets the quantum wavefunction as an expression of an agent’s subjective beliefs (encoded as probabilities) about the outcomes of measurements, rather than a physical wave inhabiting reality. In QBism, the wavefunction collapse is literally the Bayesian updating of the agent’s state of knowledge upon obtaining new data (measurement result). It rejects the idea of an observer-independent quantum state. As Fuchs puts it, the question “Whose information?” has the answer “Mine!”, and “Information about what?” – “the consequences (for me) of my actions upon the physical system!” [Fuchs, 2010]. All probabilities in quantum mechanics are thus personal degrees of belief that an agent assigns, constrained only by rational consistency (the probability calculus and the Born rule seen as an extra normative rule). When an observation is made, the agent updates their probabilities (wavefunction collapse) – no physical “jump” occurs in an objective sense, only a change in the agent’s information. QBism situates the observer at the center and renders “collapse” a trivial process of learning, thus evading any physical collapse mechanism. However, QBism still regards quantum theory as useful because it encodes the agent’s expectations about experiences and how they should rationally change. The significance for ITC is mostly philosophical: QBism reminds us that collapse can be seen as information processing. While ITC posits an objective collapse triggered by information in the system, one could loosely say it happens when the universe itself has “learned” too much information about the system. In a way, ITC externalizes the observer-centric view: instead of each agent’s knowledge, we consider the information in the global environment as the driver of collapse. This is more realist than QBism, but shares the conviction that quantum probabilities (Born’s rule) have an information-theoretic origin.

Relational Quantum Mechanics (RQM), proposed by Carlo Rovelli [Rovelli, Int. J. Theor. Phys., 1996], also eliminates an observer-independent quantum state, but in a different manner. RQM posits that the values of quantum variables (and the collapse to those values) are always relative to some observing system. No absolute state of a system exists in isolation; system $$S$$ might have definite outcome $$O$$ relative to observer $$A$$, while another observer $$B$$ can still describe $$S$$ as being in a superposition. This seemingly paradoxical scenario is resolved by the fact that $$A$$ and $$B$$ cannot directly compare notes unless they interact – at which point one must consider the relational states including those interactions. In RQM, any physical interaction counts as an observation, and the information a system has about another is what defines outcomes. Rovelli writes: “all systems are assumed to be equivalent, there is no observer-observed distinction, and the theory describes only the information that systems have about each other”. Thus, a “collapse” is simply the establishment of a fact relative to an observer via an interaction (which is an exchange of information). This view dovetails with an information-triggered picture: once two systems have interacted and become correlated (have exchanged information), from each other’s perspective the relevant quantum state has collapsed to a definite correlation. RQM does not give a mechanism for global collapse – it effectively says every collapse is observer-dependent, and there is no contradiction as long as one does not demand a universal view where all collapses are accounted for simultaneously. The ITC theory, by contrast, is an attempt to define when a superposition becomes unsustainable for all observers because of the amount of information generated. We introduce an objective collapse process, but one that still honors the spirit that “facts are relative” by making collapse contingent on interactions (information gain). In a sense, ITC can be seen as adding a quantitative criterion to RQM: while RQM says any interaction yields a fact relative to the interacting system, ITC suggests that beyond a certain cumulative information, a fact becomes objective (relative to all observers, because the information is widespread and irreversible), and that is when a global collapse (classical fact) occurs.

In summary, QBism and RQM reinforce that quantum mechanics is deeply concerned with information and its distribution. They stop short of providing a dynamics of collapse, but they influence ITC by shifting the perspective: from an observer’s knowledge updating (QBism) and from relations between systems (RQM). ITC is an attempt to embed an objective collapse mechanism into an information-centric worldview: collapse occurs when information – in an objective, physical sense – has accumulated to a significant level in the relevant degrees of freedom. This stands on the shoulders of the prior works above. In the next section, we take these qualitative insights and formulate the theoretical core of Information-Triggered Collapse, complete with postulates and equations, aiming to show how the Born rule naturally emerges and how a collapse criterion can be consistently defined.

Information-Triggered Collapse: Theoretical Framework

Postulates and Quantum Dynamics with ITC

We now formulate the theory of Information-Triggered Collapse (ITC) as a set of modifications to the standard quantum postulates. The goal is to incorporate an objective collapse mechanism that is governed by an information-based criterion, while reducing to ordinary quantum mechanics (with unitary evolution and Born rule) in situations where the criterion is not met. We proceed by laying out the postulates:

  1. State Space and Unitary Evolution (Unmodified): Physical systems are described by state vectors in a Hilbert space $$\mathcal{H}$$ (or density operators on $$\mathcal{H}$$). In the absence of a collapse-triggering condition, the state $$|\Psi(t)\rangle$$ evolves according to the Schrödinger equation $$i\hbar \frac{d}{dt}|\Psi\rangle = \hat H |\Psi\rangle$$ (or more generally by unitary $$U(t)$$). This is the usual quantum dynamics, ensuring that ITC does not alter well-verified quantum behavior at small scales or short times.
  2. Information Measure: We associate to each quantum state $$|\Psi\rangle$$ an information measure $$I(|\Psi\rangle)$$ that quantifies the total information content or complexity of the state. In keeping with the spirit of the Bekenstein bound and Grinbaum’s work, we take $$I$$ to be related to the algorithmic information (Kolmogorov complexity) of the state; for theoretical clarity we will often speak of $$K(\Psi)$$, the number of bits in the shortest algorithm that outputs $$|\Psi\rangle$$ to some fixed precision. Because Kolmogorov complexity is formally uncomputable, for operational purposes we propose that $$I(|\Psi\rangle)$$ be approximated by the von Neumann entropy of the reduced density matrix (entanglement entropy) or by the quantum mutual information between the system and its environment, both of which are calculable within standard quantum information theory. For a pure state expanded in an orthonormal basis $$\{|u_i\rangle\}$$ relevant to a set of observers, $$|\Psi\rangle = \sum_{i} c_i |u_i\rangle$$, a convenient practical choice is $$I(|\Psi\rangle) = -\sum_i p_i \log_2 p_i$$ with $$p_i = |c_i|^2$$, the Shannon entropy of the state’s probability distribution in the preferred basis. (This is effectively the von Neumann entropy of the mixed state one would obtain by decohering the state instantly in $$\{|u_i\rangle\}$$.) In many cases $$K(\Psi)$$ will scale with the entanglement entropy or Shannon entropy of the state, so we shall not commit to one measure over the other – we only require that $$I(|\Psi\rangle)$$ increases with the number of effectively distinguishable outcomes or branches in the state. (A toy numerical sketch of this measure and the threshold check of Postulates 2–4 appears after this list.)
  3. Informational Threshold Postulate: There exists a critical information content $$I_c$$ (which may be a fundamental constant of nature or a function of system parameters such as mass, volume, or energy density—analogous to the mass-proportional coupling in CSL models) which we call the collapse threshold. Its precise form is an open question for the theory. If the information measure $$I(|\Psi(t)\rangle)$$ remains below $$I_c$$, quantum evolution is entirely unitary and no collapse occurs. When $$I(|\Psi(t)\rangle)$$ reaches or exceeds $$I_c$$, a dynamical collapse is induced, projecting the state onto a substate of significantly lower information content. In simplest terms, once the quantum state carries “too many bits” of information (according to the chosen measure), it spontaneously and rapidly reduces to a state that carries less information (typically an eigenstate or mixture with fewer superposed components). This can be viewed as the system “choosing an outcome” and discarding the excess information into some unrecoverable form (e.g. thermodynamic entropy). We expect $$I_c$$ to be large on a human scale (since everyday objects do collapse), but finite. The precise scale could be informed by Bekenstein’s bound: for example, one might postulate $$I_c$$ corresponds to the information equivalent of a mass of order the Planck mass in a volume of a cubic wavelength, etc., though here we remain general.
  4. Collapse Dynamics: When $$I = I_c$$ is reached, the unitary dynamics is interrupted by a stochastic reduction of the state. We posit that the collapse selects a particular outcome (or branch) with probability weights given by a rule we will specify (and show to correspond to the Born rule). Formally, in the preferred basis $$\{|u_i\rangle\}$$ – which in an idealized scenario would be the basis that diagonalizes whatever observable is being effectively measured by the environment’s information coupling – the state $$|\Psi\rangle = \sum_i c_i |u_i\rangle$$ will collapse to one of the $$|u_i\rangle$$ (or a narrow wave-packet around that basis state) with probability $$\Pr(i) = f(p_i, K_i)$$, where $$p_i = |c_i|^2$$ is the Born weight and $$K_i$$ is some measure of the complexity associated with the $$i$$th branch. The function $$f$$ is chosen such that it reproduces the Born rule; in fact, we will argue for $$f(p_i,K_i) = p_i$$ by invoking a principle of minimal algorithmic information (see next subsection). The collapse is assumed to be instantaneous (or very rapid) in the rest frame of the process, and non-unitary. It is an irreducible break in the evolution, resulting in an updated state $$|\Psi'\rangle = |u_j\rangle$$ with probability $$\Pr(j)$$. Immediately after collapse, the information measure $$I(|\Psi'\rangle)$$ should be much smaller (since the superposition is gone, $$|\Psi'\rangle$$ has zero entropy in the $$\{|u_i\rangle\}$$ basis). From that point, unitary evolution resumes on the new state (until perhaps another collapse threshold is reached, though typically one collapse suffices to produce a stable classical outcome).
  5. Preferred Basis and Environment Coupling: A common concern in collapse theories is: in what basis does the wavefunction collapse? In ITC, the answer is conceptually similar to that in decoherence theory – the preferred basis is determined by the dynamics of information acquisition, i.e. the basis in which the state was becoming highly differentiated and complex. Usually, this will be the pointer basis of the system as defined by its coupling to the environment (the basis that diagonalizes the interaction Hamiltonian or the density matrix after decoherence) [Zurek, Rev. Mod. Phys., 2003]. In practical terms, $$I(|\Psi\rangle)$$ may remain low until the system becomes entangled with many environmental degrees (each entanglement creating more branches). The basis in which $$I$$ is computed can be taken as the joint eigenbasis of observables that the environment fragments are measuring. For example, in a cloud chamber, the environment “measures” particle position via scattering; in that case, position basis is where the branching occurs, and collapse would happen to a narrow position state once the trajectory information crosses threshold. In summary, ITC does not pick an a priori fixed basis; like decoherence, it emerges from the interaction context which basis’ superpositions are unstable against information proliferation. We assume for this work that such a basis can be identified in the scenario of interest (this is a reasonable assumption in all familiar measurement situations thanks to decoherence theory).
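
As a concrete, if highly simplified, illustration of Postulates 2–4 (the toy sketch referenced in Postulate 2), the script below uses the preferred-basis Shannon entropy as the information measure, compares it to an arbitrarily chosen threshold $$I_c$$, and applies a Born-weighted stochastic reduction whenever the threshold is exceeded. The threshold value, branch counts, and amplitudes are all hypothetical; nothing here fixes the physical value of $$I_c$$.

```python
import numpy as np

rng = np.random.default_rng(42)

def shannon_bits(amplitudes):
    """Information measure of Postulate 2: Shannon entropy (bits) of p_i = |c_i|^2
    in the preferred basis."""
    p = np.abs(amplitudes) ** 2
    p = p / p.sum()
    nz = p[p > 1e-15]
    return float(-(nz * np.log2(nz)).sum())

def itc_step(amplitudes, I_c):
    """Postulates 3-4: if I(|Psi>) >= I_c, reduce to a single branch drawn with Born
    weights; otherwise leave the state untouched (unitary evolution would continue)."""
    I = shannon_bits(amplitudes)
    if I < I_c:
        return amplitudes, None, I
    p = np.abs(amplitudes) ** 2
    p = p / p.sum()
    outcome = rng.choice(len(p), p=p)            # stochastic reduction
    collapsed = np.zeros_like(amplitudes)
    collapsed[outcome] = 1.0
    return collapsed, outcome, I

I_c = 3.0                                         # purely illustrative threshold (bits)

# A 2-branch superposition stays well below the threshold ...
small = np.array([np.sqrt(0.7), np.sqrt(0.3)])
_, outcome, I = itc_step(small, I_c)
print(f"2-branch state:  I = {I:.2f} bits < I_c -> no collapse (outcome = {outcome})")

# ... while branching into 16 equally weighted alternatives pushes I past I_c.
big = np.full(16, 0.25)                           # |c_i|^2 = 1/16 for each branch, I = 4 bits
_, outcome, I = itc_step(big, I_c)
print(f"16-branch state: I = {I:.2f} bits >= I_c -> collapsed to branch {outcome}")
```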

These postulates modify standard quantum theory only when a state becomes very complex in the information sense. Most microscopic experiments with a few entangled particles produce extremely small $$I(|\Psi\rangle)$$ – well below any reasonable $$I_c$$ – so ITC would predict no deviation from ordinary quantum mechanics there. Macroscopic superpositions, however, produce enormous $$I$$ (growing roughly with number of particles or system size), so ITC predicts that such states cannot persist: a collapse will occur shortly after $$I$$ approaches $$I_c$$. This provides a natural solution to the measurement problem: any measuring apparatus (with its millions of degrees of freedom interacting with the quantum system) will generate a large $$I$$ as the pointer states in the apparatus become entangled with the system’s possible states. Once the joint system-apparatus state’s information content hits $$I_c$$, the state will reduce to one of the pointer states, which corresponds to a definite outcome on the apparatus display. The apparent “randomness” of which outcome occurs will be addressed by the Born rule derivation below.

It is worth noting that ITC in spirit has similarities to other objective collapse models like GRW (Ghirardi–Rimini–Weber) [Ghirardi, Rimini & Weber, Phys. Rev. D, 1986] or CSL (Continuous Spontaneous Localization) in that it introduces a new process breaking unitarity with some rate or threshold. However, unlike GRW which uses particle number or mass distribution as a trigger (with a fixed collapse rate per particle $$\lambda \sim 10^{-16}$$ s$$^{-1}$$ for GRW), ITC uses information content as the trigger. This means collapses are not happening uniformly all the time to all particles, but specifically target states that have become highly delocalized in information space. It also means that, for example, a single particle in a very delocalized superposition might not collapse until it interacts and spreads information (in contrast to GRW which would collapse any particle’s position spontaneously albeit rarely). The ITC mechanism is also non-local in the sense that $$I_c$$ could be exceeded by an entangled state of two distant particles, prompting a simultaneous collapse of the entangled pair. However, as with other collapse models, one must ensure no superluminal signaling is possible. In ITC, superluminal signaling is prevented because observers cannot choose to exceed the threshold in one branch versus another – the collapse happens unpredictably when the system as a whole reaches critical complexity, and the outcomes are random (so one cannot control the outcome at one location to send a message faster than light to another). Consistency with special relativity is maintained via the No-Communication Theorem: since the specific outcome of the collapse is fundamentally stochastic and cannot be controlled by an observer, no useful information can be transmitted faster than light, despite the non-local nature of the state reduction.

With the postulates in place, the next task is to articulate how the Born rule emerges in this framework and why we choose a particular probability weighting $$f(p_i, K_i)$$ for outcomes. We address that now.

Heuristic Motivation of Born’s Rule from Algorithmic Complexity

One of the most important requirements of any modified quantum theory is that it reproduces the standard quantum probabilities (Born’s rule $$P_i = |c_i|^2$$) to high accuracy in usual experiments, since Born’s rule has been confirmed in countless tests – including recent stringent tests of higher-order interference that bound any deviation to extremely small levels [Sinha et al., Science, 2010] [Kauten et al., New J. Phys., 2017]. In ITC, we have introduced an objective collapse, but we have not fundamentally altered the unitary part of quantum mechanics; hence the structure of amplitudes and their squared moduli as “weights” for outcomes remains natural. The question is: can we motivate the $$|c_i|^2$$ rule from a principle involving information or complexity? This heuristic motivation is inspired by Zurek’s envariance [Zurek, 2005] and Solomonoff’s universal prior [Solomonoff, 1964], but tailored to ITC.

We propose the following rationale, drawing on ideas of algorithmic information theory and typicality. Suppose the state $$|\Psi\rangle = \sum_i c_i |u_i\rangle$$ has reached $$I_c$$. That means the superposition encodes a significant amount of information. We can think of each term (branch) $$|u_i\rangle$$ as a potential outcome with algorithmic information content $$K_i$$ (the complexity to describe that branch’s macroscopic specifics, for instance). There is a known concept in algorithmic information theory: the universal a priori probability (also called algorithmic probability or Solomonoff–Levin distribution), which assigns to any bitstring $$x$$ a probability approximately $$P(x) \propto 2^{-K_U(x)}$$, where $$K_U(x)$$ is the Kolmogorov complexity of $$x$$ relative to a universal Turing machine $$U$$ (more precisely, the algorithmic probability $$M(x) = \sum_p 2^{-|p|}$$ summed over all programs $$p$$ that output $$x$$; via Levin’s Coding Theorem, $$-\log_2 M(x) = K(x) + O(1)$$). Essentially, simpler (lower Kolmogorov complexity) outcomes are exponentially more likely a priori than complex ones. This distribution embodies Occam’s razor in a quantitative way – a random output of a universal Turing machine is overwhelmingly likely to be a simple one (because there are many short programs that produce simple outputs, whereas complex outputs require long specific programs and are thus rarer).

If we treat the various branches of the wavefunction as analogous to bitstrings (descriptions of different possible worlds or outcomes), a natural hypothesis is that the probability of a given branch is proportional to $$2^{-K_i}$$, where $$K_i = K(\text{branch}_i)$$ is the algorithmic complexity of that branch’s description. This would mean the quantum world inherently favors simpler outcomes – a kind of algorithmic Occam’s razor at the moment of collapse. However, this cannot be the whole story, because in quantum mechanics we know that even very “complex” outcomes (like a particle going through a very convoluted path) can occur if the amplitude supports it – complexity of the event itself does not directly suppress its probability except insofar as it affects the amplitude.

The resolution comes from recognizing that the amplitudes $$c_i$$ already encode much of the needed information. In fact, consider the case where the coefficients $$c_i$$ are rational or algebraic numbers so that their description length is finite. The state’s Kolmogorov complexity $$K(\Psi)$$ will include the description of each $$c_i$$. If one outcome has a much smaller $$|c_i|$$ (amplitude) than another, the state’s description could potentially be simpler if that small amplitude branch were absent – but since it is present, it contributes to the overall complexity in a way roughly proportional to the number of significant bits needed to specify $$|c_i|$$. A very small amplitude (say $$c_j = 0.000001 + 0i$$, i.e. $$10^{-6}$$) has a simple description but it contributes to $$I$$ mainly via the uncertainty it creates (if that branch is realized, it is a surprise of $$\log_2(1/p_j) = \log_2(10^{12}) \approx 40$$ bits, since $$p_j = |c_j|^2 = 10^{-12}$$). On the other hand, algorithmic complexity might regard a branch with amplitude zero as effectively absent.

To motivate Born’s rule within this framework, it helps to invoke a symmetry or indifference principle: in a situation of maximal symmetry or ignorance, outcomes should be equally likely. Consider $$N$$ equally likely orthogonal outcomes (e.g. an $$N$$-state system in state $$|\Psi\rangle = \frac{1}{\sqrt{N}}\sum_{i=1}^N |u_i\rangle$$). By symmetry, each outcome should have probability $$1/N$$. Our task is to show that the weighting $$2^{-K_i}$$ can reduce to $$|c_i|^2$$ in general, consistent with this symmetry. Indeed, if all $$c_i$$ are equal in magnitude $$|c_i|=1/\sqrt{N}$$, then $$K_i$$ for each branch might be considered equal (by symmetry the description complexities of each branch’s scenario could be identical). Then $$P(i) \propto 2^{-K_i}$$ gives equal probabilities for all $$i$$. So it works in that trivial equal-amplitude case.

Now consider a case where amplitudes differ. We propose that the effective complexity of a branch includes a term related to the improbability of that branch as specified by the amplitudes. In particular, imagine there is an underlying universal description of the quantum state such that each branch $$i$$ is annotated with its amplitude $$c_i$$. A concise way to encode a random outcome from the state is: first specify an index $$i$$ (outcome) and then verify it against the amplitude distribution. The information needed to specify $$i$$ given the distribution $$\{p_i = |c_i|^2\}$$ is $$-\log_2 p_i$$ bits (the Shannon self-information of outcome $$i$$). Thus one can argue that the algorithmic information required for outcome $$i$$ to occur is at least $$K_i + (-\log_2 p_i)$$. The first term is the intrinsic descriptive complexity of that outcome (e.g. a complicated detector pattern might have large $$K_i$$), and the second term is the “surprise” of obtaining that outcome given the weighted state (this term is small if $$p_i$$ is large, and large if $$p_i$$ is tiny). A rational principle is then to assign collapse probabilities in such a way as to minimize the introduction of additional algorithmic information. Nature, in effect, “chooses” an outcome in a way that adds the least new information beyond what was already in the wavefunction. If an outcome has a large amplitude $$p_i$$, then seeing that outcome is not surprising (low new information $$-\log p_i$$) – thus it should be relatively likely. If an outcome has extremely tiny $$p_j$$, then realizing that outcome would be a huge information surprise (many bits of novelty), thus it should be strongly suppressed. This aligns with the intuition behind $$P(i) \propto p_i$$: an outcome with ten times smaller Born weight $$p$$ is about $$\log_2 10 \approx 3.3$$ bits more surprising, and indeed quantum theory makes it ten times less likely to occur.

To formalize this, consider the collapse as a process that selects an index $$i$$ and thereby encodes that index for reality. The optimal coding (in the sense of minimal description length) to pick an index given probabilities $$p_i$$ is given by Shannon’s source coding theorem: it will on average cost $$H=-\sum p_i \log_2 p_i$$ bits, and each specific outcome $$i$$ has a code length (self-information) of $$\ell_i = -\log_2 p_i$$ bits. We can treat $$\ell_i$$ as an additional complexity cost if outcome $$i$$ is chosen. If Nature “wants” to minimize the algorithmic complexity of the realized world, it might then bias outcomes by a factor $$2^{-\ell_i} = 2^{\log_2 p_i} = p_i$$. In other words, the probability weight of branch $$i$$ should be proportional to $$2^{-K_i}$$ (favoring simple outcomes), times $$2^{-\ell_i}$$ (favoring outcomes that were already likely given the amplitudes). Multiplying these:

$$$P(i) \propto 2^{-K_i} \cdot 2^{-(-\log_2 p_i)} = 2^{-K_i} \cdot p_i.$$$

Now, if the intrinsic complexities $$K_i$$ of the branches are not wildly different (for example, in many experiments all relevant outcomes – say “dot on detector screen at position $$x_i$$” – have roughly similar descriptive complexity), then $$2^{-K_i}$$ will be approximately a constant factor for all $$i$$ and can be absorbed into the normalization. We then get

$$$P(i) \propto p_i = |c_i|^2,$$$

which is the Born rule.

We note an important subtlety: interpreting $$-\log_2 p_i$$ (where $$p_i = |c_i|^2$$) as a “surprise” or optimal code length inherently presumes $$|c_i|^2$$ acts as a prior probability weight, which is precisely what we are trying to motivate. However, treating the quantum amplitude-squared as a fundamental pre-measure of informational volume—analogous to the Haar measure on state space—allows us to map the state to an optimal prefix-free code. Nature, in effect, “chooses” an outcome that conserves this informational geometry without injecting arbitrary new algorithmic bias into the universe during the collapse phase transition. This reasoning is thus best understood as a consistency argument rather than a logically independent derivation.

In cases where branches do have very different intrinsic complexities $$K_i$$, this formula suggests a slight deviation from Born’s rule: simpler outcomes would be favored a bit more than $$|c_i|^2$$ and complex outcomes a bit less. However, unless $$K_i$$ differences are enormous (multiple bits), the effect would be extremely small and likely unobservable – and it might be entirely washed out by the fact that $$I_c$$ is reached only when the total $$I$$ is huge, meaning the entropy $$H$$ is large and dominates any modest $$K_i$$ differences. Under a typicality assumption in complexity space—where the intrinsic descriptive complexities $$K_i$$ of macroscopically distinct branches are not systematically correlated with their Born weights $$p_i$$—the factor $$2^{-K_i}$$ becomes approximately constant across relevant outcomes and can be absorbed into the normalization, yielding $$P(i) \propto p_i = |c_i|^2$$.

This argument is admittedly heuristic, but it provides a conceptually satisfying link: the Born rule arises as the rule that no extra bias in complexity is introduced at collapse beyond what is already in the state. This aligns with Zurek’s derivation via envariance (environment-assisted invariance), which shows that probability assignments must be consistent with the symmetries of entangled states, providing an independent information-theoretic route to the Born rule [Zurek, 2005]. Equivalently, collapse is an information-conserving projection in the sense of algorithmic information: the probabilities are such that the average Shannon information of the outcome ($$H$$ bits) equals the pre-collapse entropy of the state, with no additional arbitrary entropy. If one tried a different rule, say $$P(i) \propto |c_i|^4$$ (just as a random example), then the act of collapse would on average produce a different amount of information than the prior entropic content of the state, which would be an anomaly with no justification in our informational framework.
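
This “no extra information on average” statement can be checked numerically. The short Monte Carlo below (our own sketch, with hypothetical branch weights) draws outcomes from a four-branch state and compares the average self-information $$-\log_2 p_i$$ of the realized outcomes with the pre-collapse entropy $$H$$, both under the Born rule and under the alternative $$P(i) \propto |c_i|^4$$ mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Born weights for a four-branch state (hypothetical numbers)
p = np.array([0.5, 0.25, 0.15, 0.10])
H = float(-(p * np.log2(p)).sum())            # pre-collapse Shannon entropy of the state

def mean_surprise(weights, n_samples=200_000):
    """Average self-information -log2(p_i) of outcomes drawn with the given weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    draws = rng.choice(len(p), size=n_samples, p=w)
    return float(np.mean(-np.log2(p[draws])))

born    = mean_surprise(p)        # collapse rule P(i) = p_i
quartic = mean_surprise(p ** 2)   # alternative rule P(i) proportional to p_i^2, i.e. |c_i|^4

print(f"pre-collapse entropy H              = {H:.3f} bits")
print(f"mean surprise under the Born rule   = {born:.3f} bits (converges to H)")
print(f"mean surprise under P(i) ~ |c_i|^4  = {quartic:.3f} bits (does not match H)")
```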

In summary, ITC yields the Born rule because the collapse criterion is tied to information, and the collapse probabilities are chosen to minimize the introduction of new information. In technical terms, we treat the quantum state’s amplitude-squared as a pre-measure of information (since $$-\log |c_i|^2$$ is the information content of outcome $$i$$ in bits). The collapse is like a measurement drawn from a distribution that already encodes those weights. We have effectively argued that algorithmic probability and quantum probability coincide up to details of branch complexity, i.e.

$$$P(i) = \frac{2^{-K_i}}{\sum_j 2^{-K_j}} \approx \frac{|c_i|^2}{\sum_j |c_j|^2} = |c_i|^2,$$$

because the state’s preparation ensures $$\frac{2^{-K_i}}{2^{-K_j}} \approx 1$$ for typical branches $$i,j$$ at the moment of collapse. The dominant factor thus becomes the $$|c_i|^2$$. This connection between algorithmic probability and the Born rule is a novel observation of ITC: it suggests the Born rule is not just a mysterious quantum postulate, but an emergent rule that balances simplicity and surprise in such a way as to conserve overall information. Indeed, if one imagines that the universe’s wavefunction is like a computer simulation generating reality, then the rule $$P(i) = |c_i|^2$$ ensures that the “program” selecting outcomes does so in an algorithmically optimal way (no wasted randomness or bias). This provides a satisfying interpretation for the otherwise puzzling quadratic nature of quantum probabilities.

Collapse as an Informational Phase Transition

Having established the collapse criterion and the probability rule, it is worth interpreting the collapse in physical terms. We can think of the wavefunction’s evolution as analogous to a phase transition triggered by information. When $$I(|\Psi\rangle)$$ is low, the state is in a “quantum phase” where superposition is maintained (coherent phase). As $$I$$ grows (through unitary entangling interactions, etc.), the state approaches a critical point. At $$I = I_c$$, a rapid, non-linear transition occurs: coherence between branches breaks down, and the system “chooses” a branch (entering a classical phase where that outcome is realized). This is reminiscent of thermodynamic phase transitions: e.g. supersaturated vapor suddenly condensing into droplets when a threshold is crossed, or a magnet spontaneously aligning spins beyond a critical cooling point. In those cases, a small fluctuation can decide the symmetry-breaking choice (which droplet forms, or which direction magnetization takes). In the quantum collapse, the role of fluctuations is played by the inherently probabilistic selection of the outcome branch. One could imagine that underlying quantum zero-point fluctuations or other microscopic noise gives the slight nudge that pushes the system into one branch or another once it becomes critical. ITC does not attempt to derive that from first principles – it enters as the stochastic element in the postulates – but it is conceptually akin to saying: when the information load is too high, the system must shed information in the form of entropy (lost coherence), randomly collapsing to one of the permissible low-information states.

Mathematically, one might attempt to model the collapse process with a stochastic modification of the Schrödinger equation, similar to Continuous Spontaneous Localization (CSL) models. A crucial constraint is that any such modification must be stochastic rather than purely deterministic: Gisin (1989) proved rigorously that deterministic non-linear extensions of the Schrödinger equation inevitably allow superluminal signaling between entangled particles [Gisin, 1989]. Using Itô calculus, the state evolution could be described by:

$$$d|\Psi_t\rangle = \left[ -\frac{i}{\hbar}\hat{H}\, dt - \frac{\gamma}{2}\Theta(I - I_c) (\hat{F} - \langle \hat{F} \rangle_t)^2\, dt + \sqrt{\gamma}\,\Theta(I - I_c)(\hat{F} - \langle \hat{F} \rangle_t)\, dW_t \right] |\Psi_t\rangle$$$

where $$\Theta$$ is a Heaviside step function turning on when $$I > I_c$$, $$\hat{F}$$ is the pointer observable driving the collapse (analogous to position in CSL), $$\gamma$$ is a collapse rate parameter, and $$dW_t$$ is a Wiener process (noise term). The inclusion of the stochastic noise term ensures the no-signaling theorem is strictly respected while naturally recovering Born rule statistics. In the pre-critical regime, $$\Theta = 0$$ and we have normal Schrödinger evolution. Once $$I > I_c$$, the stochastic non-linear term kicks in, rapidly localizing the state. The exact formulation of such a collapse operator is beyond our scope, but it could be engineered to produce the right ensemble statistics (Born rule outcomes). This would put ITC on a similar footing to GRW/CSL, which add stochastic terms to the Schrödinger equation. The key difference is the trigger: $$\Theta(I - I_c)$$ ensures the term only matters at high complexity. This avoids the issue of continuous spontaneous localization at microscopic scales, which GRW handles by choosing a collapse rate tiny enough not to conflict with atomic physics – in ITC, microscopic systems simply never hit $$I_c$$, so they never collapse unless measured. For a relativistic extension, one could draw on Tumulka’s relativistic GRW framework [Tumulka, J. Stat. Phys., 2006], which could inspire an information-triggered variant.
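As a qualitative illustration (not the actual ITC dynamics, whose exact form we leave open), the following sketch integrates the post-threshold regime of the equation above for a single qubit using Euler–Maruyama steps, taking $$\hat{F} = \sigma_z$$, $$\hbar = 1$$, $$\hat{H} = 0$$, and $$\Theta = 1$$ (the threshold is assumed already crossed); all parameter values are arbitrary. Each trajectory localizes onto an eigenstate of $$\hat{F}$$, and the ensemble reproduces Born statistics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pointer observable F = sigma_z; we set hbar = 1 and H = 0 for clarity
F = np.diag([1.0, -1.0])
I2 = np.eye(2)
gamma, dt, nsteps = 1.0, 2e-3, 5000  # arbitrary illustrative values

def collapse_run(psi0):
    """Euler-Maruyama integration of the post-threshold regime (Theta = 1)."""
    psi = psi0.astype(complex)
    for _ in range(nsteps):
        fmean = np.real(psi.conj() @ F @ psi)   # <F>_t
        A = F - fmean * I2                      # (F - <F>_t)
        dW = rng.normal(0.0, np.sqrt(dt))       # Wiener increment
        psi = psi + (-0.5 * gamma * (A @ A) * dt
                     + np.sqrt(gamma) * A * dW) @ psi
        psi /= np.linalg.norm(psi)              # renormalize numerically
    return abs(psi[0]) ** 2                     # final weight on the +1 eigenstate

# |psi_0> = sqrt(0.7)|0> + sqrt(0.3)|1>: each run localizes to one branch,
# and roughly 70% of trajectories should end in |0> (Born statistics).
psi0 = np.array([np.sqrt(0.7), np.sqrt(0.3)])
outcomes = np.array([collapse_run(psi0) for _ in range(200)])
print("fraction collapsed to |0>:", np.mean(outcomes > 0.5))
```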

We emphasize that ITC’s collapse, while objective, is intrinsically tied to environment and context because typically only via entanglement with many degrees of freedom does $$I$$ grow large. A single isolated particle in a double-slit apparatus does not have a large $$I$$ – in fact, its state $$|\Psi\rangle = c_1|x_{\text{left}}\rangle + c_2|x_{\text{right}}\rangle$$ has an entropy $$H = -(|c_1|^2 \log |c_1|^2 + |c_2|^2 \log |c_2|^2)$$ of at most 1 bit (for equal superposition). This might be far below $$I_c$$ (which could be, say, on the order of $$10^{10}$$ bits for a system of Avogadro’s number of particles, as a guess). So the particle can maintain coherence through both slits – which is good, since interference is observed experimentally for single particles. If we add a detector at one slit (which is effectively an environment interaction), then $$I$$ increases because now the which-path info entangles with a macroscopic detector state (maybe adding tens of bits of entropy as the detector amplifies a signal). If that pushes $$I$$ close to $$I_c$$, collapse might occur: the particle path is decided (no superposition), giving a definite detection event and no interference. In usual lab situations, even a single bit of which-path info (detector click or no click) can be enough to destroy visible interference, because the entanglement with a macroscopic detector rapidly proliferates further information (the detector’s atoms, photons, etc., carry the info, quickly raising $$I$$ to huge values and forcing collapse).
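The entropy bookkeeping in this example is elementary; for concreteness, here is a minimal helper (illustrative only) that reproduces the at-most-one-bit figure for a two-branch superposition:

```python
import numpy as np

def branch_entropy(amplitudes):
    """Shannon entropy H, in bits, of the branch weights |c_i|^2."""
    p = np.abs(np.asarray(amplitudes, dtype=complex)) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Equal double-slit superposition: exactly 1 bit of which-path entropy
print(branch_entropy([1 / np.sqrt(2), 1 / np.sqrt(2)]))  # 1.0

# An unbalanced superposition carries less than 1 bit
print(branch_entropy([np.sqrt(0.9), np.sqrt(0.1)]))      # ~0.47
```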

In this way, ITC qualitatively agrees with both decoherence theory and common sense: monitoring a quantum system creates information and thus forces it into a classical outcome. What ITC adds is a quantitative threshold and an objective collapse where decoherence alone only provided effective diagonalization but not actual wavefunction reduction. It also suggests that even without an explicit measurement apparatus, sufficiently complex quantum states will spontaneously collapse (since the environment or internal interactions can serve as the “observer”). For example, a large molecule in a superposition of two locations might self-collapse because the information needed to describe its spatially extended state (all those atoms coherently in two places) might exceed $$I_c$$. This could tie into why macroscopic superpositions are not seen – not just because of fast decoherence, but because there may be a fundamental collapse kicking in.

In the next section, we explore the empirical consequences of this theory: what distinctive predictions it makes and how one might test or constrain it through experiments.

Experimental Predictions and Tests of ITC

A compelling feature of objective collapse theories is that they are, in principle, testable – they introduce deviations from standard quantum mechanics that can be sought in carefully designed experiments. The Information-Triggered Collapse theory is no exception: by tying collapse to a quantitative information threshold, it makes several predictions that depart from orthodox quantum theory (which lacks any such threshold).

Macroscopic Superposition Tests

ITC Prediction 1: There is a maximum size/complexity of superposition that can be sustained. Beyond that scale, superpositions will collapse spontaneously. This manifests as an effective limit on macroscopic coherence. In practice, as one increases the number of particles or degrees of freedom in an entangled superposition, one should find a point at which interference visibility or coherence signals begin to deteriorate faster than expected from environmental decoherence alone – indicating an intrinsic collapse. This is analogous to how continuous-collapse models (like CSL) predict a mass-dependent collapse rate that becomes significant for large masses. Here the dependence is on information content: a state with entropy $$H$$ bits is predicted to have a finite lifetime if $$H$$ is a sizable fraction of $$I_c$$. If we denote by $$\tau_{\text{decay}}(H)$$ the timescale for spontaneous collapse of a state with information $$H$$, we expect $$\tau_{\text{decay}}$$ to decrease sharply as $$H \to I_c$$. States that nearly saturate the information bound $$I_c$$ would collapse almost immediately (sub-nanosecond), whereas states with $$H \ll I_c$$ could live essentially indefinitely (thus recovering the stability of small quantum superpositions).
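ITC does not fix a functional form for $$\tau_{\text{decay}}(H)$$. Purely for illustration, one could adopt an exponential ansatz $$\tau(H) = \tau_{\min}\, e^{\beta (I_c - H)}$$, where $$\tau_{\min}$$ and $$\beta$$ are hypothetical parameters chosen only to exhibit the qualitative behavior just described; nothing in ITC singles out this form. Working in $$\log_{10}$$ avoids numerical overflow for $$H \ll I_c$$:

```python
import numpy as np

def log10_tau_decay(H, I_c, tau_min=1e-9, beta=0.1):
    """log10 of an illustrative collapse lifetime (seconds) for H bits.

    Assumes tau(H) = tau_min * exp(beta * (I_c - H)); tau_min and beta
    are hypothetical parameters, not quantities derived from ITC.
    """
    return np.log10(tau_min) + beta * (I_c - H) / np.log(10)

I_c = 1e6  # hypothetical threshold, in bits
for H in (1.0, 0.99 * I_c, I_c - 10.0, I_c):
    print(f"H = {H:12.1f} bits  ->  tau ~ 10^{log10_tau_decay(H, I_c):.1f} s")
```

With these placeholder numbers, a state carrying 1 bit would live for roughly $$10^{43000}$$ s (effectively forever), while a state at the threshold would collapse on the assumed sub-nanosecond floor.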

Experiments that create large superpositions can test this. For example, matter-wave interferometry with massive objects has been progressing: interference has been observed with molecules comprising up to ~2000 atoms (mass $$\sim 25{,}000$$ amu) [Fein et al., Nature Physics, 2019], corresponding to de Broglie wavelengths of only tens of femtometers, with coherence maintained over path separations many orders of magnitude larger. Thus far, these results show no anomalous loss of coherence beyond what is attributable to ordinary decoherence and experimental noise. This implies that if ITC is correct, the threshold $$I_c$$ must lie above the information content of those superpositions. The interference of molecules with 2000 atoms still exhibits high-contrast fringes, placing a lower bound on $$I_c$$. Roughly estimating the information in such a state: if each molecule is delocalized into two paths, one can argue $$H \sim 1$$ bit (which-path info). But more precisely, each molecule has many internal degrees of freedom; however, those are not in superposition, only the center-of-mass is. So the relevant $$I$$ is small. ITC easily survives this – one would expect $$I_c \gg 1$$ bit. Future planned experiments aim to push interference to even larger masses (e.g. $$10^6$$ amu nanocrystals, or small viruses). If such experiments continue to see interference with no deviation, the $$I_c$$ threshold is pushed further out. On the other hand, if one encountered an unexpected loss of coherence (beyond environmental decoherence) as mass/complexity increases, it could be a sign of ITC.

One concrete proposal: create a highly complex, multipartite entangled state. If one creates a simple GHZ “cat state” of $$N$$ qubits, $$\frac{1}{\sqrt{2}}(|00\ldots 0\rangle + |11\ldots 1\rangle)$$, the intrinsic entanglement entropy of the superposition across any bipartition is only 1 ebit, regardless of how large $$N$$ is—meaning it could theoretically survive indefinitely in isolation under ITC. However, if one creates a state with volume-law entanglement (such as the highly scrambled output of a deep pseudo-random quantum circuit, or $$N$$ maximally entangled Bell pairs shared between system and environment), the entanglement entropy scales linearly, reaching roughly $$N$$ bits. Standard QM predicts this state remains coherent. ITC predicts that for an $$N$$ such that the state’s internal entropy $$H \approx I_c$$, spontaneous collapse will occur even in perfect isolation. For example, if $$I_c$$ were $$10^{6}$$ bits, a volume-law entangled state of a million qubits would be fundamentally impossible to sustain. While such a million-qubit state is far beyond current technology, the conceptual point is that there is a maximal allowed entanglement entropy. Multipartite entanglement has been demonstrated in many physical forms, but GHZ states at very large $$N$$ remain technologically challenging; recent GHZ records are at most in the $$\mathcal{O}(10^2)$$ range in some platforms (e.g. 120 qubits on IBM’s Heron R2 processor), with larger-$$N$$ demonstrations typically relying on other entanglement structures (e.g., spin-squeezed ensembles) rather than GHZ. These bounds are very loose compared to likely values (if ITC connects to Bekenstein’s bound, $$I_c$$ might be astronomically large, e.g. $$10^{45}$$ bits for a human-sized object). So detecting ITC might require going to truly extreme regimes.
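The contrast between the two entanglement structures is easy to check numerically. The sketch below computes bipartite entanglement entropy from the Schmidt spectrum for a 12-qubit GHZ state (exactly 1 ebit across the middle cut) and for a Haar-random state, whose half-chain entropy sits near the Page value of roughly $$N/2 - 0.72$$ bits; the qubit count and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def entanglement_entropy(psi, n_left):
    """Von Neumann entropy (bits) across a cut after the first n_left qubits,
    computed from the Schmidt (singular) values of the reshaped state."""
    s = np.linalg.svd(psi.reshape(2 ** n_left, -1), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

N, half = 12, 6

# GHZ state (|0...0> + |1...1>)/sqrt(2): exactly 1 ebit across any cut
ghz = np.zeros(2 ** N, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
print("GHZ   :", entanglement_entropy(ghz, half))   # 1.0

# Haar-random state: volume-law entropy near the Page value ~5.3 bits
psi = rng.normal(size=2 ** N) + 1j * rng.normal(size=2 ** N)
psi /= np.linalg.norm(psi)
print("random:", entanglement_entropy(psi, half))   # ~5.3
```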

ITC Prediction 2: Interference visibility or coherence time will drop precipitously when crossing the threshold. This is akin to a sharp phase transition: as some parameter (like number of particles, separation distance, or entanglement depth) increases, one might see a sudden change from quantum behavior to classical outcomes. For instance, imagine a progressively larger cat-state (superposition of two macroscopically distinct states). For small size (kitten state), coherence can be maintained for a short time; for a slightly larger cat, maybe still possible; but at some critical size, the state might basically collapse immediately (making it effectively impossible to observe interference at that scale). This could be probed indirectly by quantum optomechanics experiments that attempt to put mesoscopic objects (like micron-sized mirrors or nanocrystals) into superposition of different locations. If one plots coherence time vs. mass, standard decoherence theory predicts a smooth decrease (due to more rapid environmental decoherence). ITC would add the possibility of a more abrupt cutoff around the mass/complexity corresponding to $$I_c$$. Upcoming experiments with optically levitated nanoparticles in ultra-high vacuum (aiming to test superpositions of masses ~$$10^{9}$$ amu) may reach regimes where either we start seeing unexplained decoherence (hinting at collapse) or we continue to confirm quantum theory (pushing $$I_c$$ higher). Experiments such as MAQRO (a proposed space experiment for macroscopic superpositions [Kaltenbaek et al., Quantum Sci. Technol., 2023]) have the explicit aim of detecting spontaneous collapse or new decoherence at scales beyond what the environment alone would cause.

It should be noted that ITC, like other collapse models, must be carefully formulated to not conflict with sensitive tests of quantum linearity such as superpositions in SQUIDs, interferometry with large molecules, or the continued success of quantum computing algorithms as the number of qubits grows. So far, no violation of Born’s rule or unitarity has been observed in those domains beyond what is expected from known noise sources. This already implies any $$I_c$$ is quite high. For instance, multi-slit experiments have constrained any higher-order (third-order) interference term—quantified by the Sorkin parameter—to below roughly one part in $$10^{3}$$ of the expected pairwise interference [Kauten et al., New J. Phys., 2017], improving upon the earlier Sinha et al. bound of one part in $$10^{2}$$ [Sinha et al., Science, 2010]. This supports the exactness of the Born rule for low-dimensional quantum systems. ITC respects that because typical lab states have $$I \ll I_c$$. Only in extremely complex quantum states would one expect deviations.

Controlled Information Manipulation

A unique aspect of ITC is that it directly involves the concept of information gained by the environment. This suggests experiments where we deliberately manipulate the information flow to see its effect on collapse. Consider a quantum interferometer (double-slit or Mach-Zehnder) with a which-path detector that can be turned on or off. Normally, if the detector is on (acquiring path information), interference is destroyed; if off, interference is seen. In standard QM, if the detector information is erased (quantum eraser), interference is restored. In ITC, if collapse has objectively happened due to the detector, one might worry that erasing information later cannot undo the collapse. ITC’s collapse is, in principle, reversible only under the extremely stringent condition that the informational entropy of the entire causally connected region could be reduced below $$I_c$$—a thermodynamic impossibility for all practical purposes. Thus, for any realistic experimental scenario, collapse triggered by exceeding $$I_c$$ is effectively irreversible. This leads to a prediction: if one performs a quantum eraser in a scenario where enough info was recorded to trigger a collapse, the interference will not recover, unlike the usual prediction where it would (assuming ideal conditions).

For example, set up an entangled photon pair such that one photon goes through a double-slit (signal) and the other (idler) carries which-path info. If one fully measures the idler’s path (maximizing info), the signal collapses to a particle pattern (no interference). If one later “erases” the idler info (by measuring in a Fourier basis that doesn’t distinguish paths), in standard quantum theory the interference can be observed in coincidence counts. In ITC, if the idler’s which-path measurement already collapsed the signal (since environment got that info, perhaps crossing threshold for the two-photon system), then no interference can be restored – the interference pattern is lost even in coincidence subsamples. A carefully done quantum eraser experiment could thus in principle reveal an ITC effect if the initial which-path info was enough to cause an irreversible collapse. Realistically, in current quantum erasers the systems are small (single photons and an atom or so), $$I$$ is tiny, so ITC would behave just like QM (no collapse until observer finally measures). But one could conceive an eraser with a delayed-choice massive detector: e.g. a cat is set to detect a photon’s path (which would definitely cause collapse by ITC, as the cat’s state difference is enormous), and then one attempts to erase the info in the cat’s memory (a much harder task!). If one could do that (e.g. reverse the memory or swap the cat’s state coherently – highly sci-fi at this point), standard QM says the interference should come back. ITC says once the cat knew it, collapse happened and you cannot get the superposition back. While this is not feasible experimentally, it underscores a conceptual difference.

A more practical test could involve weak measurements. In weak measurement, one gathers partial information about a quantum system without fully collapsing it, and many repetitions build up the info. According to ITC, a series of weak measurements that cumulatively acquire information could at some point trigger collapse when the total info crosses $$I_c$$. One might see a sudden change in the system’s state statistics at that point. For example, consider performing repeated weak measurements of an observable – initially, the system remains coherent (since each measurement yields little information, $$I$$ accrues slowly), but after many such probes, if enough info has been accumulated (even if spread among many measuring devices), collapse might occur and subsequent probes show the system in a definite eigenstate consistently. This would manifest as a non-linear effect: the act of information gathering has a memory, unlike in standard QM where each weak measurement’s effect is independent aside from the gradual state update. ITC effectively says the universe keeps a running tally of information extracted; once the “budget” is used up, the wavefunction can’t sustain coherence. Detailed modeling of this could show up as deviations in weak measurement experiments or quantum Zeno effect setups (where frequent observations freeze evolution). Typically, the quantum Zeno effect implies that infinitely frequent observations can hold a state in an eigenstate. In ITC, if those observations are weak such that each alone wouldn’t collapse, one might not freeze the system until enough of them have contributed to a collapse. It’s a subtle difference, possibly too small to easily detect given a large $$I_c$$.
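A toy version of this “running tally” can be simulated directly. The sketch below applies repeated binary weak measurements (strength $$\epsilon$$) to a qubit, tallies the mutual information each record carries about the pointer basis, and applies a Born-weighted projection once the tally crosses a toy threshold. The threshold, strength, and trigger rule are all hypothetical choices for illustration, not ITC predictions.

```python
import numpy as np

rng = np.random.default_rng(7)

def hb(p):
    """Binary Shannon entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

eps = 0.05        # weak-measurement strength (hypothetical)
I_c_toy = 0.2     # toy information threshold, in bits (hypothetical)
psi = np.array([np.sqrt(0.7), np.sqrt(0.3)], dtype=complex)
tally = 0.0

for step in range(1, 10_001):
    p0 = abs(psi[0]) ** 2
    p_plus = (0.5 + eps) * p0 + (0.5 - eps) * (1 - p0)
    # mutual information (bits) this record carries about the pointer basis
    tally += hb(p_plus) - hb(0.5 + eps)
    plus = rng.random() < p_plus                 # sample the weak record
    k = np.sqrt([0.5 + eps, 0.5 - eps]) if plus else np.sqrt([0.5 - eps, 0.5 + eps])
    psi = k * psi                                # Kraus back-action
    psi /= np.linalg.norm(psi)
    if tally >= I_c_toy:                         # hypothetical ITC trigger
        pick0 = rng.random() < abs(psi[0]) ** 2  # Born-weighted projection
        psi = np.array([1, 0] if pick0 else [0, 1], dtype=complex)
        print(f"threshold crossed at step {step}; projected to |{0 if pick0 else 1}>")
        break
```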

Connection to Entropy and Thermodynamics

One intriguing prediction of ITC is that a collapsing wavefunction should produce a slight excess of entropy or energy in the environment. Since collapse is the removal of information from the system (the superposition alternatives are eliminated), that information (which can be quantified in bits) should go somewhere – likely becoming entropy in the environment. This could be analogous to a tiny burst of heat or radiation accompanying collapse. Some collapse models predict faint spontaneous radiation (e.g. spontaneous photon emission in CSL due to jitter of charged particles). ITC would predict something similar if, say, a large object’s wavefunction collapses, releasing of order $$k_B T \ln 2$$ energy per bit of entropy generated (by Landauer’s principle). If $$I_c$$ collapse happens, say with $$\Delta I$$ bits of info lost, then $$\Delta E \ge k_B T_{\text{env}} \ln 2 \cdot \Delta I$$ energy must be dumped into environment. For macroscopic collapses (like a measurement), this is negligible compared to normal dissipation (the measuring device already dissipates huge energy to amplify signal). But for carefully isolated systems, one might in theory look for anomalous heating or noise coincident with collapse events.
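The Landauer estimate is simple arithmetic; the snippet below evaluates $$\Delta E = k_B T \ln 2 \cdot \Delta I$$ for illustrative values (the $$10^{10}$$-bit figure is an arbitrary example, not a derived quantity):

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat(delta_I_bits, T_env):
    """Minimum heat (J) released when delta_I bits become environmental
    entropy at temperature T_env (Landauer's principle)."""
    return k_B * T_env * np.log(2) * delta_I_bits

print(landauer_heat(1, 300.0))     # one which-path bit at 300 K: ~2.9e-21 J
print(landauer_heat(1e10, 300.0))  # hypothetical 10^10-bit collapse: ~2.9e-11 J
```

Even a collapse shedding $$10^{10}$$ bits at room temperature would release only tens of picojoules at the Landauer floor, underscoring why such signals would be swamped in an ordinary measurement apparatus.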

It is important to note that existing experiments placing bounds on spontaneous collapse models (such as CSL) via X-ray emission from Germanium detectors [Donadi et al., Nature Physics, 2021], cold atom heating, or optomechanical noise already constrain the rate of such energy release. For ITC to remain viable, the threshold $$I_c$$ must be sufficiently high, or the collapse dynamics sufficiently rare, such that spontaneous emissions evade current detection limits. ITC primarily predicts observable effects during deliberate, macroscopic measurement events where information flow is engineered, rather than in isolated bulk matter at equilibrium.

A proposed experiment: an optomechanical resonator cooled to near its ground state, prepared in a superposition of two positions. If collapse occurs, perhaps a small phononic excitation or emission might be detectable as the energy of localizing the object. GRW-like models predict spontaneous heating of matter (from random collapses jostling atoms), which has been bounded by careful calorimetry on bulk germanium and similar systems. ITC’s effect might be episodic (heat is emitted only when large superpositions form and collapse). So a null test would have to involve creating those superpositions repeatedly and monitoring any unexplained energy release.

Another area: cosmology or astrophysics. If extremely large-scale quantum states (like fields in the early universe) underwent ITC collapse, there might be imprints. Some approaches to cosmic inflation, for instance, treat the generation of classical perturbations as the result of a “quantum-to-classical transition” of inflaton fluctuations – essentially, a spontaneous decoherence or collapse is sometimes invoked to get definite classical seeds. If ITC operates, the immense amount of information in a super-horizon entangled state of the inflaton could have triggered collapse, seeding classical density variations. While speculative, it is an interesting angle: the scale of $$I_c$$ might even be tuned such that cosmological data (the spectrum of perturbations) match what would occur if collapse happened at horizon crossing. This is beyond our scope, but it shows ITC might connect to big-picture questions as well.

Currently, the most practical tests of ITC reside in the laboratory: matter-wave interference of large objects, creation of increasingly entangled states, and precision tests for deviations from quantum linearity. If ITC is correct, eventually such experiments should hit a “glass wall” where going further yields diminishing coherence not explained by known decoherence sources. Conversely, if such experiments indefinitely scale up quantum phenomena with no surprises, then $$I_c$$ is effectively pushed to such high values that ITC becomes metaphysical (or simply false).

We should acknowledge that existing experiments already put a lower bound on $$I_c$$ that is quite high. For instance, consider the famous experiment by Arndt et al. with buckyball molecules (C$$_{60}$$) showing interference fringes [Arndt et al., Nature, 1999]. Each C$$_{60}$$ has many internal degrees of freedom, but those weren’t in superposition. The center-of-mass fringe pattern implies a coherent superposition across a path separation dictated by the diffraction grating (on the order of hundreds of nanometers), vastly exceeding the molecule’s thermal de Broglie wavelength (~picometers). The positional entropy of a single molecule across two slits is tiny ($$H \approx 1$$ bit for the center-of-mass superposition), implying $$I_c \gg 1$$ bit. Future experiments with 100 nm diameter silica spheres in superposition of spatially separated positions (on the order of their own diameter) could involve significant entropy (perhaps tens of bits when including all possible internal configurations correlated with position); future $$10^9$$ amu tests could probe $$I_c \approx 10^3$$–$$10^6$$ bits if internal modes entangle. All evidence so far points to quantum mechanics being intact up to those levels. If $$I_c$$ is astronomically large (e.g. $$10^{23}$$ bits, comparable to Avogadro’s number), then only truly macroscale events (like a cat or a measuring device interacting with something) trigger collapse – which is consistent with everyday observation (cats are never seen in superposition).

To summarize the experimental status: ITC does not conflict with any observed quantum phenomenon to date, but it provides a framework to understand the quantum-classical boundary and offers a target for future tests. Key predictions like a cutoff in superposition size, or irreversibility of collapse if too much info is gained, differentiate it from Many-Worlds or standard quantum theory with only decoherence. As technology pushes quantum coherence to larger scales, ITC will either be supported (by observation of an eventual breakdown of coherence) or increasingly constrained (if no breakdown is seen, $$I_c$$ must be larger). The next decade may bring interference of viruses, $$10^{8}$$-amu objects, or entanglement of $$10^5$$ qubits; these will be fertile grounds to watch for any slight anomalies.

Discussion and Implications

The Information-Triggered Collapse theory we have formulated has far-reaching implications for how we interpret quantum mechanics and for practical quantum technologies. We discuss these implications, as well as how ITC situates itself among the landscape of interpretations and what it means for the future of quantum science.

Resolution of the Measurement Problem

ITC offers a clear (if speculative) resolution of the infamous measurement problem: it posits an ontologically objective criterion – the information threshold $$I_c$$ – that delineates when a quantum measurement is effectively completed (wavefunction collapsed). Unlike interpretations that require an ill-defined Heisenberg cut or invoke consciousness to collapse the wavefunction, ITC grounds the transition in a physical, quantitative property (information). When an observer (or measuring device) interacts with a system, the entanglement and information exchange will eventually cross $$I_c$$, at which point the system’s state non-linearly and randomly reduces to a single outcome state. This removes ambiguity: in principle, one could calculate $$I(t)$$ during a measurement interaction and identify the moment of collapse (though in practice $$I_c$$ is huge, so the collapse may appear almost instantaneous on human timescales). The Schrödinger cat paradox is resolved because a cat, by virtue of being a macroscopic information-rich system, will cause any superposed trigger (alive vs dead state entangled with a decaying atom) to blow past $$I_c$$ almost immediately, ensuring the cat is never in an enduring superposition from any external perspective – it will collapse to either alive or dead (with appropriate probabilities) long before a human opens the box. In other words, ITC formalizes the intuitive notion that “large, thermodynamically irreversible amplification has happened, so the outcome is decided.” Here “irreversible amplification” essentially means a lot of information got generated (e.g. in the positions of millions of atoms in a Geiger counter or the neurons of a cat’s brain), irreversibly (it cannot be put back into a coherent superposition state – too much entropy generated).

Philosophically, ITC aligns with a realist stance: there is a single outcome that truly occurs upon collapse, not multiple worlds. Yet it avoids adding mysterious hidden variables or ad-hoc collapse rates; it builds on a widely accepted concept – information – to drive the process. One might even say ITC inherits the spirit of Bohr’s Copenhagen interpretation but with a twist: Bohr said measurement outcomes are definite when recorded in a classical apparatus; ITC says the apparatus becomes effectively classical (and outcome definite) once it has recorded $$\ge I_c$$ bits about the system. The “classical realm” is thus emergent and defined by the information threshold. This moves Copenhagen from an axiom (“classical devices measure quantum systems”) to a derivable consequence (“devices behave classically once they entangle enough degrees of freedom to hit $$I_c$$”). Within ITC, the Wigner’s friend scenario is resolved by the information threshold: if the friend’s measurement interaction generates sufficient informational entropy ($$I \geq I_c$$), an objective collapse occurs, rendering the friend’s outcome a single, factual reality for all subsequent observers, including Wigner. If the combined friend+system state remains below $$I_c$$, a global superposition could in principle persist, though this regime is likely inaccessible for macroscopic observers. In effect, ITC privileges an information macrorealism: once information is amplified to a macroscopic level, it is a single reality.

Comparison with Other Interpretations

We have touched on how ITC echoes elements of Quantum Darwinism, QBism, RQM, etc. It may be useful to situate it more explicitly. On the spectrum of interpretations, ITC is a dynamical collapse theory (like GRW/CSL) in that it modifies quantum dynamics to include an objective collapse. However, it is not ad hoc: it is guided by the principle that information has a physical effect. It draws strength from decoherence theory by using the idea that specific basis and redundant information are important, but it supplements decoherence (which alone does not select a single outcome) with a bona fide collapse at a quantifiable point. Thus, one can view ITC as a synthesis of decoherence + objective collapse, with the collapse trigger set by an information measure that decoherence naturally increases.

Compared to Many-Worlds (Everett): Many-Worlds denies collapse entirely, saying all outcomes exist in non-communicating branches. ITC instead says only one outcome becomes real, the others truly cease. A Many-Worlds proponent might ask: how does ITC decide which world to keep? The answer: probabilistically by Born’s rule as derived. Many-Worlds also faces the question of why we perceive probabilistic outcomes if the wavefunction just deterministically branches – essentially it lacks a built-in probability postulate and must derive the appearance of Born rule frequencies from decision theory or typicality arguments. ITC, by enforcing a single branch, directly uses the standard probability concepts and doesn’t have that conceptual gap. However, Many-Worlds has the advantage of not needing any new physics (just taking unitary quantum mechanics at face value). ITC does introduce a new element ($$I_c$$ and a collapse mechanism). One might consider ITC as adding a subtle “World Selection” mechanism on top of Many-Worlds: all branches start to form, but only until their mutual information with environment hits $$I_c$$, then only one branch remains in reality (others effectively get pruned out as they would require more info capacity than available). This hearkens back to Wheeler’s idea of a “genesis by information” – perhaps one could imagine that the universe, as a sort of computational entity, does not “compute” all branch outcomes beyond a certain complexity because that would violate the Bekenstein bound or overuse resources, so it randomly picks one branch to actually instantiate. In that sense, ITC could be a more palatable update to Everett: it doesn’t demand infinite real parallel worlds, only potential ones that collapse down to one actual world once enough complexity has built up.

Relative to Bohmian Mechanics (pilot-wave): Bohm’s theory has deterministic particle trajectories guided by a wave, avoiding collapse by having each particle always in one definite branch. That theory breaks the symmetry of the standard quantum formalism by privileging position, and it introduces hidden variables (the particle positions). ITC doesn’t have additional hidden variables beyond a threshold, and it is indeterministic (truly random collapse outcomes). It also doesn’t suffer from nonlocal trajectory issues or the need for a preferred foliation; however, ITC shares with Bohm a kind of realist desire: in both, at any moment there is a single reality (be it particle positions or a single branch after collapse). The difference is that in Bohm, quantum randomness is epistemic (due to unknown initial conditions of particles) whereas in ITC it is fundamental.

Compared to spontaneous collapse models (GRW): The GRW theory says each particle has a tiny probability per unit time to spontaneously collapse its wavefunction around some location (with a chosen localization width) [Ghirardi, Rimini & Weber, 1986]. Over many particles, this ensures macroscopic objects collapse quickly. ITC achieves a similar effect (macroscopic objects collapse) but via a collective criterion rather than independent collapses. GRW collapses could be triggered even in isolation, whereas ITC would not collapse a truly isolated single particle (unless it had an astronomically large superposition). So one difference is testable: GRW predicts, for instance, spontaneous X-ray emission from germanium due to sudden localization of electrons – experiments show no such radiation above certain levels, which has constrained GRW’s parameters [Donadi et al., 2021] [Majorana Collaboration, PRL, 2022]. ITC in contrast would predict no spontaneous collapse radiation for an isolated atom, thus evading that constraint entirely. Only when environment or measurement is involved does ITC collapse occur, so energy release would be entangled in that already large process (hence hard to isolate as a separate effect). This makes ITC harder to falsify via those spontaneous emission tests, but also arguably more in line with existing evidence (since no such spontaneous collapses have been seen). However, ITC might be falsified if we keep increasing isolated system complexity and never see collapse – whereas GRW has a definite collapse rate we can try to catch, ITC is more subtle and might require bigger systems to reveal itself, thus possibly eluding near-term tests.

| Model | Trigger | Objective Collapse? | Born Rule | Current Testability | Key Reference(s) |
|---|---|---|---|---|---|
| GRW / CSL | Spontaneous per particle | Yes | Postulated | Radiation searches, optomechanics | Ghirardi et al. 1986 |
| Diósi–Penrose | Gravitational self-energy | Yes | Postulated | Optomechanics, spontaneous radiation | Diósi 1989; Penrose 1996 |
| ITC (this work) | Information threshold $$I_c$$ | Yes | Heuristic motivation from algorithmic complexity | Macroscopic superpositions, controlled info flow | — |
| Quantum Darwinism | Redundant environmental records | Effective only | Derivable via envariance | NV centers, photonic & superconducting circuits | Zurek 2009 |
| QBism | Agent’s belief update | No | Normative rule | N/A (interpretational) | Fuchs et al. 2014 |
Table 1. Comparison of Information-Triggered Collapse (ITC) with major objective collapse models and information-centric interpretations of quantum mechanics. The table contrasts the collapse trigger (or equivalent mechanism), whether collapse is objective, the status of the Born rule, experimental testability, and representative references.

Implications for Quantum Computing and Technology

If ITC is true, it has sobering implications for the scalability of quantum computers and other quantum technologies that rely on coherent superposition of many degrees of freedom. These implications are, of course, contingent on the unknown value of $$I_c$$; if $$I_c$$ is sufficiently large (e.g., near the Bekenstein bound for the device), no practical limitation would arise. A large-scale fault-tolerant quantum computer requires maintaining coherence across thousands or millions of qubits through error correction. Such a state has a huge amount of entanglement entropy (the code space deliberately spreads information nonlocally). If the $$I_c$$ threshold is below the entropy involved in the entire computer’s state, ITC would predict an eventual collapse of the quantum state of the computer, essentially a catastrophic decoherence that no error-correcting code could fully avert (because it’s not due to local noise but a fundamental effect).

As an estimate, a quantum computer with $$N$$ qubits in a highly entangled state could have on the order of $$N$$ bits of entanglement entropy (in maximally mixed subsystems). If $$N$$ surpasses $$I_c$$, the computer cannot be in a pure state across those qubits – it will have collapsed to one of many potential outcomes (perhaps corresponding to a particular computational basis state or some classical mixture). This would mean a fundamental limit on quantum computing power. For example, if $$I_c = 10^{9}$$ bits (pure speculation), then a quantum computer with more than $$10^9$$ entangled qubits cannot operate in a fully quantum way – it will behave classically at that scale. If $$I_c$$ is extremely large (like $$10^{23}$$ bits, ~Avogadro’s number), then we might never reach it in engineering practice (a QC with $$10^{23}$$ qubits is far-fetched). But if $$I_c$$ is in the thousands or millions, it could be a roadblock.

Even before reaching a hard limit, ITC suggests that as we scale up quantum processors, we might see increasing unaccounted error rates or decoherence that are not explained by known noise but by an intrinsic instability of large entangled states. This could manifest as a kind of unexplained loss of fidelity when running algorithms that entangle too many qubits simultaneously. It would appear as if an unmodeled decoherence source kicks in beyond a circuit complexity threshold. This gives experimental quantum computing another reason to measure and profile errors carefully: if an error rate floor persists that grows with qubit number in a way not attributable to environment noise, it might hint at something like ITC at play.

On the other hand, if quantum computers can be scaled arbitrarily (with error correction holding things together) and they function as expected, one could argue that pushes any fundamental collapse scale even further out. For instance, a future quantum computer with $$10^6$$ physical qubits reliably entangled during operation (error-corrected) would indicate no ITC effect up to entropy of ~$$10^6$$ bits. That would be a profound confirmation of standard quantum mechanics at macroscopic levels. So quantum computing progress doubles as a test of macroscopic coherence.

For quantum communication (QKD, teleportation, etc.), ITC likely has minimal effect, as those protocols usually involve relatively small entangled states at any given time (pairs or a few qubits). But one niche: quantum networks envision entangling many nodes (like a GHZ state distributed across many users). Whether a, say, 1000-node state could collapse on its own depends on the information measure: as noted above, a pure GHZ state carries only ~1 bit of branch entropy regardless of node count, but realistic distribution entangles each node with local channels and environments, which could push $$I$$ toward the threshold. ITC might thus place a limit on the size of multi-party entangled states.

Another implication concerns quantum sensing: strategies like entangled sensor arrays (where many particles are entangled to get a sharper measurement) rely on large entangled states for their gain. If ITC limits such states beyond a certain size, it caps the achievable quantum advantage.

Conceptual and Philosophical Impact

If ITC – or something akin to it – turned out to be correct, it would underscore the notion that information is a physical entity on par with energy and momentum in dictating dynamics. It would cement the idea that there is no measurement problem per se; rather, quantum mechanics was incomplete without an information-based law. This resonates with the modern view in black hole physics (where information content cannot exceed area/4 in Planck units) and in holographic principle (where physics can be described by degrees of freedom on a lower-dimensional boundary, effectively linking info to spacetime geometry) [Bousso, 2002]. One might speculate that $$I_c$$ is not just a random constant but related to such principles – e.g. proportional to the number of degrees of freedom in a closed surface surrounding the system, or related to gravitational effects (Penrose had conjectured that gravity might cause collapse for superpositions above a certain mass difference [Penrose, 1996]). ITC as presented is agnostic to gravity, but one could imagine that if a superposition is massive enough to significantly curve spacetime differently in each branch, that might correspond to large information (since distinguishing the spacetime geometries requires info) and cause collapse. So ITC might be the emergent quantum side of some quantum gravity phenomenon.

It also touches on the arrow of time: collapse is a thermodynamically irreversible step (information turns to entropy). This might tie quantum collapse to the Second Law of thermodynamics. Some have argued that the increase of entropy (time’s arrow) is fundamentally linked to the collapse of the wavefunction (since measurements increase entropy). ITC certainly aligns with this: collapse happens basically when further unitary evolution would violate the second law by requiring more entropy capacity than available. It forces an “update” that increases entropy (thus time’s arrow advances). This is a fertile ground for philosophical exploration: is the flow of time related to information-induced state reductions? Could $$I_c$$ be effectively a universal bound ensuring that overall entropy keeps increasing consistent with thermodynamics? These ideas remain hypothetical but echo holographic bounds [Bousso, 2002]. It is satisfying that ITC gives a collapse mechanism fully compatible with entropy considerations, unlike ad hoc collapse which doesn’t obviously obey or relate to thermodynamic principles.

Finally, ITC has an appealing intuitive narrative: in essence, “When enough bits are in play, quantum bits turn into classical bits.” It quantifies the folklore that “a measurement is an interaction that spreads information into the environment.” ITC says precisely how much is enough: $$I_c$$ bits. If future experiments were to verify something like this, it would profoundly impact our understanding of quantum reality, effectively elevating information to a dynamical agent that can change the course of physical evolution (not just passively record it). It would fulfill, in a sense, Wheeler’s vision of the universe fundamentally built from yes/no questions – with the twist that once too many answers are obtained, nature freezes the outcome (collapse) as a record.

Conclusion and Outlook

This work is theoretical and does not claim empirical support beyond inspiration from existing physics.

We have presented a theoretical framework in which quantum wavefunction collapse is not an ad hoc axiom but an emergent phenomenon triggered by information. By synthesizing insights from Wheeler’s “it from bit” doctrine [Wheeler, 1990], physical entropy bounds [Bekenstein, 1981], Landauer’s thermodynamic cost of information [Landauer, 1961], and modern decoherence theory [Zurek, 2003], the Information-Triggered Collapse (ITC) theory provides a cohesive picture: when a quantum system’s state becomes sufficiently entangled or differentiated that it contains more than $$I_c$$ bits of information (about the distinguishable outcomes within it), the superposition can no longer be sustained. At that point, a non-unitary collapse occurs, selecting a single branch with probability given by the Born rule, justified heuristically from algorithmic complexity considerations. This collapse is effectively an informational phase transition, turning a quantum state into a classical outcome once the “information pressure” exceeds a critical value.

ITC offers a clear resolution to the measurement problem – collapse is objective and tied to a quantitative condition, eliminating ambiguity about the role of observers or classical apparatus. It is a falsifiable theory: although $$I_c$$ is currently not known, we have outlined how experiments pushing the scale of quantum superposition can discover or constrain such a threshold. If in future experiments quantum interference and entanglement continue to hold up at larger and larger scales with no sign of collapse, then the $$I_c$$ threshold (if it exists) is simply pushed further, perhaps to a domain beyond foreseeable technology (e.g. astronomical or cosmological scales). Conversely, if a fundamental limit to coherence is observed, it could be interpreted as evidence for ITC or similar mechanisms. Current experimental evidence – from interference of massive molecules to tests of Born’s rule – is consistent with $$I_c$$ being very high, which in turn is consistent with why we don’t normally see quantum effects in macroscopic life (collapse happens well before that scale) yet we do see them in controlled microscopic setups.

Looking ahead, there are several open questions and avenues for further research:

  • Precise Value of $$I_c$$: What determines the numerical value of the information threshold $$I_c$$? Is it a new constant of nature to be measured, or can we derive it from more fundamental physics (perhaps relating to gravity, Planck scale, or cosmological parameters)? One speculative guess is to set $$I_c$$ such that when a system’s entropy reaches $$I_c$$, its gravitational self-collapse (à la Penrose’s gravity-induced collapse idea) would occur; this could tie $$I_c$$ to mass and radius. Alternatively, $$I_c$$ might correlate with the holographic information content of a region (so proportional to area in Planck units) – exploring such connections requires a foray into quantum gravity or black hole physics, which is beyond our scope but a tantalizing direction. Future work should focus on deriving $$I_c$$ from quantum gravity or testing via quantum simulators [Georgescu et al., Rev. Mod. Phys., 2014].
  • Dynamics of Collapse: We described collapse in an instantaneous way for simplicity. In reality, the transition might have a finite timescale. How fast does collapse proceed once $$I_c$$ is reached? Is it essentially immediate (relative to system dynamics), or could it be slow enough to observe intermediate states (“half-collapsed” density matrices)? A detailed dynamical model (e.g. a modified Schrödinger–Lindblad equation with an $$I$$-dependent term, as sketched in Section 3.3) would be valuable for making quantitative predictions (like exact decoherence rates for given systems). Developing such a model while keeping it consistent with conservation laws and relativity is an important technical challenge.
  • Relativistic and Field-Theoretic Generalization: We formulated ITC mostly in a non-relativistic quantum mechanics setting. For quantum field theory, one would need to define the information measure perhaps in terms of entropy in field modes or correlation functions. Does each localized region of space have its own $$I_c$$ (like Bekenstein bound suggests entropy limit per region)? One might need a relativistic invariant way to accumulate information (perhaps using the concept of mutual information across spacetime regions). Ensuring no superluminal signaling occurs with collapse in relativistic settings is also crucial – one may have to resort to spontaneous collapse formulations that treat different frames consistently, as in Tumulka’s relativistic GRW framework [Tumulka, 2006]. This is a complex but necessary step to make ITC a complete theory.
  • Connection to Entanglement Measures: Our use of Shannon entropy and algorithmic complexity is somewhat heuristic. In rigorous quantum information theory, one deals with von Neumann entropy and entanglement entropy for bipartitions. It would strengthen ITC to recast the criterion in those terms: e.g. “collapse occurs when the entanglement entropy between the system and some environment exceeds a certain size.” That might naturally pick out a preferred split (system vs environment) at which to apply the rule. Alternatively, maybe the quantum mutual information between different parts of the universal wavefunction reaches a value where the state effectively becomes classical (as mutual information high means records are imprinted). Formalizing these ideas could link ITC to existing theorems in quantum information (like quantum discord or spectrum broadcast structures in Quantum Darwinism).
  • Experimental Implementation of Toy Models: It could be useful to simulate an ITC-like collapse on a smaller scale using, for instance, a controlled system where we artificially introduce a collapse when a qubit becomes entangled with too many others (by monitoring entanglement entropy in a quantum simulator and then randomly projecting the state). This would not prove ITC in nature, but it could demonstrate the qualitative behavior (e.g. a sharp transition in observables) in a controlled setting. Some many-body quantum simulators can measure entanglement entropy or purity; one could envision an experiment that adds a feedback: when entropy hits a threshold, perform a projective measurement. Studying the outcomes might give insight into how an information-induced collapse would look and how it could be distinguished from decoherence noise. A minimal classical-simulation sketch of such a feedback loop appears just after this list.
  • Philosophical Implications: If ITC holds, it invites interpretation: is information playing an almost “agent-like” role in physics (collapsing wavefunctions as if nature is performing a computation and compressing data)? It also raises the question of whether the universe as a whole is information-limited. Could the cosmic expansion or other phenomena be related to the universe’s attempt to not exceed some total information bound (reminiscent of the holographic principle)? These remain speculative, but the fact that we might integrate information into core physical law would be a paradigm shift comparable to the integration of thermodynamics and statistical mechanics in the 19th century.
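To make the toy-model proposal above concrete, here is a minimal sketch of such a feedback loop, assuming nothing beyond standard quantum mechanics plus an artificial trigger: a small random circuit entangles $$N = 8$$ qubits, the half-chain entanglement entropy is monitored after each gate, and the state is projectively collapsed (with Born weights) once the entropy crosses an arbitrary toy threshold. Every parameter here is a placeholder, not a prediction.

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_unitary(d):
    """Haar-random d x d unitary via QR decomposition."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def half_chain_entropy(psi, N):
    """Entanglement entropy (bits) across the middle cut of N qubits."""
    s = np.linalg.svd(psi.reshape(2 ** (N // 2), -1), compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

def apply_pair_gate(psi, U, q, N):
    """Apply a two-qubit gate U to adjacent qubits (q, q+1)."""
    m = psi.reshape(2 ** q, 4, 2 ** (N - q - 2))
    return np.einsum('ab,ibj->iaj', U, m).reshape(-1)

N, threshold = 8, 2.5  # toy threshold in bits of half-chain entropy
psi = np.zeros(2 ** N, dtype=complex)
psi[0] = 1.0           # start in |00000000>

for step in range(1, 201):
    q = rng.integers(0, N - 1)                       # random adjacent pair
    psi = apply_pair_gate(psi, haar_unitary(4), q, N)
    S = half_chain_entropy(psi, N)
    if S >= threshold:                               # feedback trigger
        probs = np.abs(psi) ** 2
        outcome = rng.choice(2 ** N, p=probs / probs.sum())
        psi = np.zeros(2 ** N, dtype=complex)
        psi[outcome] = 1.0                           # Born-weighted projection
        print(f"step {step}: S = {S:.2f} bits; projected to |{outcome:0{N}b}>")
        break
```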

In conclusion, the theory of Information-Triggered Collapse provides a conceptually rich and promising route toward demystifying quantum measurement by weaving together threads from quantum mechanics, information theory, and thermodynamics. It remains firmly grounded in known physics where it should – respecting confirmed quantum predictions – while boldly venturing a new hypothesis about where standard physics might give way to new behavior. Whether or not ITC (in this precise form) is the final answer, the continued dialogue between quantum foundations and information theory seems destined to play a crucial role in our evolving understanding of reality. As Wheeler presciently suggested, information may not only explain quantum phenomena but also point the way to a deeper unity of physics. Future experimental tests and theoretical refinements of ideas like ITC will determine if indeed “it from bit” is more than a metaphor – potentially becoming a quantitative law of nature that triggers the collapse of the quantum wavefunction itself, once enough bits say so.

References

Arndt, M. et al. (1999). “Wave–particle duality of C60 molecules.” Nature 401, 680–682. doi:10.1038/44348
Bekenstein, J. D. (1981). “Universal upper bound on the entropy-to-energy ratio for bounded systems.” Phys. Rev. D 23, 287. doi:10.1103/PhysRevD.23.287
Bérut, A. et al. (2012). “Experimental verification of Landauer’s principle linking information and thermodynamics.” Nature 483, 187–189. doi:10.1038/nature10872
Bousso, R. (2002). “The holographic principle.” Rev. Mod. Phys. 74, 825. doi:10.1103/RevModPhys.74.825
Casini, H. (2008). “Relative entropy and the Bekenstein bound.” Class. Quantum Grav. 25, 205021. doi:10.1088/0264-9381/25/20/205021
Ciampini, M. A. et al. (2018). “Experimental signature of quantum Darwinism in photonic cluster states.” Phys. Rev. A 98, 020101(R). doi:10.1103/PhysRevA.98.020101
Diósi, L. (1989). “Models for universal reduction of macroscopic quantum fluctuations.” Phys. Rev. A 40, 1165. doi:10.1103/PhysRevA.40.1165
Donadi, S. et al. (2021). “Underground test of gravity-related wave function collapse.” Nature Physics 17, 74–78. doi:10.1038/s41567-020-01089-3
Fein, Y. Y. et al. (2019). “Quantum superposition of molecules beyond 25 kDa.” Nature Physics 15, 1242–1245. doi:10.1038/s41567-019-0663-9
Fuchs, C. A. (2010). “QBism, the Perimeter of Quantum Bayesianism.” arXiv:1003.5209.
Fuchs, C. A., Mermin, N. D. & Schack, R. (2014). “An introduction to QBism with an application to the locality of quantum mechanics.” Am. J. Phys. 82, 749. doi:10.1119/1.4874855
Georgescu, I. M., Ashhab, S. & Nori, F. (2014). “Quantum simulation.” Rev. Mod. Phys. 86, 153. doi:10.1103/RevModPhys.86.153
Ghirardi, G. C., Rimini, A. & Weber, T. (1986). “Unified dynamics for microscopic and macroscopic systems.” Phys. Rev. D 34, 470. doi:10.1103/PhysRevD.34.470
Gisin, N. (1989). “Stochastic quantum dynamics and relativity.” Helv. Phys. Acta 62, 363–371.
Grinbaum, A. (2013). “Quantum Observer and Kolmogorov Complexity.” In New Vistas on Old Problems, MPRL Proceedings. arXiv:1007.2756.
Hong, J. et al. (2016). “Experimental test of Landauer’s principle in single-bit operations on nanomagnetic memory bits.” Science Advances 2, e1501492. doi:10.1126/sciadv.1501492
Javadi-Abhari, A. et al. (2025). “Big cats: entanglement in 120 qubits and beyond.” arXiv preprint arXiv:2510.09520.
Jun, Y., Gavrilov, M. & Bechhoefer, J. (2014). “High-precision test of Landauer’s principle in a feedback trap.” PRL 113, 190601. doi:10.1103/PhysRevLett.113.190601
Kaltenbaek, R. et al. (2023). “MAQRO—Macroscopic Quantum Resonators.” Quantum Sci. Technol. 8, 014006. doi:10.1088/2058-9565/aca3cd
Kauten, T. et al. (2017). “Obtaining tight bounds on higher-order interferences with a 5-path interferometer.” New J. Phys. 19, 033017. doi:10.1088/1367-2630/aa5d98
Landauer, R. (1961). “Irreversibility and heat generation in the computing process.” IBM J. Res. Dev. 5, 183–191. doi:10.1147/rd.53.0183
Majorana Collaboration (2022). “Search for spontaneous radiation in the Majorana Demonstrator.” PRL 129, 080401. doi:10.1103/PhysRevLett.129.080401
Penrose, R. (1996). “On gravity’s role in quantum state reduction.” Gen. Relativ. Gravit. 28, 581. doi:10.1007/BF02105068
Rovelli, C. (1996). “Relational Quantum Mechanics.” Int. J. Theor. Phys. 35, 1637–1678. arXiv:quant-ph/9609002. doi:10.1007/BF02302261
Schlosshauer, M. & Fine, A. (2005). “On Zurek’s derivation of the Born rule.” Found. Phys. 35, 197–213. doi:10.1007/s10701-004-1941-6
Sinha, U. et al. (2010). “Ruling out multi-order interference in quantum mechanics.” Science 329, 418–421. doi:10.1126/science.1190545
Solomonoff, R. J. (1964). “A formal theory of inductive inference.” Inf. Control 7, 1–22. doi:10.1016/S0019-9958(64)90131-8
Tumulka, R. (2006). “A relativistic version of the Ghirardi–Rimini–Weber model.” J. Stat. Phys. 125, 821–840. doi:10.1007/s10955-006-9227-3
Unden, T. et al. (2019). “Revealing the emergence of classicality using nitrogen-vacancy centers.” PRL 123, 140402. doi:10.1103/PhysRevLett.123.140402
Wheeler, J. A. (1990). “Information, Physics, Quantum: The Search for Links.” In Complexity, Entropy, and the Physics of Information, SFI Studies.
Yan, L. L. et al. (2018). “Single-atom demonstration of the quantum Landauer principle.” PRL 120, 210601. doi:10.1103/PhysRevLett.120.210601
Zhu, K. et al. (2025). “Observation of quantum Darwinism and the origin of classicality with superconducting circuits.” Science Advances. doi:10.1126/sciadv.adx6857
Zurek, W. H. (2003). “Environment-assisted invariance, entanglement, and probabilities in quantum physics.” PRL 90, 120404. doi:10.1103/PhysRevLett.90.120404
Zurek, W. H. (2003). “Decoherence, einselection, and the quantum origins of the classical.” Rev. Mod. Phys. 75, 715. doi:10.1103/RevModPhys.75.715
Zurek, W. H. (2005). “Probabilities from entanglement, Born’s rule from envariance.” Phys. Rev. A 71, 052105. doi:10.1103/PhysRevA.71.052105
Zurek, W. H. (2009). “Quantum Darwinism.” Nature Physics 5, 181–188. doi:10.1038/nphys1202
