How Many Ways Can We Shuffle the Universe?
A standard deck of cards contains 52 cards. The number of possible shuffles is
$$ 52!, $$that is,
$$ 52 \cdot 51 \cdot 50 \cdot \ldots \cdot 3 \cdot 2 \cdot 1. $$This is already an enormous number:
$$ 52! \approx 8.07 \times 10^{67}. $$It is so large that any thoroughly shuffled deck of cards has, in all likelihood, never appeared in precisely that order before in the history of the universe.
This is a familiar example of how quickly combinatorics produces large numbers. A deck of cards is a small finite system. It has only 52 distinguishable objects. Yet the number of possible arrangements is already far beyond ordinary intuition.
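These card-counting claims are easy to verify directly. A minimal check in Python:

```python
import math

# Exact number of distinct orderings of a 52-card deck.
shuffles = math.factorial(52)

print(f"{shuffles:.2e}")   # about 8.07e+67
print(len(str(shuffles)))  # 68 decimal digits
```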
But the deck of cards raises a natural question.
If a deck can be shuffled in $52!$ ways, how many ways can the universe be “shuffled”?
Of course, this question is not precise as it stands. The universe is not literally a deck of cards. Its degrees of freedom are not ordinary objects placed in a line. Quantum theory, relativity, gravity, and the holographic principle all complicate the picture. Still, the question is useful because it gives us a way to think about the scale of possible physical configurations.
From cards to bits
Suppose a system has $N$ binary degrees of freedom. Each degree of freedom can be in one of two states, say
$$ 0 $$or
$$ 1. $$Then the number of possible configurations is
$$ 2^N. $$This is the basic combinatorial mechanism behind digital information. With one bit there are two possibilities. With two bits there are four. With ten bits there are
$$ 2^{10}=1024 $$possible states. With one hundred bits there are
$$ 2^{100} \approx 10^{30} $$possible states.
The important point is that the number of possible configurations grows exponentially with the number of degrees of freedom.
So if $N$ itself becomes cosmologically large, the number of possible states becomes unimaginably large.
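The doubling is easy to see concretely; a short Python sketch:

```python
# Configurations of N binary degrees of freedom: 2**N.
for n in (1, 2, 10, 100):
    print(n, 2**n)

# Each extra bit doubles the count; by n = 100 the total
# is already about 1.27e30 possible states.
```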
Googol and googolplex
A googol is
$$ 10^{100}. $$A googolplex is
$$ 10^{10^{100}}. $$A googol is a large number. A googolplex is of a different kind: it is a number whose exponent is itself a googol.
At first sight, a googolplex seems like a playful mathematical monster, a number invented to be absurdly large. But when we begin to count possible states of the universe, numbers of this general kind appear quite naturally.
This is because a system with $10^{100}$ binary degrees of freedom would have
$$ 2^{10^{100}} = 10^{(\log_{10}2)10^{100}} \approx 10^{0.301\cdot 10^{100}} $$possible states.
That is not exactly a googolplex, but it is a googolplex-like number: a power of ten whose exponent is itself of order $10^{100}$.
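The conversion from a power of two to a power of ten is just a multiplication in the exponent, which floating-point arithmetic handles comfortably even when $N$ is a googol:

```python
import math

# 2**N = 10**(N * log10(2)); for N = 10**100 (a googol):
N = 10.0**100
exponent10 = math.log10(2) * N
print(f"{exponent10:.3e}")  # about 3.010e+99, i.e. 0.301 * 10**100
```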
A naive Planck-scale estimate
One tempting approach is to imagine that space is divided into small “pixels” at the Planck scale.
The Planck length is approximately
$$ \ell_P \approx 1.6 \times 10^{-35}\,\mathrm{m}. $$The present comoving radius of the observable universe is roughly
$$ R \approx 4.4 \times 10^{26}\,\mathrm{m}. $$This is not simply the age of the universe multiplied by the speed of light. Because the universe has expanded, this radius is much larger than 13.8 billion light years: it is about 46.5 billion light years.
The number of Planck lengths across this radius is approximately
$$ \frac{R}{\ell_P} \approx 2.7 \times 10^{61}. $$If we naively imagine space as a three-dimensional grid of Planck-sized cells, then the number of spatial cells is approximately
$$ \left(\frac{R}{\ell_P}\right)^3 \approx (2.7\times 10^{61})^3 \approx 2.0\times 10^{184}. $$So the number of spatial Planck cells is of order
$$ 10^{184}. $$If each such cell could be in only two possible states, the number of possible spatial configurations would be
$$ 2^{2.0\times 10^{184}}. $$Written as a power of ten, this is
$$ 10^{(\log_{10}2)(2.0\times 10^{184})} \approx 10^{6.0\times 10^{183}}. $$This is already vastly larger than a googolplex.
If we only care about the rough family of the number, we may write it schematically as
$$ 10^{10^{184}}, $$but the more explicit estimate is closer to
$$ 10^{6\times 10^{183}}. $$The distinction is not physically important here, but it is worth being clear about the rounding.
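Using the rounded input values from the text, the whole spatial estimate fits in a few lines (the last digits differ slightly from the quoted figures because of where the rounding is done):

```python
import math

l_P = 1.6e-35   # Planck length in metres
R   = 4.4e26    # radius of the observable universe in metres

ratio = R / l_P                # Planck lengths across the radius, ~2.7e61
cells = ratio**3               # naive count of Planck-sized spatial cells, ~2e184
exp10 = math.log10(2) * cells  # 2**cells rewritten as 10**exp10, ~6e183

print(f"{ratio:.1e}  {cells:.1e}  {exp10:.1e}")
```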
Naive spacetime histories
If we also include time, the number becomes still larger.
The age of the universe is roughly
$$ 4.4 \times 10^{17}\,\mathrm{s}, $$while the Planck time is approximately
$$ 5.4 \times 10^{-44}\,\mathrm{s}. $$The number of Planck-time intervals in the history of the universe is therefore of order
$$ \frac{4.4\times 10^{17}}{5.4\times 10^{-44}} \approx 8\times 10^{60}. $$Multiplying this by the number of spatial Planck cells gives a rough number of spacetime cells:
$$ (2.0\times 10^{184})(8\times 10^{60}) \approx 1.6\times 10^{245}. $$If each spacetime cell had two possible states, the number of possible spacetime histories would be
$$ 2^{1.6\times 10^{245}}. $$As a power of ten, this is
$$ 10^{(\log_{10}2)(1.6\times 10^{245})} \approx 10^{4.8\times 10^{244}}. $$Schematically, this is of order
$$ 10^{10^{245}}. $$This is an almost absurdly large number.
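The spacetime version is the same arithmetic with one more factor:

```python
import math

age = 4.4e17   # age of the universe in seconds
t_P = 5.4e-44  # Planck time in seconds
spatial_cells = 2.0e184  # Planck-sized spatial cells from the estimate above

ticks = age / t_P                        # Planck-time intervals, ~8e60
spacetime_cells = spatial_cells * ticks  # spacetime cells, ~1.6e245
exp10 = math.log10(2) * spacetime_cells  # 2**cells as 10**exp10, ~4.8e244

print(f"{ticks:.0e}  {spacetime_cells:.1e}  {exp10:.1e}")
```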
But this estimate is probably not physically correct.
Why the naive estimate is probably wrong
The problem with the Planck-pixel picture is that it treats the universe as if information were stored independently in each tiny volume element. That is a natural idea from the standpoint of ordinary matter or computer memory, but gravity appears to behave differently.
Black hole thermodynamics suggests that the maximum amount of information in a region does not scale with the volume of the region. It scales with the area of its boundary.
This is the essential idea behind the holographic principle.
For a black hole, the Bekenstein–Hawking entropy is
$$ S_{\mathrm{BH}} = \frac{k_B A}{4\ell_P^2}, $$where $A$ is the area of the horizon and $\ell_P$ is the Planck length.
The crucial point is the area dependence. The entropy is not proportional to the volume enclosed by the horizon. It is proportional to the horizon area.
This changes the estimate dramatically.
Using the radius $R$ above, the horizon area is approximately
$$ A = 4\pi R^2. $$Therefore
$$ \frac{S_{\mathrm{BH}}}{k_B} = \frac{A}{4\ell_P^2} = \frac{4\pi R^2}{4\ell_P^2} = \pi \left(\frac{R}{\ell_P}\right)^2. $$Since
$$ \frac{R}{\ell_P} \approx 2.7\times 10^{61}, $$we get
$$ \frac{S_{\mathrm{BH}}}{k_B} \approx \pi(2.7\times 10^{61})^2 \approx 2.3\times 10^{123}. $$So this simple horizon-area estimate gives an entropy of order
$$ 10^{123} k_B. $$Different choices of cosmological horizon lead to slightly different estimates. A particle horizon, a Hubble radius, and a future event horizon are not exactly the same object. In an accelerating universe close to de Sitter behaviour, the cosmological event horizon is especially relevant to estimates of a maximal entropy bound, often giving a value closer to
$$ 10^{122}k_B, $$while the simple present-radius area estimate above gives
$$ 10^{123}k_B. $$For the purposes of this discussion, the important point is not the precise coefficient or even the difference between $10^{122}$ and $10^{123}$. The important point is the area scaling.
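The horizon-area entropy is equally easy to check from the same inputs:

```python
import math

l_P = 1.6e-35  # Planck length in metres
R   = 4.4e26   # horizon radius in metres

# Bekenstein-Hawking entropy in units of k_B:
# S/k_B = A / (4 * l_P**2) = pi * (R / l_P)**2
S_over_kB = math.pi * (R / l_P)**2
print(f"{S_over_kB:.1e}")  # of order 1e123
```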
The holographic estimate is much smaller than the naive Planck-volume count. In the volume picture, one imagines about $10^{184}$ independent spatial degrees of freedom. In the holographic picture, gravity reduces the maximum number of independent degrees of freedom to something closer to
$$ 10^{122} $$or
$$ 10^{123}. $$That is still enormous, but it is a reduction by roughly sixty orders of magnitude in the exponent that counts degrees of freedom.
Gravity, in other words, appears to cut down the naive combinatorial freedom of the universe.
Entropy and the number of possible states
Entropy is related to the number of microscopic states by Boltzmann’s formula:
$$ S = k_B \ln \Omega. $$Here $\Omega$ is the number of possible microstates corresponding to a given macroscopic description.
Solving for $\Omega$, we get
$$ \Omega = e^{S/k_B}. $$So if the maximal entropy associated with a cosmic horizon is somewhere in the range
$$ \frac{S}{k_B} \sim 10^{122} \text{ to } 10^{123}, $$then the corresponding number of possible microstates is roughly
$$ \Omega \sim e^{10^{122}} \text{ to } e^{10^{123}}. $$Since
$$ e^x = 10^{x/\ln 10}, $$this corresponds, schematically, to
$$ 10^{10^{122}} \quad\text{to}\quad 10^{10^{123}}. $$The exact exponent is not the main point. The main point is that a more serious physical estimate does not point to an ordinary large number like $10^{80}$, $10^{100}$, or even $10^{1000}$. It points to something of the same general kind as a googolplex:
$$ 10^{10^{100}}. $$The universe, treated as a finite information-bearing system constrained by gravity, leads naturally to googolplex-like numbers.
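The conversion from $e^x$ to a power of ten divides the exponent by $\ln 10 \approx 2.303$, which is why $e^{10^{123}}$ and $10^{10^{123}}$ belong to the same family of numbers:

```python
import math

# Omega = exp(S/k_B); as a power of ten the exponent is (S/k_B) / ln(10).
for S_over_kB in (1e122, 1e123):
    exp10 = S_over_kB / math.log(10)
    print(f"exp({S_over_kB:.0e}) = 10**({exp10:.2e})")
```

Dividing an exponent of order $10^{123}$ by 2.303 leaves it of order $10^{122}$, so the "schematic" identification above loses nothing at this level of precision.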
From counting configurations to counting orbits
But even this is still not the final physical question.
The estimates above count possible microstates in a very broad sense. They do not tell us which histories are dynamically possible, which ones respect the laws of physics, or which ones should be regarded as genuinely distinct.
This is where the raw combinatorics has to be corrected by structure.
In gravity this is not an optional refinement. General relativity is diffeomorphism-invariant: different coordinate descriptions of spacetime need not represent different physical worlds. This is one reason why the naive image of spacetime as a fixed grid of independent cells is misleading. Before we count possible worlds, we must ask which differences are physical and which are merely differences of description.
Suppose $X$ is a space of possible configurations or histories. A naive count would ask for
$$ |X|. $$But if a symmetry group $G$ acts on $X$, many elements of $X$ may represent the same physical situation described in different ways. Then the physically meaningful count is not the number of raw configurations, but the number of equivalence classes, or orbits:
$$ |X/G|. $$In a very simple finite case, if the action of $G$ is free and every orbit has $|G|$ elements, then
$$ |X/G| = \frac{|X|}{|G|}. $$Real physical theories are more subtle than this. Symmetry actions need not be free. Gauge redundancies are not ordinary finite permutations. Quantum field theory imposes further consistency conditions. Gravity complicates the notion of localization itself. But the basic lesson remains: physics does not count all descriptions as distinct realities.
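As a toy illustration of counting orbits rather than raw configurations (a deliberately simple example of my own, not drawn from any physical theory): take the binary strings of length 4 as $X$ and the cyclic group of rotations as $G$. The action is not free, so the naive quotient $|X|/|G|$ undercounts the orbits:

```python
from itertools import product

# X: all binary strings of length 4; G: cyclic rotations (|G| = 4).
X = [''.join(bits) for bits in product('01', repeat=4)]

def canonical(s):
    """Lexicographically smallest rotation: one representative per orbit."""
    return min(s[i:] + s[:i] for i in range(len(s)))

orbits = {canonical(s) for s in X}
print(len(X), len(orbits), len(X) // 4)  # 16 raw strings, 6 orbits, naive count 4
```

Strings like 0101 have a nontrivial stabilizer, so their orbits are smaller than $|G|$, and the true orbit count (6) exceeds the naive quotient (4). This is exactly the caveat that symmetry actions need not be free.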
A large part of modern physics consists precisely in identifying what survives the quotient by admissible transformations.
This is also the point at which the question connects with observer equivariance. From the standpoint of observer-equivariant thinking, a physically admissible history is not just any formal arrangement of degrees of freedom. It must be compatible with the relevant transformations between observer-perspectives. What is physically meaningful is what remains well-defined under those transformations.
In my article The Equivariant Multiverse: Globally Consistent $G$-Equivariant Histories, I formulate this idea in terms of a structured space of globally consistent histories. The point is not to count every imaginable world, but to identify histories that satisfy symmetry, equivariance, and global consistency conditions.
So the progression is:
$$ \text{raw configurations} \longrightarrow \text{admissible configurations} \longrightarrow \text{equivalence classes of admissible histories}. $$Or schematically:
$$ X \longrightarrow X_{\mathrm{adm}} \longrightarrow X_{\mathrm{adm}}/G. $$This does not merely reduce a number. It changes the question.
The naive combinatorial question is:
$$ \text{How many arrangements can be imagined?} $$The physical question is:
$$ \text{Which histories remain meaningful under the symmetries and consistency conditions of the theory?} $$This is a much sharper question.
Does this mean there are that many universes?
No.
The numbers above should not be interpreted as the number of actual parallel universes. Nor should they be interpreted as the number of physically realized worlds. They are better understood as estimates of the size of a possible state space, constrained by gravitational entropy.
That distinction matters. A state-count is not the same as an ontology. Counting possible microstates does not tell us which of them exist. It only tells us how large the relevant space of possibilities may be, once a certain physical framework has been assumed.
The true theory of quantum gravity may also change the picture. The relevant degrees of freedom may not be local in any ordinary sense. The structure of the space of possible histories may be more constrained than any simple count of binary states suggests.
The combinatorial estimate gives only the outer shape of the problem. It tells us how quickly the space of possibilities explodes. But the physical problem is to understand how much of this space is actually admissible.
Is googolplex close to the largest meaningful number?
Mathematically, of course, there is no largest number.
One can easily write
$$ 10^{10^{10^{100}}}, $$or define vastly larger numbers using recursion, tetration, Graham’s number, or Rayo’s number. Mathematics has no difficulty going far beyond a googolplex.
But physics is different.
If we ask for numbers that can be connected to the observable universe — numbers of particles, possible observations, information-bearing degrees of freedom, horizon entropy, or possible microstates — then googolplex-like numbers appear to be near the upper end of what is physically meaningful.
Not exactly googolplex itself. The physically motivated estimates are larger:
$$ 10^{10^{122}}, $$or perhaps
$$ 10^{10^{123}}, $$depending on the precise horizon and entropy bound used.
But these numbers are in the same family as a googolplex.
This is the striking point. A googolplex is not merely a childish exaggeration. It is close to the scale that appears when we ask how many possible microstates may be associated with the observable universe.
Beyond this scale, numbers may still be perfectly meaningful in pure mathematics. Graham’s number and Rayo’s number are mathematically legitimate. But their connection to physical reality is much less clear.
They may not count anything that could be stored, instantiated, distinguished, observed, or realized within our universe.
Conclusion
A deck of cards can be shuffled in
$$ 52! \approx 8 \times 10^{67} $$ways.
That number is already large enough to defeat ordinary intuition.
But the universe is a much larger deck.
If we estimate the number of possible cosmic microstates using entropy and holographic reasoning, we arrive at numbers of order
$$ 10^{10^{122}} $$to
$$ 10^{10^{123}}. $$These numbers are larger than a googolplex, but not of a completely different kind.
The lesson is simple but profound: combinatorics turns finite systems into vast spaces of possibility. A deck of cards already does this. The universe does it on a scale where googolplex-like numbers cease to be mere curiosities and become physically suggestive.
But physics also teaches a second lesson. Not every formal arrangement is a possible world. The laws of physics, the structure of symmetry, and the requirement of consistency cut down the raw combinatorial space.
So the real question is not simply how many ways the universe can be shuffled.
It is how many of those shuffles still count as physically admissible histories.
A googolplex is not the largest number in mathematics.
But it may be close to the scale at which numbers stop describing possibilities inside our universe and begin to belong almost entirely to mathematics itself.
Reference
Gustaf Ullman, The Equivariant Multiverse: Globally Consistent $G$-Equivariant Histories, Zenodo, 2025. DOI: 10.5281/zenodo.17711300.