Sign up for the Starts With a Bang newsletter
Travel the universe with Dr. Ethan Siegel as he answers the biggest questions of all.
When most of us think about science, we don’t often think about something very fundamental to the enterprise: what the goal of it all actually is. Reality is a complicated place, and the only tools we have to guide us in understanding what it is and how it works are the things we can observe, measure, and experiment on. When we add up the full suite of that body of observational and experimental knowledge, we have a record of all the phenomena that we know exist. The enterprise of science, then, seeks to make sense of all of it, and to explain it as simply and powerfully as possible: to maximize our predictive power of nature’s phenomena with as few assumptions, parameters, and variables as are absolutely necessary.
We’ve come incredibly far in our understanding of the Universe, and can exquisitely describe all of the particles and interactions underlying everything we can directly detect and measure. The Standard Model of elementary particles describes the electromagnetic, strong nuclear, and weak nuclear forces precisely and without fail, while general relativity describes gravitation for all forms of matter and energy. The inflationary Big Bang describes our cosmic origins. And current mysteries like dark matter, dark energy, and the baryogenesis puzzle hint that there’s even more to the Universe than we are currently aware of.
Could all of this be solved by coming up with a “Theory of Everything?” Although the idea continues to fascinate many, there are some strong arguments to be made against it. Here’s why, perhaps, there isn’t a Theory of Everything out there to be found at all.

The idea of unification holds that all three of the Standard Model forces, and perhaps even gravity at higher energies, are unified together in a single framework. This idea, although it remains popular and mathematically compelling, does not have any direct evidence in support of its relevance to reality. Only electroweak unification, among all the unified possibilities, has been established.
The modern idea of a Theory of Everything goes back more than 100 years, to the early days of general relativity. Einstein was able, starting in 1915, to successfully describe the observed phenomenon of gravitation as a consequence of the curvature of our four-dimensional spacetime. The presence, distribution, and motion of matter and energy through spacetime determined the curvature and evolution of that spacetime fabric, and then the curvature of that spacetime fabric determined the future trajectories and fates of every particle that exists within that spacetime. Put simply, general relativity took the idea of special relativity and unified it with the idea of gravitation, creating the powerful framework that many would argue was Einstein’s greatest scientific accomplishment.
Prior to Einstein, however, there was a different theory that:
- had the speed of light as the ultimate speed limit at which anything could travel,
- described particles and interactions in terms of fields and charges,
- was relativistically invariant under boosts and translations,
- and that conserved energy and momentum on top of all this.
This was Maxwell’s (classical) theory of electromagnetism, placing the previously distinct notions of electricity and magnetism on a unified footing.
It only took four years from the publication of Einstein’s general relativity until someone had the idea to try to unify it with Maxwell’s electromagnetism, which Theodor Kaluza did in a remarkable paper back in 1919.

Newton’s law of universal gravitation (left) and Coulomb’s law for electrostatics (right) have almost identical forms, but the fundamental difference of one type vs. two types of charge opens up a world of new possibilities for electromagnetism. These similarities are striking, but ultimately superficial, as electromagnetism and gravitation also exhibit several profound fundamental differences in the form of the force law, the structure of the underlying theories, and their effects on matter and energy.
Credits: Dennis Nilsson/RJB1, Wikimedia Commons
This was, in many ways, the first modern attempt at a Theory of Everything. Einstein’s general relativity was already a four-dimensional theory (requiring 10 independent degrees of freedom), and Maxwell’s electromagnetism required four separate degrees of freedom (the scalar electric potential and the three-component vector magnetic potential) in addition to that, meaning that the same four dimensions used in Einstein’s theory would be insufficient to hold general relativity and electromagnetism together in a single, unified framework.
Kaluza’s solution, subsequently reused down the line by many others, was to take a speculative leap into the fifth dimension, allowing general relativity and electromagnetism to be unified together in what would become known as Kaluza-Klein theory.
But there was an immediate problem with Kaluza-Klein theory, apparent to all who looked at it. Sure, it was incredible that you could write down a single equation that contained all of general relativity and all of electromagnetism within it; that was the positive. But the negatives were threefold:
- The fifth dimension couldn’t impact anything in our four-dimensional spacetime; it must somehow “disappear” from all the equations that govern physical observables.
- Even by 1919, we knew the Universe didn’t just obey Maxwell’s classical electromagnetism, but rather required a quantum description for electromagnetism (at least).
- And Kaluza’s theory also required an additional field to make his five-dimensional framework self-consistent: a scalar field known as the dilaton, which must also “disappear” as it plays no physical role in electromagnetism, gravitation, or the interplay between the two.
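The bookkeeping behind Kaluza’s construction can be made explicit with a simple count, which also shows why a leftover field like the dilaton is unavoidable. A symmetric D × D metric has D(D + 1)/2 independent components, so moving from four dimensions to five takes you from 10 components to 15, and those 15 decompose as:

```latex
% A symmetric D x D metric has D(D+1)/2 independent components:
% D = 4 gives 10 (general relativity); D = 5 gives 15.
g_{AB}\ (15\ \text{components}) \;\longrightarrow\;
\underbrace{g_{\mu\nu}}_{\text{4D metric: }10}
\;\oplus\;
\underbrace{A_{\mu}}_{\text{EM potential: }4}
\;\oplus\;
\underbrace{\phi}_{\text{dilaton: }1}.
```

The 10 + 4 components reproduce general relativity and electromagnetism, but the fifteenth is an extra scalar field that must somehow be hidden from observation.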

In theory, there could be more than three spatial dimensions to our Universe, so long as those “extra” dimensions are below a certain critical size that our experiments have already probed. There is a range of sizes, between ~10^-35 and ~10^-19 meters, that is still allowed for a fourth (or more) spatial dimension, but nothing that physically occurs in the Universe can be allowed to rely on that fifth (or higher) dimension. This idea of a “compact” extra dimension was introduced into theoretical physics way back in 1919 by Theodor Kaluza, in an attempt to unify classical electromagnetism with general relativity.
That last point is often overlooked when people talk about the notion of a Theory of Everything, but it’s arguably even more important now than it was 100 or so years ago: if you try to unify the Universe into a grander, more all-encompassing framework, it often requires the addition of new entities — particles, fields, interactions, etc. — whose existence is already either ruled out or highly constrained by observations, measurements, and experiments whose results we already know. If there is a fifth dimension, and if there is a dilaton, the dimension must be so tiny and the dilaton’s effects must be so weak that they remain consistent with the full suite of data we’ve collected: data that reveal no evidence for their existence, only constraints on how strongly they could possibly affect our physical reality.
It’s no wonder, then, that rather than the path of unification and the quest for a Theory of Everything leading to enormous advances in physics during the 20th century, it was instead developments in nuclear physics, quantum physics, and particle physics that advanced the field. The combination of novel experimental results and new theoretical developments, side-by-side, helped us understand the full suite of particles that existed in the Universe, what rules they followed in binding together, and how the forces that governed them behaved. If we fast-forward to the present day, we wind up with the Standard Model of elementary particles, which is simultaneously simple and yet, contradictorily, full of complexities.

This diagram displays the structure of the Standard Model (in a way that displays the key relationships and patterns more completely, and less misleadingly, than in the more familiar image based on a 4×4 square of particles). In particular, this diagram depicts all of the particles in the Standard Model (including their letter names, masses, spins, handedness, charges, and interactions with the gauge bosons: i.e., with the strong and electroweak forces). It also depicts the role of the Higgs boson, and the structure of electroweak symmetry breaking, indicating how the Higgs vacuum expectation value breaks electroweak symmetry and how the properties of the remaining particles change as a consequence. Neutrino masses remain unexplained.
The stuff that makes up the matter we’re familiar with — atoms — isn’t merely protons, neutrons, and electrons, as most of us learned back in school. Rather, the electron is just the lightest of three generations of charged leptons: along with the muon and tau lepton. There are their antiparticles, plus a species of neutrino (and antineutrino) that is the corresponding “uncharged lepton” to each of the charged leptons. Protons and neutrons, meanwhile, aren’t fundamental particles, but are composite particles composed of quarks and gluons. There are three generations of quarks, with the up-and-down quarks (making up the first generation) having charm-and-strange and then top-and-bottom quarks as their heavier-generation counterparts.
Meanwhile, there are eight massless gluons (mediating the strong nuclear force), one massless photon (mediating the electromagnetic force), and three very massive W-and-Z bosons (mediating the weak nuclear force), plus the Higgs boson to complete the Standard Model. Every particle-based experiment performed and every detector set up to observe particles ever concocted has only found evidence of these particles and these particles alone, with the properties given to them by our Standard Model framework. These particles and forces may look very disjointed, but there are relationships between them that are extremely powerful.

Nature is not symmetric under the exchange of particles with antiparticles, under mirror reflection, or under the combination of the two. Neutrinos, for example, are always left-handed: if you point your left thumb in their direction of motion, they spin in the direction that your left hand’s fingers curl. Similarly, antineutrinos are always right-handed. Prior to the detection of neutrinos, which clearly violate mirror symmetries, weakly decaying particles offered the only potential path for identifying P-symmetry violations.
The quarks and charged leptons are all spin-½ particles, where each quark or charged lepton can have its measured spin in any direction turn out to be +½ or -½, depending on which way its intrinsic angular momentum happens to be oriented. But for the neutrinos — the uncharged leptons — their status as a spin-½ particle is a little different.
- All of the neutrinos always have their spin along their direction of motion equal to -½ (i.e., they are left-handed), meaning that if you point your left thumb in the direction of motion of the neutrino, your left hand’s fingers will curl in the direction of its spin. You never observe a right-handed (spin +½ along its motion) neutrino.
- But all of the antineutrinos always have their spin along their direction of motion equal to +½ (i.e., they are right-handed), meaning that you must point your right thumb in the direction of motion of the antineutrino to get your fingers to curl in the direction of the antineutrino’s spin. You never observe a left-handed (spin -½ along its motion) antineutrino.
This is a fundamental asymmetry of nature, much as nature is asymmetric in possessing electric charges (positive and negative) but no magnetic charges (north and south magnetic monopoles). Even after we unified the electromagnetic force with the weak nuclear force, the electroweak interactions still exhibit this fundamental left-right asymmetry, and there are still no such things as magnetic monopoles. This is an important feature of nature, borne out over and over again by experiment and observation, and it must be preserved in order for theories that purport to describe reality to actually match up with our measured physical reality.

When the neutral kaon (containing a strange quark) decays, it typically results in the net production of either two or three pions. Supercomputer simulations are required to understand whether the level of CP-violation, first observed in these decays, agrees or disagrees with the Standard Model’s predictions. With the exception of only a few particles and particle combinations, almost every set of particles in the Universe is unstable, and if it doesn’t annihilate away, it will decay in short order.
Now, many have sought — and are still seeking — to further unify our theories of physics into a single description. The idea that the strong force unifies with the electroweak force at some point is an example of a Grand Unified Theory, and many have worked on those since the 1970s. The idea that there’s a Theory of Everything that includes gravity is a hallmark of string theory and, more recently, the novel field of positive geometry. And many additional ideas in theoretical physics seek to add in additional symmetries, additional dimensions, extra particles, or additional unification frameworks. These ideas are popular for good reason: the mathematical path to a Theory of Everything, for self-consistency’s sake, inevitably must go through these routes.
But when we confront these ideas with reality, that’s when we start to run into enormous amounts of trouble. All of these ideas necessitate adding some additional ingredients to our reality: ingredients which, if we just write them down in a general fashion, can lead to new interactions or decays of the particles we already know about. Because we have such exquisite data on how the known (Standard Model) particles are already established to interact and decay (as well as how they are forbidden from interacting or decaying), we have to take extreme care that any attempt toward a Theory of Everything doesn’t conflict with already-existing data, particularly with the data we have from particle physics experiments.

The particle content of the hypothetical grand unified group SU(5), which contains the entirety of the Standard Model plus additional particles. In particular, there are a series of (necessarily superheavy) bosons, labeled “X” in this diagram, that contain both properties of quarks and leptons, together, and would cause the proton to be fundamentally unstable. Their absence, and the proton’s observed stability, provide strong evidence against the validity of this theory in a scientific sense.
We normally consider the simplest, most minimal scenarios when attempting to extend known physics. So if you:
- Want to make your Universe left-right symmetric? You can do it theoretically, sure, but then you need a way to get rid of all the right-handed neutrinos (and left-handed antineutrinos) and the corresponding weak interactions that they would permit, as well as explain the lack of a fourth strong-force color.
- Want to unify the strong nuclear force with the electroweak force? Again, you can do it theoretically, but the extra particles that come along with it (fractionally charged gauge bosons) would admit proton decay, so you have to find some way to suppress or eliminate that to be consistent with what we observe.
- Want to impose an additional symmetry, such as supersymmetry, to explain why the particles of the Standard Model have the masses that they do? Then you have to explain why there’s only one Higgs boson (instead of the multiple Higgses that supersymmetry predicts), why no superpartners with approximately the mass of the top quark have been found at the LHC, and why there are no flavor-changing neutral currents (i.e., quark-or-lepton generational changes that preserve electric charge) in nature, as supersymmetry’s predictions contradict the observations that we have.
The simplest left-right symmetric scenario is the Pati-Salam model, which would have no parity violation, but parity violation is robustly observed. The simplest theory that unifies the strong force with the electroweak force is the Georgi-Glashow model, which predicts proton decay on a timescale of around 10^30 years, but the proton’s lifetime is already established to be at least 10^34 years. And not just the simplest model of supersymmetry, but all models of supersymmetry, predict several other Higgs bosons, not just the one (and only one) we’ve found at the Large Hadron Collider.
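To put that second mismatch in perspective, the gap between prediction and observation is quantitative and decisive: the minimal Georgi-Glashow prediction falls short of the experimental bound on the proton lifetime by at least four orders of magnitude:

```latex
\tau_{p}^{\,\mathrm{SU(5)}} \sim 10^{30}\ \mathrm{yr},
\qquad
\tau_{p}^{\,\mathrm{obs}} \gtrsim 10^{34}\ \mathrm{yr}
\quad\Longrightarrow\quad
\frac{\tau_{p}^{\,\mathrm{obs}}}{\tau_{p}^{\,\mathrm{SU(5)}}} \gtrsim 10^{4}.
```

This is why a single well-measured quantity, the proton’s stability, is enough to rule out the simplest grand unified model.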

The Standard Model particles and their supersymmetric counterparts. Slightly under 50% of these particles have been discovered, and just over 50% have never shown a trace that they exist. Supersymmetry is an idea that hopes to improve on the Standard Model, but it has yet to achieve the all-important step for supplanting the prevailing scientific theory: having its new predictions borne out by experiment.
All of these schemes, it would seem, conflict with what’s already known about the Universe. And yet, the way that most people interested in a Theory of Everything have attempted to reckon with this problem is not to find an alternate approach that doesn’t have these features that contradict reality. Instead, it’s been to extrapolate further, to:
- larger, grander unification schemes,
- that impose even more symmetries that aren’t reflected by nature,
- that introduce greater numbers of unobservable parameters (fields, particles, degrees of freedom, extra dimensions, new symmetries, etc.),
- and then to assert that somehow — without introducing any mechanism for how it happens — all of the extraneous, unwanted features miraculously get suppressed,
eventually leading to the highly asymmetric Universe we observe today.
This is precisely the approach taken by string theory (and positive geometry). Instead of one extra dimension, there are many: at least six and as many as 22 more, in addition to the four we know about. Instead of magnetic monopoles, extra Higgs sectors, superheavy bosons admitting proton decay, and left-right symmetric features, they have all of these plus many more. Instead of space, there’s superspace; there’s supergravity; there’s not just the conventional “for every Standard Model particle, there’s a superpartner particle” version of supersymmetry, but there are a total of four new supersymmetries and hundreds of additional new particles.
It’s as though, by adding more and more and more and more ingredients — ingredients that aren’t reflected by our observed-and-measured physical reality — we can somehow solve, rather than worsen, the puzzles we’re facing when it comes to the Universe today.

The central idea of string theory is that all the quanta we know of are described by tiny strings that vibrate in various ways on minuscule scales: far below what’s ever been probed. String theory is an attempt at a framework for quantum gravity, and arguably the only viable candidate for finding out what’s real in the Universe on trans-Planckian scales.
The big question you would think to ask is this: how do you take this large, unified structure, with its hundreds upon hundreds of ingredients that must be eliminated and suppressed, and recover the Universe we actually observe? That question never gets answered when it comes to these attempts at a Theory of Everything. It’s why I’ve described these attempts as an unlikely broken box, or more recently, like “a puzzle where a wild child painted all the pieces independently of one another.” If even one of those additional ingredients (or puzzle pieces) isn’t sufficiently eliminated or suppressed, that attempt at a theory is dead in the water, as it will conflict with observations, experiments, and measurements that have already established, “this does not line up with reality.”
And that’s the big thing it always must come back to: do our theoretical ideas line up with reality? When we formulate attempts at a Theory of Everything, we should return to the very beginning, where we talked about what the goal of science is: working “to maximize our predictive power of nature’s phenomena with as few assumptions, parameters, and variables as are absolutely necessary.”
Our current big scientific mysteries compel us to keep seeking truths about the Universe, as there are many aspects of reality that we cannot, as of yet, fully explain. But relying on loose, superficial analogies and mathematical hand-waving is more than dissatisfying; it’s an approach that loses a fundamental connection with what makes something scientific: a connection with observable, measurable reality. Perhaps one or more of the ideas we have surrounding a Theory of Everything will ultimately turn out to be correct. But if so, it will be a connection with reality that cements and establishes it, not mere arguments about what’s mathematically possible.