It sounds like something from a sci-fi thriller: what if everything we experience—every tree, every star, even our own thoughts—is part of a colossal digital simulation? That unsettling idea, once reserved for philosophers and screenwriters, is gaining traction in scientific circles. Thanks to new research on information dynamics, the line between physics and computation is blurring in ways that could rewrite our understanding of reality.
The strange physics of information
At the heart of this growing debate is Dr. Melvin Vopson, a physicist at the University of Portsmouth. He has long championed the idea that information isn’t just abstract—it has mass, just like matter and energy. In fact, he argues that every particle stores information about itself, much like how our DNA encodes the genetic blueprint of life.
Building on this concept, Vopson and physicist Serban Lepadatu recently proposed a new law of physics—one that governs not energy or motion, but how information behaves over time. Dubbed the second law of information dynamics, this principle could be the first concrete clue that our universe operates like a highly efficient codebase.
Entropy and the behavior of information systems
In traditional physics, the second law of thermodynamics tells us that entropy—the measure of disorder—tends to increase in isolated systems. Applied to information, the assumption was that the disorder in data systems should also increase. But Vopson’s team observed something unexpected: in systems ranging from digital data storage to RNA sequences, entropy didn’t spiral out of control—it decreased or plateaued.
This finding was consistent across both artificial and biological systems. When analyzing the RNA of different SARS-CoV-2 variants, for instance, researchers found that as the virus mutated, informational entropy actually declined—pointing to a surprising pattern of optimization.
In other words, evolution, whether in viruses or code, seems to favor cleaner, more ordered systems. And that, Vopson argues, hints at something deeper—possibly a universal law, with implications for everything from genetics and artificial intelligence to cosmology.
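The "informational entropy" being measured here is, at root, Shannon entropy: the average number of bits needed per symbol in a sequence. A minimal sketch of that calculation (the nucleotide strings below are invented for illustration, not real viral RNA or Vopson's actual dataset):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical sequences, for illustration only:
mixed = "ACGTACGTACGTACGT"    # all four bases equally frequent
skewed = "AAAAAAAACCCCGGTT"   # frequencies skewed toward one base

print(shannon_entropy(mixed))   # 2.0 bits/symbol, the maximum for 4 symbols
print(shannon_entropy(skewed))  # 1.75 bits/symbol: more ordered, less entropy
```

A declining value of this quantity across successive variants is the kind of "optimization" pattern the researchers report.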

Information behaves like matter—so does it shape our universe?
The more this theory is tested, the more it appears to hold water. When exploring quantum models of atoms, Vopson found that electron arrangements follow rules that match the predictions of his new law. The classic “Hund’s Rule” in quantum chemistry—which determines how electrons fill atomic orbitals—could be explained by information entropy minimization.
He goes further, suggesting that this law is not just consistent, but necessary, in a cosmological sense. As the universe expands, physical entropy increases. But to preserve energy conservation and balance, something must decrease—and that something, he proposes, is the entropy of information.
This idea also provides a potential answer to one of nature’s oldest mysteries: why is everything so symmetrical? From galaxies to snowflakes to the human face, symmetry abounds. Vopson demonstrated through basic geometry that symmetry naturally produces the lowest informational entropy—meaning our universe may be “wired” to prefer order over chaos.
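Vopson's geometric demonstration is not reproduced here, but a loose computational analogue shows the same tendency: a pattern with translational symmetry (a repeated tile) takes far fewer bits to describe than the same symbols arranged at random. A sketch using Python's standard zlib compressor as a rough stand-in for informational content:

```python
import random
import zlib

random.seed(0)  # deterministic, purely for reproducibility of the example

# Build one random "tile" and repeat it: a pattern with translational symmetry.
tile = "".join(random.choice("01") for _ in range(500))
symmetric = tile * 4

# Destroy the symmetry: identical symbols and counts, but in random order.
scrambled = list(symmetric)
random.shuffle(scrambled)
asymmetric = "".join(scrambled)

# The symmetric pattern compresses to far fewer bytes.
print(len(zlib.compress(symmetric.encode())) <
      len(zlib.compress(asymmetric.encode())))   # True
```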

Could the universe be optimized like a computer?
If all of this sounds eerily like computer science, that's the point. Efficient data compression, the kind used to store videos or zip files, works by stripping out redundant information. That is exactly the behavior Vopson's law describes: a drive toward optimization, simplicity, and efficiency, the hallmarks of well-written code.
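The compression analogy is easy to see concretely with Python's standard-library zlib: highly ordered data shrinks dramatically, while maximally disordered data barely compresses at all. A quick sketch:

```python
import os
import zlib

n = 10_000
ordered = b"AB" * (n // 2)   # a highly ordered, low-entropy byte stream
noise = os.urandom(n)        # maximally disordered bytes of the same length

print(len(zlib.compress(ordered)))  # a few dozen bytes: redundancy squeezed out
print(len(zlib.compress(noise)))    # roughly 10,000 bytes: nothing to remove
```

In Vopson's framing, a universe that behaves like the first stream, not the second, is a universe running on something like well-optimized code.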
“If we are living in a simulated reality,” Vopson notes, “then the simulation would need to conserve resources—processing power, storage, energy.” His findings suggest our universe behaves exactly like such a system, where data is constantly compressed, cleaned, and optimized.
The next step: proving information has mass
So how do we test a theory this bold? Vopson believes the key lies in particle physics. Since his earlier work proposed that information has mass, he suggests using particle-antiparticle collisions—like an electron and its antimatter twin, the positron—to detect what happens to their information when they annihilate each other.
According to him, such annihilations should emit more than just energy—they might also release subtle, low-energy photons that account for the “deleted” information. Detecting these would not only validate the physical nature of information, but could even suggest that information is the elusive fifth state of matter.
And here’s where things get truly wild: if information has mass, and behaves the way Vopson suggests, it could help explain dark matter—the invisible substance that makes up about 27% of the universe’s energy density.
A simulation, or just smart physics?
Vopson is careful not to declare we’re definitely inside a Matrix-style simulation. But his work pushes the conversation away from pure speculation and toward testable, empirical science. “If this law holds up,” he says, “we may be able to turn the simulation hypothesis into something we can actually study.”
Whether or not the universe turns out to be a simulation, these discoveries point to something extraordinary: information may be as real and fundamental as mass and energy. And if that’s true, we’re going to need to rethink what we mean by “reality” altogether.
