The proton seems to be 0.00000000000003 millimeters smaller than researchers previously thought, according to work published in the July 8 issue of Nature.
The difference is so infinitesimal that it might defy belief that anyone, even physicists, would care. But the new measurements could mean that there is a gap in existing theories of quantum mechanics. “It’s a very serious discrepancy,” says Ingo Sick, a physicist at the University of Basel in Switzerland, who has tried to reconcile the finding with four decades of previous measurements. “There is really something seriously wrong someplace.”
Protons are among the most common particles out there. Together with their neutral counterparts, neutrons, they form the nuclei of every atom in the Universe. But despite its everyday appearance, the proton remains something of a mystery to nuclear physicists, says Randolf Pohl, a researcher at the Max Planck Institute of Quantum Optics in Garching, Germany, and an author on the Nature paper. “We don’t understand a lot of its internal structure,” he says.
From afar, the proton looks like a small point of positive charge, but on much closer inspection, the particle is more complex. Each proton is made of smaller fundamental particles called quarks, and that means its positive charge is spread throughout a roughly spherical volume.
Physicists can measure the size of the proton by watching as an electron interacts with a proton. A single electron orbiting a proton can occupy only certain, discrete energy levels, which are described by the laws of quantum mechanics. Some of these energy levels depend in part on the size of the proton, and since the 1960s physicists have made hundreds of measurements of the proton’s size with staggering accuracy. The most recent estimates, made by Sick using previous data, put the radius of the proton at around 0.8768 femtometers (1 femtometer = 10⁻¹⁵ meters).
Pohl and his team have come up with a smaller number by using a cousin of the electron, known as the muon. Muons are about 200 times heavier than electrons, making them more sensitive to the proton’s size. To measure the proton radius using the muon, Pohl and his colleagues fired muons from a particle accelerator at a cloud of hydrogen. Each hydrogen atom consists of a single proton orbited by an electron. Sometimes a muon replaces an electron and orbits around a proton. Using lasers, the team measured relevant muonic energy levels with extremely high accuracy and found that the proton was around 4 percent smaller than previously thought.
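A quick back-of-envelope check (my arithmetic, using only the figures quoted in this article) shows how a roughly 4 percent shrinkage of a 0.8768-femtometer radius works out to the vanishingly small difference in millimeters given in the opening line:

```python
# Rough consistency check on the figures quoted above (rounding is mine).
FM_TO_MM = 1e-12  # 1 femtometer = 1e-15 meters = 1e-12 millimeters

old_radius_fm = 0.8768   # Sick's estimate from electron-based data
shrinkage = 0.04         # "around 4 percent smaller"

difference_fm = old_radius_fm * shrinkage   # a few hundredths of a femtometer
difference_mm = difference_fm * FM_TO_MM

print(f"{difference_fm:.3f} fm, or about {difference_mm:.1e} mm")
# prints: 0.035 fm, or about 3.5e-14 mm
```

Rounded to one significant figure, that is the 0.00000000000003 millimeters of the lede.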
That might not sound like much, but the difference is so far from previous measurements that the researchers actually missed it the first two times they ran the experiment in 2003 and 2007. “We thought that our laser system was not good enough,” Pohl says. In 2009, they looked beyond the narrow range in which they expected to see the proton radius and saw an unmistakable signal.
“What gives? I don’t know,” says Sick. He says he believes the new result, but that there is no obvious way to make it compatible with years of earlier measurements.
“Something is missing, this is very clear,” agrees Carl Carlson, a theoretical physicist at the College of William & Mary in Williamsburg, Va. The most intriguing possibility is that previously undetected particles are changing the interaction of the muon and the proton. Such particles could be the “superpartners” of existing particles, as predicted by a theory known as supersymmetry, which seeks to unite all of the fundamental forces of physics, except gravity.
But, Carlson says, “the first thing is to go through the existing calculations with a fine-tooth comb.” It could be that an error was made, or that approximations made in existing quantum calculations simply aren’t good enough. “Right now, I’d put my money on some other correction,” he says. “It’s also where my research time will be going over the next month.”
It would be a mistake to consider this a setback for science. On the contrary, it is a victory: proof, once again, of the robustness of science’s self-correcting apparatus. The overwriting of established errors is just as important as new discoveries. Maybe even more important, for the overturning of foundational assumptions keeps the enterprise of free inquiry humble, which I think it needs to be more than it is.
I am a realist and naturalist opposed to irrealism and postmodernism. I am convinced reality is constituted entirely by the physical universe, which exists independent of observation and is governed by infallibly regular laws. But I don’t think humanity (or any individual human) will ever have a complete mathematical or conceptual understanding of those laws. Feynman predicted a day when we would reduce all the laws of nature into neat, axiomatic equations, and Stephen Hawking talks of “knowing the mind of [a pantheistic] God” through a postulated Unified Field Theory. I’m not prepared to say the evidence before us logically commits us to denying these men’s predictions; but I do think the burden of proof is on them. I see no reason why our ape-brains should be able to represent the whole of existence.
We can represent a lot of it, though. Abstract thinking helps there. Mathematics is remarkable for its reductive power: its ability to conceptualize concrete things by quantifying some property under examination into an irreducible symbol functioning in a system (i.e., an equation) governed by internally fixed laws. But I think our conceptual powers can only go so far. Even if everything that exists is quantifiable, if only in principle, there might remain conceptual (which is to say philosophic, or even linguistic) disputes with no final resolution. For example: quantum entanglement is the phenomenon by which two particles, separated spatially, nonetheless remain so correlated that it is impossible to describe one’s activity abstracted from the other’s. Is it proper, then, to speak of two entangled particles, or of one particle extended in two places?
There’s probably someone who actually knows things about physics who could tell me what empirical experiment or mathematical modeling could settle this question, if it hasn’t been settled already. But I think the example still holds as a hypothetical. Even if the entanglement question can be answered, that doesn’t mean every scientific question can be resolved by empirical observation.
For example: Some people interpret Gödel’s incompleteness theorems as disproving a mechanistic or computational theory of mind.
I actually don’t entirely understand the logic of their arguments point-for-point myself, but the gist is that Gödel’s theorems prove we can know more mathematical truths than we can prove with the highest degree of logical rigor. Some interpret this as disproving the possibility that the mind operates according to algorithms (stepped informational patterns whose actions are determined by the previous steps), and as showing that it possesses some undefined but nonalgorithmic and indeterministic creative power.
However, the validity of the anti-computationalist position rests on a conjecture that is literally impossible to prove. The Gödelian anti-computationalist argument is valid only if there are no Diophantine equations, a special class of equations studied in number theory, to which the mind cannot give a definite answer. However, mathematicians, including Gödel himself, agree it is logically impossible to prove or disprove the existence of such unsolvable Diophantine equations. No one can say whether Gödelians or computationalists are correct on the basis of the incompleteness argument alone; both sides must bolster their case with other empirical observations and logical formulations.*
This is where hard science and epistemology meet. Science isn’t merely the accumulation of facts, but the fostering of the right interpretation of those facts. (And I don’t mean “interpretation” in the perspectivist sense of postmodernists, which declares there are no facts, only interpretations. I mean “interpretation” in the sense of recognizing a datum’s wider implications for the description of the phenomena it represents, and for our conceptual thinking about those phenomena. For example, inductive interpretation of a population survey entails the question, “To what extent are these findings generalizable?” Conceptual interpretation of special relativity entails the question, “How does this conflict with the commonsense conception of absolute time?”) Science is about knowledge and understanding; and a person and a species can at once have much of the former and little of the latter.
Of course, we can and do wield a grasp of physical laws adequate to utilize our environment technologically, sometimes with godlike results; but that doesn’t mean we know, or need to know, everything about the materials we use. We don’t even need to know everything that can be known about materials to put them to godlike ends.
To wit: We have been splitting atoms for over half a century, even while proceeding from an error about the size of one of their basic constituents. Nuclear reactors are an integral part of our infrastructure, and will only predominate more if industrial society outlives fossil fuels. Hiroshima and Nagasaki loomed in the global consciousness through half a century of Cold War because they demonstrated, with gory starkness, humanity’s technological command of an atom it did not fully understand.
Such is the power of science, even inexact science.
*This example also doesn’t illustrate the sort of example I am looking for, but fails in a different way my quantum entanglement one does. I do accept mathematics as a legitimate science; but unlike the physical sciences, I don’t think the objects of mathematical inquiry are objects existent dependent of human consciousness. Rather, the mathematician gives exposition to certain intuitions about the foundations of deductive reason by pushing those intuitions to their logical limits. Therefore, mathematical truths do exist–only they are true relative to the nonarbitrary internal laws of mathematical reasoning. There are no unexperienced mathematical truths.