Once aware of the two aspects of scientific knowledge, the verified laws and the sense we make of them, it is fascinating to see what happens when new data is inconsistent with a current explanation, i.e., our understanding of the phenomenon. The process by which this is sorted out is often at the heart of the tale. This section includes two such stories—the first in which a contrary observation was found to be erroneous and the other in which an unexpected observation disproved an explanation.
Neutrinos were known to travel at essentially the speed of light in space, and they were known to pass, with equal ease, through the earth. A detector at the Laboratori Nazionali del Gran Sasso in Abruzzo, Italy, was set up to receive neutrinos generated at CERN in Geneva, Switzerland. In 2011, an experiment was set up to study the shift in neutrino states as they traveled the 731 km between the two sites. Dario Autiero realized that this same apparatus could be used to measure the neutrinos’ speed through all that granite. It was this ancillary experiment that made the news.
The reason for its notoriety was that the neutrinos appeared to cover that distance some 63 ns (that’s 6.3 percent of a millionth of a second) sooner than light would have through free space. If true, this would upset a basic tenet of Einstein’s special theory of relativity and have far-reaching implications. Roughly nine months later, two sources of error were discovered in the equipment that measured the interval between generation and detection. The neutrinos had flown at the speed of light after all, and theoretical physicists could finally exhale.
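To put those numbers in perspective, here is a quick back-of-the-envelope sketch using only the figures above; it takes the 731 km baseline at face value as a straight line:

```python
# Back-of-the-envelope check of the numbers in the neutrino story.
C = 299_792_458          # speed of light in vacuum (m/s)
baseline_m = 731_000     # CERN-to-Gran Sasso distance from the text (m)
early_s = 63e-9          # reported early arrival (s)

t_light = baseline_m / C      # time light would take through free space
excess = early_s / t_light    # fractional speed excess the result implied

print(f"light travel time: {t_light * 1e3:.4f} ms")   # ~2.4384 ms
print(f"implied (v - c)/c: {excess:.2e}")             # ~2.58e-05
```

In other words, the claim was that neutrinos beat light by only about 25 parts per million over a 2.4 ms trip, which is why nanosecond-level timing errors in the apparatus were enough to produce it.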
For a few months, it seemed this could be the moment Einstein anticipated when he said that no experiment could prove him right, but a single experiment could prove him wrong. But would he have been completely wrong?
Strong gravitational fields would still bend the path of light rays, E would still equal mc², and time dilation of moving objects would still occur to the same degree. If the premise Einstein used in his derivation were wrong, a different one would be sought. Again, the laws would remain valid, a revised explanation would be pursued, and new limits might be found.
The implications of an explanation can themselves stimulate the discovery of new laws. That has notably been the case with Einstein’s theories of relativity. Their predictions are still being tested, and in every case so far, they have been confirmed. The prediction of gravitational lensing, for example, has now become a major tool of astronomical observation.
The second story also involves an observation that upset an accepted explanation. This time, the explanation was not just wrong; it was blocking progress. In the early days of computerizing scientific instruments, I wanted to build an analytical device that would separate and then identify components in a mixture under computer control. My graduate student, Rick Yost, and I chose quadrupole mass analyzers for both functions. The charged molecules (called ions) whose mass had been selected by the first quadrupole would then be fragmented so that their distinctive pattern of fragment masses, as seen by the second quadrupole, would provide identification.
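The select-fragment-identify scheme described above can be sketched as a toy program. The parent masses and the fragment table below are invented purely for illustration; a real instrument measures spectra, not lookup tables:

```python
# Toy sketch of the MS/MS idea: stage 1 selects one parent-ion mass from a
# mixture; after fragmentation, stage 2 reports that parent's characteristic
# fragment masses. All masses here are hypothetical, for illustration only.

FRAGMENTS = {            # hypothetical parent mass -> fragment masses
    146: [99, 73, 45],
    180: [163, 91, 60],
}

def ms_ms(mixture, selected_parent):
    """Pass only the selected parent through the first analyzer, then
    return its fragment spectrum as the second analyzer would see it."""
    if selected_parent not in mixture:
        return []                      # nothing at that mass to select
    return FRAGMENTS.get(selected_parent, [])

print(ms_ms([146, 180, 210], 180))     # -> [163, 91, 60]
```

The point of the two stages is that the fragment pattern identifies the selected component even when many components are present at once.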
Tandem mass-selection stages (using magnetic and electric sectors) were already used to study ion fragmentation by energetic collision with gas molecules. These studies, which used ion accelerations of thousands of volts, showed that fragmentation efficiency quickly declined from poor to non-existent as the ion acceleration voltage decreased.
The efficiency at various levels of acceleration had been fit to an equation, and an explanation for the observation developed: an electron in the ion to be fragmented was excited by a near encounter with a collision-gas molecule, and that energy then moved to a chemical bond and caused its rupture. Lower ion velocities, the reasoning went, did not impart enough energy to break a bond.
The “required” ion acceleration energy for fragmentation was a hundred times higher than the energies used with quadrupole analyzers. If collisional fragmentation wouldn’t work, what could we use? A chance discussion with Jim Morrison broke the ice. He was studying laser excitation to fragment ions. And, just as we had envisioned our instrument, he used one quadrupole analyzer to select the ions to fragment and the other to find the fragment masses.
Would photofragmentation work for us? Jim said no, because its efficiency was so poor his laser-produced fragments were drowned out by continuously produced background fragments.
We puzzled over what process could be producing the interfering fragments. And if we found out, could we use it in our instrument? Contrary to the accepted mechanism for ion fragmentation, experiments in Morrison’s lab proved his “noise” fragments were formed by low-energy collisions with sparse gas molecules. Jim had placed an ion-containment chamber between his two analyzers to enable the transfer of fragments to the second analyzer.
Thus was born the triple-quadrupole mass spectrometer, the precursor of a long succession of ‘MS/MS’ instruments that have revolutionized the role of mass spectrometry in chemical analysis. Their evolution continues some fifty years after their introduction, and their invention was the subject of an Association of Biomolecular Resource Facilities award in March 2023.
Two factors stood in the way of this discovery. One was the sector mass spectroscopists’ incorrect explanation of the high energy apparently required for fragmentation. In fact, as the energy of the collisions decreased, an increasing fraction of the collision products was simply lost to scattering before detection. The incorrect fragmentation explanation sent the search for higher efficiency in an unfruitful direction and discouraged consideration of a lower-energy process.
The second factor was a lack of communication between scientists with different goals. Those studying ion-molecule reactions were familiar with the scattering of their low-energy collision products. But they had no idea it could be analytically useful. Those studying ion fragmentation between sector mass analyzers were focused on the nature of the products and the process of their formation. Having a still different goal, I became a bridge between them.
The distinction between a law and its explanation reveals their separate influences in scientific research and discovery and adds an interesting perspective to those processes.
We have shown that laws have limits on their assured applicability but have asserted that they are reliable within those limits. But we cannot test them under all possible conditions, even within those limits. Could exceptions lurk in this presumably reliable zone? We’ll look at that in the next post.
Please stay tuned, keep in touch, and tell your friends where to find this.
Rick,
You have indicated that identifying the law and explanation parts of our knowledge of a phenomenon is not always obvious. In my experience, it takes practice, even with the most obvious cases. Newton's equation for gravitational force is the law. It provides no clue as to why. The explanation will depend on where you are in your schooling, ranging from simply that gravity produces an attractive force between masses all the way to the concept of gravitons. But for many topics, the two are so combined in our minds that parsing them takes some serious thought.
I have been thinking that a series of exercises in making this distinction might be the best way to introduce philosophical concepts to science students.
Right. And can you blame us? In our learning, there has been no basis for distinguishing which parts of what we're being taught are known for sure and which are the parts we made up to explain them. So we don't know what to let go of in the face of contrary data.