Free Radicals - Michael Brooks
It has been suggested that the unpleasant character traits that became so obvious in Newton’s later life can be blamed on his gradual ingestion of mercury during alchemical experiments. But it is clear that the dark side was there all along. Parts of Newton’s most celebrated work – the Principia – are ‘nothing short of deliberate fraud’, according to his biographer Richard Westfall: ‘If the Principia established the quantitative pattern of modern science, it equally suggested a less sublime truth – that no one can manipulate the fudge factor quite so effectively as the master mathematician himself.’
Newton fudged theoretical calculations of the speed of sound, the precession of the equinoxes, the strength of gravity on the Moon and the heights of the tides so as to fit with experiment. And in each new edition of the Principia he introduced changes that took the same data but significantly increased the level of apparent precision. Westfall calls this ‘a cloud of exquisitely powdered fudge factor blown in the eyes of his scientific opponents’.
The thing is, to scientists it seems that this is all just fine. Ptolemy has been forgiven as ‘honestly motivated’; there’s nothing unusual about publishing only the data that support your theories, according to Harvard historian Owen Gingerich. No less a figure than Einstein has exonerated Galileo – this time because Galileo was right about the motion of the Earth around the Sun. ‘It was Galileo’s longing for a mechanical proof of the motion of the Earth which misled him into formulating a wrong theory of the tides,’ wrote Einstein in a preface to a modern edition of the Dialogue. ‘His endeavors are not so much directed at “factual knowledge” as at “comprehension”.’ And here is where we start to see a new pattern emerge, one that exposes the secret anarchists.
A 2007 report into scientific misbehaviour published in Nature concluded that ‘many of the risk factors for misconduct also seem to be what makes for good science’. That certainly seems to be the case. Galileo and Newton were the founding fathers of science. Newton in particular made great play of the role of observations and data, setting the tone of science for centuries to come. But data, as we have seen, are not always reliable, and in private scientists rely on intuition to guide them in their work. When intuition and data clash, it is usually intuition that wins out. As Peter Medawar pointed out, ‘scientists who fall deeply in love with their hypothesis are proportionately unwilling to take no as an experimental answer’.
Is this justifiable? Yes, if the object of their infatuation turns out to be worth the attention.
As the twentieth century began, Robert Millikan was fast approaching forty. All around him, physics was at its most exhilarating: J.J. Thomson had just discovered the electron, and Max Planck had pulled quantum theory into existence with a brilliant piece of scientific detective work. Outshining everyone else, Einstein had made it clear that everything was composed of atoms and, with his special theory of relativity, that the universe was stranger than anyone had imagined.
Millikan, on the other hand, had done practically nothing. So he decided to measure e, the charge on the electron. Finding the value of e was important because it – and the very existence of the electron – was the subject of a heated and complex international debate. Although Thomson had ostensibly discovered the electron in 1897, German physicists – at that time considered the best in the world – were unconvinced.
Their hesitations were to do with the aether, a ghostly fluid that was thought to fill all of space. The aether provided a medium through which light could travel, and in the corridors of Germany’s universities it was agreed that the experiments which claimed to demonstrate Thomson’s ‘negatively charged matter’ merely provided evidence