The Demon-Haunted World: Science as a Candle in the Dark - Carl Sagan
These are all cases of proved or presumptive baloney. A deception arises, sometimes innocently but collaboratively, sometimes with cynical premeditation. Usually the victim is caught up in a powerful emotion - wonder, fear, greed, grief. Credulous acceptance of baloney can cost you money; that’s what P.T. Barnum meant when he said, ‘There’s a sucker born every minute.’ But it can be much more dangerous than that, and when governments and societies lose the capacity for critical thinking, the results can be catastrophic, however sympathetic we may be to those who have bought the baloney.
In science we may start with experimental results, data, observations, measurements, ‘facts’. We invent, if we can, a rich array of possible explanations and systematically confront each explanation with the facts. In the course of their training, scientists are equipped with a baloney detection kit. The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.
What’s in the kit? Tools for sceptical thinking.
What sceptical thinking boils down to is the means to construct, and to understand, a reasoned argument and, especially important, to recognize a fallacious or fraudulent argument. The question is not whether we like the conclusion that emerges out of a train of reasoning, but whether the conclusion follows from the premise or starting point and whether that premise is true.
Among the tools:
• Wherever possible there must be independent confirmation of the ‘facts’.
• Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
• Arguments from authority carry little weight – ‘authorities’ have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
• Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among ‘multiple working hypotheses’, has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.*
[* This is a problem that affects jury trials. Retrospective studies show that some jurors make up their minds very early – perhaps during opening arguments – and then retain the evidence that seems to support their initial impressions and reject the contrary evidence. The method of alternative working hypotheses is not running in their heads.]
• Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way-station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
• Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
• If there’s a chain of argument, every link in the chain must work (including the premise) - not just most of them.
• Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well