The first flaw of the D-N model is that it does not distinguish between regularities that might be considered causal and those that clearly are not. The D-N model equates explanation with prediction, but some observations may be predictive without being causal or explanatory. For example, a sharp drop in a barometer’s reading of air pressure may indicate that a storm is coming, but we would not argue that the barometric reading causes or explains the storm. Both the drop in barometric readings and the storm are caused by atmospheric conditions that work through mechanisms involving air pressure (as well as factors such as temperature and topography). Yet the D-N model allows the change in barometric readings to count as an “explanation” of the storm, and cannot distinguish between the explanation via barometric readings and that via air pressure and other mechanisms.259 In some instances, a good predictive capability may suffice to guide decisions or policy choices, and in the colloquial sense we may use the term “cause” for such phenomena. For many years, for example, smoking was considered on the basis of statistical evidence to be a “cause” of cancer, and the evidence was sufficiently strong to dissuade many people from smoking. Only recently have the intervening mechanisms through which smoking causes cancer become better understood, bringing us closer to a causal explanation in the scientific sense. A better understanding of the mechanisms through which smoking contributes to cancer can lead to better predictions about which individuals are more likely to develop cancer from smoking and better means of prevention and intervention to reduce the risk of cancer.
A second problem with the D-N model is that its predictions must be rendered with perfect certainty. If laws are to predict outcomes with absolute certainty, then the model founders in the physical sciences on quantum mechanics, which renders quantum phenomena inherently probabilistic. In the social sciences, few nontrivial covering-law regularities hold with certainty across a wide variety of contexts.
For these reasons, philosophers and statisticians have labored mightily to construct a modification of the D-N model that would allow explanation to proceed in probabilistic terms rather than through exceptionless regularities. The “inductive-statistical” (I-S) model, for example, argued for using high likelihood as the standard for explanation, but did not specify how likely an outcome must be to be considered law-like. Must a phenomenon be 99 percent likely, or only 51 percent likely? What about phenomena that are rare but occur with statistical regularity under specified circumstances? The problem, as the philosopher of science Wesley Salmon argues, is that the D-N model’s two components, regularity and expectability, can conflict with one another. Salmon notes that “a particular event, such as a spontaneous radioactive decay, may be rather improbable, yet we know the ineluctably statistical laws that govern its occurrence. The nomic [regularity] side [of the D-N model] is fulfilled, but the expectability side is not.”260
The failure of the I-S model prompted other attempts at probabilistic explanation. One of these is the “statistical relevance” or S-R model, which suggests that factors are causal if they raise the probability with which an outcome is to be expected, whether the resulting probability is high or low. The S-R model and other probabilistic approaches to explanation remain unsatisfactory, however.261 Salmon, a pioneer of the S-R approach, recounts his