Willard C. Humphreys takes a stance against the importance of "the given", as it is termed in other contexts, in scientific explanation. He provides a criticism of the myth of pure empiricism that subsists even in philosophies of science that attempt to provide a logico-deductive justification for the validity of scientific theories (he names figures such as Hempel and Braithwaite). Such theories commit the mistake of reducing the extension of knowledge to a repetition of syllogistic-type certainties derived from generalizations, against which Humphreys postulates the notion of an anomaly context, based on the inherently theory-laden character of observation. The background theory provides a context against which any anomaly is perceived as such; in its absence, no clear identification is possible of which aspect of the observed object should be considered as requiring an explanation. Given this background in theory, the explanans of the anomaly must deviate as little as possible from the assumptions that make it possible in the first place. Counterfactual laws are a necessary part of this project, and this necessity seemingly overcomes mere pragmatism or preference, since meaningful observations cannot emerge out of an indefinite mass of ready-made alternatives: they must be referred to existing laws. The fact that laws are approximative rather than true in the literal sense emerges as an argument in their favour, as laws are treated as boundary conditions within which observation is possible.
One of the most interesting parts of the book is Humphreys' treatment of probability. Even estimates of probability in science appear theory-laden in that they must refer to an underlying physical theory to make sense of individual cases - a theoretical issue that lies at the heart of quantum computing and of the relation of the statistical character of quantum theory to probabilistic estimations of the outcome of a single measurement. Humphreys advances this view against the traditional alternatives of Bayesianism and frequentism, as well as against Carnap's merely linguistic-pragmatic concept of probability. He refutes the notion of a simple event and attends to the breakdown of Bayesianism that follows from the absence of any simple given: probabilistic events may be redistributed at will by recombining them with others, resulting in different probabilities and statistical interpretations. It is a good criticism, but arguably his own idea falls into the same quagmire by a different route: while it makes sense to refer the probability of an individual event to a law, we can always make that aspect contingent by asking about the probability that the next batch of observations will fulfill the statistical criteria of the law or not. This can no longer be processed from within the context of the statistical law of, say, interference fringe distributions (what he calls "semicausal"). This highlights a need for a quantum theory that surpasses the uncertainty principle, even as Humphreys devotes a whole chapter to criticizing the Bohmian explorations.
Humphreys' argument for the supra-pragmatic character of laws has merit in that it shows how such considerations direct even the set-up of experiments. There is a Kantian hue in how he does it, but there is, in fact, a remarkable difference: Humphreys' theory avoids reifying generalizations as necessary while still distancing itself from the radical empiricism of Hume et al. as well as from linguistic philosophies and pragmatisms. Humphreys' conception of law, and his demonstration of its effect on practice via the example of Newton's discovery of the variable refrangibility of light rays, showcases a world-view based not on generalizations or even their necessity but on laws as grounds for surprise. Referring from empirical manipulations to counterfactual idealizations and back, from is to ought to is, from representation to will and back, from the phenomenal to the noumenal and back: all of this rests on using certain scientific concepts as conditions so that one can have new thoughts about the empirical.
In the end, when pursued beyond the boundaries of this book and beyond the tacit assumption that there is only one progressive accumulation of ideas, Humphreys' ideas become simultaneously more problematic and more intriguing. The book flirts with very interesting ideas I haven't read elsewhere but falls back on the self-justification of the theoretical scaffolding of modern physics: for example, his central point is that a single anomaly is not enough to overthrow a theory but should instead be given new interpretations. This is all well and good from the perspective of modern cosmology, buttressing the positing of dark matter as a solution to the anomalies of galactic movement. However, it overlooks the discrepancy between the two cases: the anomalous movement of Mercury resulted in a complete reformulation of every other aspect of the theory, whereas the anomaly of galactic movement was answered by positing a form of matter that is by definition incapable of being seen, since it does not interact with light. The explanations of this book, then, fail to account for the conditions of full theory change, which in turn leaves the main ideas, too, somewhat obscured.
If we still accept the merits of the theory-anomaly relation posited here, its interesting aspect is revealed in the absence of tacit conventionalism/progressivism: to understand a phenomenon means in some way to classify it as something that should not have happened. All the rest is "reality". Can this work as a method of knowledge extension right from the start? One thinks of the concepts of cosmos and God: within a cosmos, everything that happens must have happened, while with God we approach the idea that a brute fact should not have happened and stands in need of explanation. So while the idea of God entirely lacks the tacit rationality of the cosmic worldview, it actually generates a need for explanation by detaching the universe from the idea of things being in their proper place. The second interesting point is that the ceteris paribus character assigned to physical laws means absence becomes worked into the fabric of explanation: we know that God's most important creation came on the seventh day, when he created "nothing". In explaining things, scientists emulate this master-image by creating instrumental arrangements that compare the behaviour of objects under real conditions with how an object pulled from a system characterized by the absence of those conditions would behave in them. Obviously there is immense room for error and divergent pathways here.
The main feeling I get from reading things like this is bewilderment that a scientific process connected with such mighty technology is in many ways so rickety. At every turn it does seem that these models are not to be taken literally as accounts of what is there, but then the question of what is there behind this energy becomes equally interesting. The feeling is pacified a bit by the understanding that the kinds of tactics used for experimenting with refraction laws and photoelectric effects, so that their anomaly context may be properly established, cannot be compared so easily with the methods of astrophysics, which should have an entirely different philosophy of what is the correct way to proceed.