Ancient, published in 1992, but still interesting, if not entertaining, even if much of its contents has since been rehashed in later works. It remains poignantly relevant to our extensive modern online discourse, where reason is often not the primary concern.
Sutherland spends roughly half the book explaining the different types of irrationality and then uses the remainder to highlight how these work in practice.
Judging by the first thing that comes to mind is called the availability error.
Related are the halo and devil effects: if a person has one salient good, or bad, trait, it overshadows and colors the person's other properties.
Being more influenced by early than by late items in a list is called the primacy error.
The habit of obedience is so ingrained that people can act out of obedience without even knowing that they are doing so. And that's on top of what happens when obeying is central to your position, like in the military, or only has consequences for outsiders, like when you are not directly confronted with the results of your actions.
Related, but different, is conformity, behaving in the same way as one's peers. Connected to this is the boomerang effect, where people become more convinced that they are right when their (public) beliefs are challenged. In other words, it's near-impossible to convince someone he's wrong.
Conformity to crowds can lead to panic and violence. And religious conversion.
Within groups, opinions tend to shift beyond the typical, average, opinion of the individuals. Liberals, in a group, tend to become more liberal, conservatives more conservative. Essentially, this is groupthink.
This means that committees, as subgroups of in-groups, tend to be more extreme than the group from which they are drawn.
One consequence of the in-group/out-group dichotomy is that competitive sports, even friendly matches, typically foster animosity rather than friendship. To alleviate this, members of different groups can cooperate to work towards a common goal, though this is only shown to bring people together when the goal is achieved. Otherwise, prejudice is maintained.
As a consequence, organizations, particularly public ones, tend to make irrational decisions. Opening up the decision process to the public can alleviate this to some extent.
Misplaced consistency is when people hold on to initial decisions for the sake of not being seen to backtrack, whether publicly or not. It's why people tend to not cut their losses. Hence the 'sunk cost error'.
Tests show that rewards devalue any activity considered worth doing in its own right. The activity in question will be valued less once the rewards are no longer handed out.
As a consequence, a carrot for promoting good behavior does not facilitate institutional change.
The exception is praise: a non-monetary reward that typically has a positive influence on the future completion of tasks.
In the obverse, mild threats produce stronger results than strong threats.
Relatedly, large rewards and strong motivation, but also stress, foster inflexibility when attempting to solve a problem.
When a particular belief is involved, we tend to go to extreme lengths to look for supporting evidence while refusing to believe contrary evidence. If that fails, we distort existing evidence.
Specifically, evidence favoring a belief strengthens it, while the same evidence, when it contradicts the belief, is ignored.
A very surprising consequence is that we seek confirmation of our own opinions of ourselves, even when they are derogatory.
Drawing causal conclusions from unrelated events is called illusory correlation. This is typically reinforced by prior expectations: being blind to non-matching results while having an eye for comparative, but possibly meaningless, outliers. And statistics, though essential for, say, diagnosis, is a bitch.
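To make the diagnosis point concrete, here is a minimal sketch of base-rate neglect using Bayes' theorem; the prevalence, sensitivity, and false-positive rate below are made-up numbers for illustration, not figures from the book.

```python
# Base-rate neglect: how likely is disease given a positive test result?
# All numbers are hypothetical, chosen only to illustrate the arithmetic.
prevalence = 0.001       # 1 in 1000 people actually has the disease
sensitivity = 0.99       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease)

# Bayes' theorem: P(disease | positive)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")
# Roughly 0.019: even with a "99% accurate" test, a positive result
# here means less than a 2% chance of actually being ill.
```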
We tend to feel that something implausible is more likely to happen when it is paired with something very plausible, even though the pairing can only decrease the chances of it happening.
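A toy calculation shows why the pairing cannot help; the probabilities are invented purely for illustration.

```python
# Conjunction: P(A and B) can never exceed the probability of either event alone.
p_plausible = 0.8     # hypothetical P(A), the very plausible part
p_implausible = 0.05  # hypothetical P(B), the implausible part

# Even if B only ever occurs together with A, the conjunction is still
# bounded by the smaller of the two probabilities.
p_conjunction_upper_bound = min(p_plausible, p_implausible)
print(p_conjunction_upper_bound)  # 0.05 -- the plausible detail adds nothing
```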
Of similar difficulty is the ability to identify cause and effect. What's more, the more prominent the effect of the same action, the more we hold the agent responsible.
With hindsight on our side, we become overconfident in our ability to predict future outcomes of events.
Risk assessment is difficult for specialists and engineers, but nearly impossible for the general public: misunderstandings, fear of the unknown, untested technologies, overconfidence, and so on all result in the less specialized relying on a typically very bad gut feeling, underestimating risk in general or associating irrelevant imagery with the risk involved (nuclear energy, for example).
Sutherland also discusses utility theory, basically a statistical method that weighs not only the expected outcomes but also the usefulness of those outcomes.
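As a rough illustration of the idea (the gamble, the numbers, and the utility function below are mine, not Sutherland's): expected utility weighs each outcome by how useful it is to you, not just by its raw value.

```python
import math

# Hypothetical choice: a 50% chance of 100,000 versus a sure 40,000.
# Expected *value* favors the gamble; with a diminishing-returns utility
# function, the sure thing can still be the rational pick.
def utility(amount):
    return math.log1p(amount)  # illustrative concave utility

gamble = [(0.5, 100_000), (0.5, 0)]
sure_thing = [(1.0, 40_000)]

def expected_value(outcomes):
    return sum(p * x for p, x in outcomes)

def expected_utility(outcomes):
    return sum(p * utility(x) for p, x in outcomes)

print(expected_value(gamble), expected_value(sure_thing))      # 50000.0 vs 40000.0
print(expected_utility(gamble), expected_utility(sure_thing))  # ~5.76 vs ~10.60
```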
The related cost/benefit analysis, though it has its place, is limited by only addressing financial gains and losses.
In medicine, a similar technique is QALY, the quality-adjusted life year.
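A quick sketch of how a QALY comparison works; the treatments, years, and quality weights are entirely made up.

```python
# QALY = years of life gained * quality weight (1.0 = full health, 0 = dead).
# Hypothetical treatments, for illustration only.
treatments = {
    "treatment_a": {"years": 10, "quality": 0.6},  # longer life, lower quality
    "treatment_b": {"years": 7,  "quality": 0.9},  # shorter life, higher quality
}

for name, t in treatments.items():
    qalys = t["years"] * t["quality"]
    print(f"{name}: {qalys:.1f} QALYs")
# treatment_a: 6.0 QALYs, treatment_b: 6.3 QALYs -- the shorter but
# higher-quality option comes out slightly ahead on this measure.
```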
The book is full of conclusions that are fairly obvious to skeptics but represent hard-to-convey truths in practice. For one, there's a large gap between what people think they do and what they actually do. This may sound obvious, but in practice it means you hit a brick wall when pointing it out in the real world: in one study, 77% of subjects were shown not to read the warning labels on dangerous consumer products, while 97% claimed to do so when interviewed.
At the end, Sutherland attempts to list underlying causes for irrationality:
+ Evolution, where our ancestors typically had to make many crucial decisions under duress.
+ The analog, imperfect, nature of the brain favors generalizations.
+ Our implicit desire to take mental shortcuts when drawing conclusions.
+ Our general incapacity to apply basic statistics.
+ The self-serving bias.