Didn't feel quite as interesting as the first book. The title suggests something more concrete, but the content is at the same level of abstraction. Did provide some really interesting food for thought about the foundations of rationality - bootstrapping up belief systems, the empirical nature of logical axioms, Principle of the Uniformity of Nature, etc.
Notes:
• The Sophisticate: “The world isn’t black and white. No one does pure good or pure bad. It’s all gray. Therefore, no one is better than anyone else.”
The Zetet: “Knowing only gray, you conclude that all grays are the same shade. You mock the simplicity of the two-color view, yet you replace it with a one-color view . . .”
• "Politics is the mind-killer"
• "But studies such as the above show that people tend to judge technologies—and many other problems—by an overall good or bad feeling. If you tell people a reactor design produces less waste, they rate its probability of meltdown as lower"
• Anchoring in sequential estimation: students shown 1x2x3x4x5x6x7x8 gave a median estimate of 512, while students shown 8x7x6x5x4x3x2x1 gave a median of 2,250; the true product is 40,320 (quick check in the snippet after the notes)
• Anchoring is suspected to be due to priming, or "contamination" as it's called at the cognitive level
• "cognitive business", distraction, interruption seem to interfere with our ability to recognize and reject falsehoods, but not to accept truths. Proposed conclusion: we accept ideas as true be default, and it takes effort to reject
• Affect heuristic: people tend to judge things as overall "good" or "bad", even when the individual attributes being judged aren't actually correlated that way
○ Presenting information about high benefits made people perceive lower risks; presenting information about high risks made people perceive lower benefits
○ Exacerbated by time pressure
○ Causes halo effect
• Affective death spirals
○ Similar to what I called "building a philosophical system", where one idea lies at the crux of every issue and could solve every problem. Taleb with black swans, Walker with sleep, possibly me with incentive alignment
○ Probably the single most reliable sign of a cult guru is that the guru claims expertise, not in one area, not even in a cluster of related areas, but in everything.
• "Cognitive dissonance" explanation of increased backfire effect seems weak. Cognitive dissonance seems either like a mysterious answer, or just a term for "threat to established beliefs". In either case, the question of why views get stronger in response to dissonance instead of weaker still stands.
• EY's explanation is that the counter-evidence pushes everyone closer to leaving; the ones who cross the threshold leave, and the more extreme ones stay. But if this is like pushing a queue toward a cliff, the average should still decrease: the least extreme fall off while everyone else shifts toward more doubt, which leaves the same distribution as before minus its most extreme member. The only way this leads to an increase in fanaticism is if the people who don't change their minds don't shift toward doubt at all (or shift very little), or if fanaticism is distributed super-linearly in the first place (toy simulation after the notes).
• It's important to be tolerant of criticism directed at you, because if you eject, the group you leave will shift away from your position. If everyone ejects early, groups will tend to get more extreme very quickly.
○ Although I guess they'll also shrink very quickly. If everyone ejects at criticism, groups just collapse to the most extreme member shouting at nobody.
○ Makes sense given echo chamber intuition. Desire to avoid criticism is what leads to bubbles
• EY's "hate death spirals" are what I've been calling signaling races. It's where everybody wants to get their signaling and say how bad the enemy is, and the discussion (negative halo effect) doesn't allow consideration of the other side.
○ 9/11 hyper-patriotism reaction
○ Excessive political correctness/cancel culture
○ Red Scare
• To be one more milestone in humanity’s road is the best that can be said of anyone; but this seemed too lowly to please Ayn Rand. And that is how she became a mere Ultimate Prophet.
• The presence of a single dissenter dramatically decreases conformity bias, but only while they are dissenting. Immunity to conformity doesn't seem to be learned from past dissenters paving the way.
• Be careful about your self-image as a rationalist. It can lead to motivated investigation: thinking just hard enough to satisfy your self-image, but not hard enough to actually be optimal
• The principles of rationality are the same kind of laws as the Laws of Thermodynamics
○ Ong says writing enables syllogisms. But if logic is learned, not somehow "unlocked", how does this work?
• For ideas that you are attached to, consider what would happen if they were false, simply as a hypothetical. Once this "retreat" is planned out and mapped, hopefully the fear of abandoning the idea if it does turn out to be false will be reduced.
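On the anchoring note above: a quick arithmetic check (my own snippet, not from the book), just confirming how far both median guesses fall below the true product:

```python
import math

# 1x2x3x4x5x6x7x8 and 8x7x6x5x4x3x2x1 are the same product (8!);
# only the order of the anchoring first factors differs.
print(math.factorial(8))  # 40320, vs. median guesses of 512 and 2,250
```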
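On the evaporative-cooling note above, a toy simulation (my own sketch, not from the book; the distributions, shock size, and leaving threshold are all assumptions for illustration). Everyone's commitment shifts down by a fixed shock from the counter-evidence, members who fall below a threshold leave, and we compare the survivors' mean to the group's original mean. A thin-tailed (normal) distribution behaves like the queue pushed toward a cliff: the mean drops. A sufficiently heavy-tailed ("super-linear") distribution flips the result: losing the least committed outweighs the downward shift, and the mean rises.

```python
import numpy as np

rng = np.random.default_rng(0)

def surviving_mean(commitment, shock, threshold):
    """Shift every member's commitment down by `shock` (the counter-evidence),
    drop anyone who falls below `threshold` (they leave the group), and
    return the mean commitment of the survivors."""
    shifted = commitment - shock
    return shifted[shifted > threshold].mean()

# Thin-tailed commitment: the downward shift dominates.
normal = rng.normal(loc=5.0, scale=1.0, size=1_000_000)
print(normal.mean(), surviving_mean(normal, shock=1.0, threshold=0.5))
# ~5.0 before vs. ~4.0 after: average fanaticism decreases.

# Heavy-tailed commitment (Pareto, shape a < 2): truncation from below
# outweighs the shift, so the survivors are *more* fanatical on average.
pareto = rng.pareto(a=1.5, size=1_000_000) + 1.0
print(pareto.mean(), surviving_mean(pareto, shock=1.0, threshold=0.5))
# ~3.0 before vs. ~3.5 after: average fanaticism increases.
```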