A celebrated mathematician explores how math helps us make sense of the unpredictable
We would like to believe we can know things for certain. We want to be able to figure out who will win an election, if the stock market will crash, or if a suspect definitely committed a crime. But the odds are not in our favor. Life is full of uncertainty --- indeed, scientific advances indicate that the universe might be fundamentally inexact --- and humans are terrible at guessing. When asked to predict the outcome of a chance event, we are almost always wrong.
Thankfully, there is hope. As award-winning mathematician Ian Stewart reveals, over the course of history, mathematics has given us some of the tools we need to better manage the uncertainty that pervades our lives. From forecasting, to medical research, to figuring out how to win Let's Make a Deal, Do Dice Play God? is a surprising and satisfying tour of what we can know, and what we never will.
Ian Nicholas Stewart is an Emeritus Professor and Digital Media Fellow in the Mathematics Department at Warwick University, with special responsibility for public awareness of mathematics and science. He is best known for his popular science writing on mathematical themes. --from the author's website
Half of the book is taken up with anecdotes, historical developments in the sciences and mathematics, and biographical sketches of key figures, all with the central theme of recognizing and quantifying uncertainty - Babylonian divination, the evolution of dice, astronomy, quantum physics... I found this aspect of the book engaging and fascinating. Delightful reading.
The other half of the book is devoted to the actual mathematics of uncertainty as it evolved along the way. Stewart is clearly adept and conversant with the concepts, terms and calculations... unfortunately he assumes the reader is equally so. This content is simply incomprehensible to me, and I labored long to try to follow along. He presents formulas, and invites the reader to solve them... but that implies a level of conceptual understanding and mathematical divination that I suspect escapes the average reader. It certainly escaped me. He talks in concepts and uses terms that he assumes the reader already knows and comprehends... it is as though he is talking to an audience of grad students with a background in uncertainty principles and mathematics. I suspect such a reader will enjoy the entire book.
A big history of mankind's encounter with uncertainty, from early superstitions like haruspicy (or the belief on Corfu that "when you see a praying mantis, it either brings good luck or bad luck depending on what happens"), up until the attempt to generate sufficiently random numbers for cryptography. Stewart's overall narrative is that humans were able to tame and understand uncertainty, eventually seeing everything as the result of deterministic material processes, until Quantum Mechanics restored uncertainty to the core of the natural world (thus Chaos preceded Cosmos, as the Greeks said); though he suspects that even here there might be a hidden deterministic explanation yet to be uncovered. It remains to be seen if God's dice are truly random.
Stewart is a maths professor at the University of Warwick, and a prolific author of popular (The Science of Discworld), semi-popular (Concepts of Modern Mathematics) and downright unpopular (Galois Theory, 3rd edition) books on mathematics. As such he doesn't shy away from trying to explain even the thorniest corners of his topic. Even with extensive rewinding, I got a bit lost in chapter 10 (nonlinear dynamics, aka chaos theory, and the underlying Partial Differential Equations) and 17 (where he attempts to use those theories to explain why there might be hidden variables explaining the oddities of quantum physics, contra the Bell Inequalities).
Chapter 5 contains a fascinating history of statistics, drawing on the work of Stephen Stigler. The field originated in astronomical observations with randomly distributed errors, comprising highly overdetermined systems of equations. Euler tried and failed to solve for the variables. The solution was discovered by Tobias Mayer: strategically grouping similar equations and combining them. Combining the equations did not compound the errors but actually reduced them, since, being randomly distributed, they tended to cancel each other out. Legendre formalised this as finding the approximation to a system of linear equations with the smallest squared error: the method of least squares (the fundamental idea behind linear regression). Earlier, Abraham de Moivre had discovered that Bernoulli's binomial distribution is approximated by the normal distribution, and Gauss and Laplace developed the Central Limit Theorem, which states that the sum of many independent random variables, however they are individually distributed, tends toward a normal distribution at a big enough scale (with certain caveats. Stewart is a stickler for details, sometimes to the detriment of his prose, as when explaining the Birthday Paradox he detours to note that births aren't evenly distributed around the year, or that we are assuming the year in question isn't a leap year. Alright, it's just an example!) The chapter ends with the formal axioms of probability, which Kolmogorov generalised to include infinite probability spaces using measure theory. Phew!
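To make the Mayer-Legendre story concrete, here is a minimal sketch of the least squares idea (mine, not the book's, and assuming numpy): an overdetermined system with noisy right-hand sides, solved by minimising the squared error. The data and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# True parameters we pretend an astronomer is trying to recover.
true_params = np.array([2.0, -0.5])

# 50 "observations" for only 2 unknowns: a highly overdetermined system,
# like Mayer's libration equations. Each row is one observation equation.
A = rng.uniform(-1, 1, size=(50, 2))
noise = rng.normal(0, 0.1, size=50)        # randomly distributed errors
b = A @ true_params + noise

# Legendre's least squares: choose x minimising ||Ax - b||^2.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print("true:     ", true_params)
print("estimated:", x)  # close to the truth: the 50 errors largely cancel
```

With fifty noisy equations for two unknowns, the estimate lands much closer to the truth than any single observation would, which is exactly the cancellation Mayer noticed.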
Adolphe Quételet was a student of Laplace, and the first to use statistical methods for demography, criminology, and public policy. (He also invented BMI.) Chapter 8 introduces Bayesian reasoning and conditional probability, allowing you to reason about a future event without having trial data - you assign a prior and then update it as you get more evidence. Such reasoning can be a powerful tool in court cases, but has frequently been rejected by innumerate judges as counterintuitive. (See the miscarriages of justice of Lucia de Berk and Sally Clark, the latter of whose case prompted a statement by the Royal Statistical Society.)
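As an illustration of the kind of conditional-probability reasoning the chapter describes (the numbers below are hypothetical, not Stewart's), here is the classic base-rate calculation that trips up courts: evidence that is rarely wrong can still point to the wrong answer when the prior is small.

```python
# Bayes' theorem: P(guilty | match) =
#   P(match | guilty) * P(guilty) / P(match)
# Hypothetical numbers for illustration only.

p_guilty = 1 / 10_000               # prior: one plausible suspect in 10,000
p_match_given_guilty = 1.0          # the evidence always matches the culprit
p_match_given_innocent = 1 / 1_000  # false-match rate

p_match = (p_match_given_guilty * p_guilty
           + p_match_given_innocent * (1 - p_guilty))

posterior = p_match_given_guilty * p_guilty / p_match
print(f"P(guilty | match) = {posterior:.2%}")  # ~9%, not 99.9%
```

A 1-in-1,000 false-match rate sounds damning, but with a 1-in-10,000 prior the posterior probability of guilt is only about 9%: the prosecutor's fallacy in miniature.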
Chapter 9 covers statistical mechanics and its seeming randomness: at the macro level the behaviour of a gas is predictable, but at the micro level it is random. Still, in principle the random motion of molecules could push all of the air in a room to one side, making it impossible to breathe - the scenario Maxwell's demon dramatises: it is only the vanishingly low probability of such events that gives us the Second Law of Thermodynamics. Laplace's Essai philosophique sur les probabilités claimed that the universe was deterministic. In practice, however, it's basically impossible to measure all of the initial conditions - chaos theory shows that tiny changes in initial conditions can have an enormous effect when run through the differential equations.
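A minimal way to see that sensitivity (my sketch, using the standard logistic map rather than anything specific from the chapter): two trajectories starting a billionth apart become completely unrelated within a few dozen steps.

```python
# Logistic map x -> r*x*(1 - x), chaotic at r = 4.
r = 4.0
x, y = 0.3, 0.3 + 1e-9   # initial conditions one part in a billion apart

for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |difference| = {abs(x - y):.3e}")
# The gap grows roughly exponentially until the trajectories are unrelated.
```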
Which brings us to weather. Since the atmosphere is a chaotic system, we don't just take observations and then try to solve the equations. Instead, meteorologists run their model, randomly perturb the initial conditions, run it again, and so on. The different runs will tend to converge on the likeliest behaviour, so we can give a percentage confidence level for our prediction. Nowadays an ensemble of models is used, so as to vary the assumptions as well as the conditions. Edward Lorenz, the original discoverer of chaos theory and the "butterfly effect", claimed that there is a hard limit on prediction accuracy of about a week, since fine-level errors propagate into the high-level system; but new theories and ever-improving computer hardware have improved this a little bit.
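Here is a toy version of that ensemble idea (my sketch, assuming numpy; real forecast models are vastly more elaborate): integrate Lorenz's own three-variable system from many slightly perturbed copies of the observed state, then read a probability off the ensemble.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8/3):
    """One Euler step of the Lorenz system."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(0)
observed = np.array([1.0, 1.0, 20.0])

# Ensemble: 200 copies of the observation, each randomly perturbed,
# standing in for our imperfect knowledge of the initial conditions.
ensemble = observed + rng.normal(0, 0.05, size=(200, 3))

for _ in range(1500):                     # ~15 time units
    ensemble = np.array([lorenz_step(m) for m in ensemble])

# "Forecast": fraction of ensemble members with x > 0 at the end.
p = np.mean(ensemble[:, 0] > 0)
print(f"P(x > 0 at forecast time) = {p:.0%}")
```

Early on, the members agree and the forecast is near 0% or 100%; run the system long enough and the ensemble spreads over the whole attractor, which is Lorenz's prediction horizon in miniature.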
Chapter 12 looks at the most common modern uses of statistics: the battery of tools for scientific trials developed by Ronald Fisher, Karl Pearson, and William Gosset (better known by his pseudonym "Student"), much of it at the still-extant Rothamsted Experimental Station (for agricultural research). The "double-blind" randomised controlled trial, p-values, t-tests and chi-squared tests all come from them. Earlier, the polymath (and famous eugenicist) Francis Galton had developed regression to the mean and the standard correlation measures, with Pearson coining the term "standard deviation". In 1958 David Cox invented logistic regression for binary variables, and in the 1970s the toolkit was bolstered by the "bootstrapping" technique for estimating sample statistics. Stewart stresses the importance of careful and ethical study design, to avoid getting to a predetermined answer, answering the wrong question, or harming the study participants.
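The bootstrap idea is simple enough to show in a few lines (a sketch with made-up data, not the book's example): resample your sample, with replacement, and use the spread of the recomputed statistic as an estimate of its uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=40)   # made-up skewed data

# Bootstrap: recompute the mean on 10,000 resamples drawn
# with replacement from the original sample.
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(10_000)
])

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean = {sample.mean():.2f}")
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

The appeal is that no distributional formula is needed: the sample itself stands in for the unknown population.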
Quantitative methods began to take over economics through the work of Léon Walras, who posited a model of prices in which agents with different utility functions trade off their own desires against those of others. John von Neumann showed that such a market has an equilibrium, using a theorem from topology(!), Brouwer's Fixed Point Theorem. Bachelier, a student of Poincaré, was probably the first person to apply advanced mathematics to finance, in his 1900 doctoral thesis, using Brownian motion (the random jitter of suspended particles) to model how the variance of an option's price spreads over time: it is a random walk.
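In Bachelier's spirit (my sketch, using his arithmetic Brownian motion rather than the log-normal model modern finance prefers; all numbers invented), a price is a random walk whose uncertainty fans out with the square root of time.

```python
import numpy as np

rng = np.random.default_rng(7)

n_paths, n_steps, dt, vol = 10_000, 250, 1/250, 0.2
# Bachelier-style random walk: price changes are independent
# normal increments with standard deviation vol * sqrt(dt).
increments = rng.normal(0, vol * np.sqrt(dt), size=(n_paths, n_steps))
paths = 100 + np.cumsum(increments, axis=1)    # start every path at 100

# The spread of outcomes grows like sqrt(t), as Brownian motion predicts.
for step in (25, 100, 250):
    t = step * dt
    print(f"t = {t:.2f} yr: std of price = {paths[:, step-1].std():.3f}"
          f"  (theory: {vol * np.sqrt(t):.3f})")
```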
The last couple of chapters take us back to quantum physics. (Stewart is always game to try explaining dense topics, and he goes for some of the toughest.) The "double-slit" experiment shows how light behaves both as a wave and a particle, depending on the observer. Then the famous "EPR" paper (authored by Einstein, Podolsky and Rosen) tried to disprove QM by showing that quantum entanglement implies "spooky action at a distance". I have always remembered Bill Bryson's analogy from A Short History of Nearly Everything: it is as if "you had two identical pool balls, one in Ohio and the other in Fiji, and the instant you sent one spinning the other would immediately spin in a contrary direction at precisely the same speed". (Looking it up now I see he attributes it to the science writer Lawrence Joseph.) As to the question of whether there is some undiscovered deterministic mechanism explaining quantum uncertainty, Stewart holds out hope. He believes that the "shut up and calculate" school of thought (a bald statement of the "Copenhagen interpretation") is an artefact of logical positivism, and maybe military influence on the Manhattan Project - they wanted scientists to focus on the practical. Anyway, he argues, if a deeper answer is out there, we won't find it unless we look. (As every conspiracy theorist knows!)
I've omitted a huge amount from this review: on neuroscience, climate change, lots more quantum mechanics, and a host of other topics. Stewart is a brilliant, inexhaustible source of knowledge, and his capacity to write exceeded my ability to imbibe and record. Highly recommended.
Ian Stewart develops the six ages of uncertainty by delving into everything from early thoughts about probability to quantum mechanics. I picked up the book to determine whether it could be used as a book for AP Statistics students to develop a bigger picture of statistics/probability. It definitely does so, but may be too much for many of them. Still, there is an audience among my students who will like the book and enjoy the many ways humans have bumped up against uncertainty over time.
It's a good book with Stewart's usual gift for clarifying heady concepts. For what it's worth, though, it didn't feel quite as engaging to me as several other introductory texts (like "The Drunkard's Walk," I believe it's called) or even some of Stewart's own past works. His metaphors tend to work, but some of the concepts are so complicated they just fail to connect with the layman. On average (you're welcome), a good read.
I’m uncertain who the audience is for this book. As others have mentioned, it is “math heavy”. Though I wouldn’t say it is heavy on math so much as heavy on logic. I found those parts engaging, even when I couldn’t always follow. Ian Stewart isn’t an innovative or super nimble communicator of tough ideas, but he has some ability to simplify hard topics and can do so with some buoyancy. Balancing this with some descriptions of the history makes it a decent read for novices but not amateurs. His major detour into global warming was quite a soapbox. He debated some rather low-bar opposing arguments that felt unnecessary for his readers: “You think one degree in a century is hardly bad news, think again”. Considering the rest of the book requires more than a base-level understanding of logic and math, I don’t think this was a worthwhile diatribe. Some half-baked critiques of the U.S. withdrawing from the Paris agreement, and then we finally get back to the mathematics of uncertainty 15 pages later. The concept of the book was a big undertaking. Either whittle it down to fewer topics and keep the dense mathematical descriptions, or focus on the history and impacts that the many eras of uncertainty have had, for a more amateur audience.
A+ title. Book itself was also super interesting. I remember complaining about my previous non-fiction book that it was too pop and not enough science. This book did not have that problem. Did I get confused and lost sometime around quantum mechanics, as usual? Of course. Was it fun anyway? Absolutely. Is this at all useful for understanding the halakhic concept of safek? I’m not sure, although it seems clear that safek is from the Bayesian rather than the frequentist school. Also the deeply British mix of enthusiasm and dry observations definitely added to the appeal. I have no idea how enjoyable this would be to someone familiar with statistics and the science of uncertainty, but I certainly enjoyed it.
The statisticsy side of this was interesting, but he kept going on about quantum physics, so I skipped several entire chapters, and I don't think I could tell you one thing I actually learnt that wasn't on the A-level maths spec or too confusing to understand.
Ah, mathematics, my unrequited love... I unexpectedly discovered a wonderful new author: a popularizer of mathematics almost as interesting as Martin Gardner (not the Game of Thrones Martin, a completely different one). And this Ian Stewart is great too, but he's different. I'll now stock up on his books on Litres and be set for reading until summer. This particular book is devoted to uncertainty in the broadest sense of the word, and to the mathematical theory of chaos in the narrow sense: from divination by entrails to quantum theory, by way of the Pascal-Fermat correspondence, the method of least squares, strange attractors, the prediction horizon, the outdated Copenhagen interpretation of quantum mechanics, the differences between the frequentist and Bayesian foundations of probability theory, and why the more precisely we measure meteorological parameters, the worse the weather forecast gets. I finally understood what a strange attractor is! Or at least gained the confidence that I understood. But I have to admit, by the final chapters I was lost; I grasped only the conclusion: God (or Nature, whichever you prefer) does play dice after all, and Einstein was wrong on this point. Though I had already learned that a few years ago. The book's only shortcoming is down to the publisher: the digital version is available only in PDF format, so it can't be read properly on either an e-reader or a tablet. It would be one thing if it were full of illustrations or special formatting. But no, it's plain text. So there you have it.
The book starts like a proper probability refresher, with much-needed anecdotes here and there. As the content increased in complexity, so did the stories that came with it, until a certain point. Then the book takes a deep dive into quantum physics, which, for me, completely sucked out all the interest I had in quantum physics while I was reading it. Then he eventually comes out of it and finishes the book with a bang.
Except for the quantum physics, which went miles over my head, most of the concepts are conveyed very interestingly. A lot of paragraphs were snapshot-worthy.
This book will be an easy choice for anyone who loves dealing with randomness and uncertainty.
I loved the description of "shut up and calculate" quantum mechanics. I also liked the description of oil droplets (particles) on an undulating surface of oil (waves) as a classical analog of the double slit experiment.
Covers a comprehensive range of topics that all relate to probability theory. Ian explains each concept with great clarity so non-experts can still grasp the subject.
It may be a bit too in-depth for some readers, but I really appreciated the coverage dedicated to each topic.
Do Dice Play God is a wide-ranging exploration of the mathematics of uncertainty. Writing with a degree of wit and clarity, Stewart makes some rather complex topics generally accessible, up to a point. However, he lost this reader in a few sections, particularly when he goes into the intricacies of quantum physics.
Over the course of the book, he organizes the chapters into what he describes as the six ages of uncertainty, from explanatory mythology to quantum physics and chaos theory, where in each “age” humans have made advances in how we categorize, make sense of, and try to wrangle uncertainty.
Over the course of the text, Stewart explores gambling, astronomy, law, thermodynamics, weather & climate science, medicine, the brain & perception, and quantum physics. Much of this covers similar ground to Nate Silver in The Signal and the Noise and to Charles Wheelan in Naked Statistics. I felt both of those books provided slightly more accessible versions of this content, but they also narrowed their attention to a subset of what Stewart covers here.
This is a strong text for anyone interested in how we make sense of the uncertainty inherent in our existence, in which we can conceive of a future but have no ability to control it. A concept with which we all should grapple.
This is a nice book about statistics, probabilities, mathematics, or uncertainty, or chaos, or knowledge, or physics, or reasoning... I am not sure anymore! Chaotic in itself, it was still fun. Since I am into this kind of book, I would dare to say that most of the information it contained was not new to me. The book was enjoyable despite the fact that it jumps a lot, without a clear vision, from one thing to another. Another problem with these "jumps" was that the book had some steep curves: it would leap from very simple notions to very complicated ones in no time. Sometimes I had to read through a part more than once to make sure I got everything straight. Suggested for lay people standing in the grey zone of geekiness.
A great book on uncertainty, from the origins of probability theory and statistics to quantum physics. Although the title clearly is a reference to Einstein's remark on quantum physics, for me the book spent too long on that topic. As quantum physics is more complex and less easy to follow, those chapters took a lot longer to read and reduced the pleasure I'd had in reading the rest of the book. Though I do find the topic interesting, I enjoyed the rest of the book a lot more and even learned some things about the history of probability theory and regression. Would recommend to anyone interested in the topic of (the history of) uncertainty, probability theory or statistics.
In Do Dice Play God? Ian Stewart attempts to map the evolution of our understanding of uncertainty. However, for a popular science book, it was a surprisingly difficult read.
Structurally, the book is prone to excessive detours. Rather than a focused exploration of probability, Stewart spends significant time on the history of superstition and the mechanics of gambling. While the gambling analogies are helpful at first, they eventually feel repetitive. Furthermore, he often wanders into tangents, such as global warming, which feels like a distraction from the core mathematical thesis.
Casual readers seeking a clear introduction to probability will likely find it frustrating.
A good book for revisiting concepts of probability and combinatorics, introducing yourself to chaos theory and nonlinear dynamical systems, and seeing their applications to many aspects of uncertainty in the physical world.
Further reading on some concepts is necessary to obtain a holistic understanding of the broad theme of the mathematics of uncertainty. The book, however, is just the right introduction to a deeper exploration of how humanity is attempting to answer open questions and getting better at problems we don't have perfect solutions for, yet.
This book is a perfect overview of basically everything I'm currently interested in, starting with a bit of history of mathematical theories of probability and statistics, forecasting, and chance, with a dive into the biases of our brains and chaotic systems, and topped off with an interesting polemic on quantum mechanics, determinism and randomness. It was full of interesting ideas that really made me rethink some of my opinions, very well written and readable, and equipped with subtle humor. I really loved it.
I wish I had read this book during my introduction to probability course. It is a far more interesting, compelling, engaging, and intuitive explanation of probability theory, uncertainty, and statistics than the material that I studied when I was getting my degree in statistics. The historical and narrative components are well complemented by the rigorous explanation of key probabilistic principles which I wish everyone would learn. I would highly recommend this book as a primer for anyone who needs to learn more about probability theory, randomness, and uncertainty.
Ian Stewart walks readers through several ages of uncertainty in human history, with each era producing a mathematician or two who leaps humanity forward into slightly less uncertainty. Generally, we are bad at uncertainty - measuring it, understanding the extent of it, working around it. He discusses current uncertainty issues, from those facing physicists via quantum mechanics to the potential for Bayes' theorem to corrupt trials. It starts off fairly easy to follow but does get harder to comprehend in later chapters, despite a number of efforts at analogies.
A good overview of the principles of probability and uncertainty, with examples drawn from coin tosses, dice, and climate versus weather. Statistics, cryptography, quantum physics and gambling are all discussed at length. There are some very good explanations of why weather forecasts are so uncertain while climate change is very certain. There are actually quite a few good examples and analogies to walk you through the basic ideas of mathematical uncertainty.
Interesting coverage of probability and randomness in history and daily life. I got a lot out of the comparison and background on the frequentist and Bayesian interpretations. There were parts that seemed too narrowly focused on individual case studies and long tangential examples, but overall it kept to the task of showing where we've been, establishing where we are, and exploring where we're going.
DNF'ed at Chapter 3. It's interesting and all, but how can one man be so strict about the rules of science and mathematics and completely ignore the rules of the English language? Stewart makes the incomprehensible choice to ignore the singular "die" in favor of using "dice" for both singular and plural, which wouldn't bother me so much except for how much more confusing that is, and that this is the same guy who is so concerned with rationality in the human brain. Make it make sense!
A book originally intended to be titled "The Six Ages of Uncertainty" gets hastily renamed and receives unwarranted attention. Some chapters, particularly from the beginning all the way to entropy, are informative and sometimes entertaining. In later chapters the author mixes his opinions with facts too freely, and loses the ability to write accessibly.
I like popular science books and I like reading about and trying to understand probability and statistics, but I struggled to get through this and wasn't motivated to finish. Maybe it was trying to cover too many different things without a clear focus, approach or writing style that worked for me. Might just be me, but there are better, easier-to-read books on this topic.
Wow, this was really full of math. A little too much for my taste, but anyone who is into stats would find this book interesting. The first portion of the book was very heavy on the math side, but as the book progressed it became more about the theories and examples of things, which were far more interesting than the next institutional lecture.
A very good introduction to statistics and a number of applications in the real world. I saw this on the shelves of one guest on a TV show.
The title was large enough to read. I've been in tech for 40 years, working with statistics and probabilities fairly often, so I had to check it out. Worth the read.
Quite enjoyed this one. Read the whole book twice in a month and read another five chapters a third time just for good measure. A lot of the ideas and history were not new to me, but were presented in a way that simplified the concepts without diminishing their nuance and complexity. Sign me up for more reading from this author.