Statistics lectures have been a source of much bewilderment and frustration for generations of students. This book attempts to remedy the situation by expounding a logical and unified approach to the whole subject of data analysis. This text is intended as a tutorial guide for senior undergraduates and research students in science and engineering. After explaining the basic principles of Bayesian probability theory, the book illustrates their use with a variety of examples ranging from elementary parameter estimation to image processing. Other topics covered include reliability analysis, multivariate optimization, least-squares and maximum likelihood, error propagation, hypothesis testing, maximum entropy and experimental design. The Second Edition of this successful tutorial book contains a new chapter on extensions to the ubiquitous least-squares procedure, allowing for the straightforward handling of outliers and unknown correlated noise, and a cutting-edge contribution from John Skilling on a novel numerical technique for Bayesian computation called 'nested sampling'.
This is not an ordinary introduction to Bayesian statistics. It is highly compressed, with extensive use of advanced matrix algebra and some very weird notation which may slow down your reading. You also have to master, to some degree, the physics of spectra and signals, for they are the subject of almost all the examples in the book. It can be hard to understand the logic of what the authors are doing if you do not know where they are heading. Otherwise, it is a fairly good and concise treatment of probability theory and Bayesian statistics.
Clearly a somewhat tougher read, but possibly not much more so than Pynchon's GR or Foster Wallace's IJ. Bayesian statistics is a whole different way of thinking about probabilities that makes a lot of sense.
This book is not only a good book to learn Bayesian statistics from, but also a great reference for the subject. Taking a very hands-on approach, it clearly presents the concepts and philosophy of Bayesian statistical analysis through lucid explanations and an abundance of well-chosen examples. In the second edition, a significant portion of the book is also dedicated to algorithmic implementation of Bayesian inference schemes; this material is accompanied by C source code snippets to really solidify the ideas behind the algorithms. My one issue with this book is that I wish more pages had been dedicated to discussing MCMC (Markov chain Monte Carlo) algorithms for sampling posterior distributions. Indeed, adaptive MCMC algorithms make up the majority of the sampling algorithms used in practice when a posterior distribution is not analytically tractable, but they are scarcely mentioned in this book.

Overall, I think this is the best book out there at explaining how to actually implement Bayesian analytical techniques on scientific or engineering data.
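The adaptive MCMC methods the reviewer mentions all build on the basic random-walk Metropolis idea: propose a perturbed parameter value, then accept or reject it based on the ratio of posterior values, so that the chain's samples end up distributed according to the posterior. A minimal sketch in Python (an illustration of the general technique only, not code from the book, which uses C; the function names and the Gaussian example are my own):

```python
import math
import random

def metropolis(log_post, x0, steps=20000, scale=1.0, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D posterior.

    log_post: log of the (possibly unnormalised) posterior density.
    Returns the full chain of sampled values.
    """
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, scale)   # symmetric Gaussian proposal
        lp_new = log_post(x_new)
        # Accept with probability min(1, post_new / post_old),
        # computed in log space to avoid underflow.
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

# Example: a standard-normal "posterior" known only up to a constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
burned = draws[5000:]                        # discard burn-in
mean = sum(burned) / len(burned)
var = sum((x - mean) ** 2 for x in burned) / len(burned)
```

Note that only the *ratio* of posterior values is ever needed, which is exactly why MCMC works when the normalising constant (the evidence) is unknown; adaptive variants additionally tune the proposal `scale` on the fly.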
I'm afraid I only got about halfway through this, and will return to it once I've brushed up on my vector calculus, as some of the maths was *very* unfamiliar to me (I suspect it's written with experimental scientists with a background in physics or engineering in mind, rather than compsci geeks). Despite that, though, I think I managed to absorb a lot of the principles of, and the rationale behind, the Bayesian approach to statistical analysis, and I look forward to understanding the detail once I've dealt with some of the mathematical dependencies!
I enjoyed the first few chapters of this book. They were a good introduction to Bayesian inference even if you don't know much about statistics. Unfortunately this fell to the bottom of my priority list, and I took so long reading the rest of it that I didn't get anything out of it or really follow it. My fault, not the author's.