Considering that this book was published in 2000, one might expect the technology of quantum computing to have progressed significantly in the 15 years since then. In fact, the picture is mixed. Much of the theory of quantum computing, the algorithms in particular, was already in place in 2000. There are newer algorithms, of course, but the most important ones – such as Shor's algorithm for factoring and Grover's algorithm for database searching – were already known.
Yet even now, the technology for actually building a quantum computer is still in its infancy. (That's excluding the so-called "adiabatic quantum computers" of D-Wave Systems, whose status as genuine quantum computers is still controversial, and which are not covered in Brown's book.) The book discusses four candidate technologies that were known at the time: ion traps, cavity QED, NMR (nuclear magnetic resonance), and quantum dots. All of these have troublesome limits, to a greater or lesser extent, on switching time (the basic speed of operations), "decoherence" time, and scalability. (Decoherence is the problem that qubits cannot be kept in a state suitable for computation for very long, because external influences disrupt them.)
At present, somewhere around 16 different technologies have been seriously considered, yet the problems mostly remain – especially the problem of decoherence. A useful quantum computer needs to be able to process 1000 or more qubits at a time, perhaps a lot more. For example, just representing a large number of, say, 500 decimal digits (as would be needed for highly secure encryption) takes 1661 qubits, since that is how many bits its binary form requires. Several times as many qubits, at least, would be needed for any operations on such numbers.
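As a quick sanity check on that figure (my own back-of-the-envelope arithmetic, not from the book): each decimal digit carries log2(10) ≈ 3.32 bits of information, so a 500-digit number needs ⌈500 × 3.3219⌉ = 1661 bits, and hence 1661 qubits in a plain binary encoding.

```python
import math

# Qubits needed to hold a 500-decimal-digit number in plain binary:
# each decimal digit carries log2(10) ~ 3.32 bits of information.
digits = 500
qubits = math.ceil(digits * math.log2(10))
print(qubits)  # 1661
```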
An additional problem is that reliable error detection and correction are essential, because unavoidable decoherence must be assumed to introduce errors. Error-correcting schemes exist, but they can multiply the number of qubits required by an order of magnitude, at least. It's rather sobering to realize that to date (2015) the largest number factored by a quantum computer (excluding adiabatic systems) is just 21.
All these issues explain why many people are still skeptical that useful quantum computing will be possible in the foreseeable future, if ever. Brown's book does not spend much time on this skepticism, since 15 years ago it wasn't clear just how difficult the technical problems would turn out to be.
The book does, however, go into considerable technical detail about the theoretical aspects, which are fairly well understood. Even so, the architecture of a quantum computer is nothing like that of a traditional von Neumann machine. Partly for that reason, there are still relatively few general quantum algorithms; programming a quantum computer is rather tricky. This raises the question of whether important applications will actually run much faster on a quantum computer. Shor's algorithm offers an exponential speed-up, but many other algorithms, such as Grover's, gain far less. (Grover's algorithm can search an unsorted database in time proportional to √N, where N is the number of entries. A non-quantum search of unsorted data takes time proportional to N; sorting the data first would allow faster lookups, but the sort itself takes on the order of N·log(N) operations.)
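To make that difference concrete, here is a rough sketch of my own (not from the book) comparing query counts. It assumes the standard estimate that Grover's algorithm needs about (π/4)·√N oracle queries, versus up to N lookups for a classical search of unsorted data:

```python
import math

def classical_queries(n):
    # Unsorted classical search: up to N lookups in the worst case.
    return n

def grover_queries(n):
    # Standard estimate for Grover's algorithm: about (pi/4) * sqrt(N) oracle queries.
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (10**3, 10**6, 10**9):
    print(f"N = {n:>13,}: classical ~{classical_queries(n):>13,}, Grover ~{grover_queries(n):>6,}")
```

The quadratic gap is real and grows with N, but it is nowhere near the exponential speed-up that Shor's algorithm promises for factoring.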
Other topics treated in the book include reversible computing (which drastically reduces the theoretical amount of energy required to run an algorithm), the history of quantum computing (which really got started with ideas presented by David Deutsch as early as 1977 and by Richard Feynman in 1981), computational complexity (e.g., the "P vs. NP" question), quantum communication, and quantum cryptography.
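Reversible computing is easier to grasp with a toy example. Here is a small sketch of my own (not taken from the book) using the Toffoli gate, which computes AND reversibly: no information is discarded, and applying the gate twice restores the original inputs, which is why, in principle, such operations need not dissipate energy.

```python
def toffoli(a, b, c):
    # Reversible AND: flip the target bit c exactly when a and b are both 1.
    # The inputs a and b pass through unchanged, so no information is lost.
    return a, b, c ^ (a & b)

state = (1, 1, 0)
once = toffoli(*state)   # (1, 1, 1): c now holds a AND b
twice = toffoli(*once)   # (1, 1, 0): applying the gate again undoes it
print(once, twice)
```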
You may be wondering how "minds" and the "multiverse" are relevant to this topic. The issue of minds arises because some people (notably Roger Penrose) have suggested that the human brain might be a quantum computer. (Most neurobiologists doubt that idea.) The "multiverse" refers to Hugh Everett's "many-worlds" interpretation of quantum mechanics, of which David Deutsch is an especially ardent supporter. He thinks the reason a quantum computer may be very much faster than a conventional one is that it can carry out computations in a highly parallel way across multiple "parallel" universes. It's an interesting idea, but it isn't essential for believing that quantum computers can be much faster. A more prosaic explanation is that when large numbers of qubits are "entangled", operations on all of them can be carried out simultaneously.
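Whichever interpretation one prefers, the arithmetic behind that intuition is easy to demonstrate: the state of n qubits is described by 2^n amplitudes, so a single operation acts on an exponentially large description. A small numpy sketch of my own (not from the book), putting n qubits into an equal superposition with Hadamard gates:

```python
import numpy as np

# Single-qubit Hadamard gate: takes |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def uniform_superposition(n_qubits):
    """State vector after applying H to each qubit of |00...0>."""
    state = np.array([1.0])                 # start with the empty tensor product
    one_qubit = H @ np.array([1.0, 0.0])    # H|0> = (|0> + |1>) / sqrt(2)
    for _ in range(n_qubits):
        state = np.kron(state, one_qubit)   # add one more qubit to the register
    return state

for n in (1, 2, 3, 10, 20):
    state = uniform_superposition(n)
    print(f"{n:>2} qubits -> state vector with {len(state):>9,} amplitudes")
```

Simulating even a modest register this way quickly becomes infeasible on a classical machine, and that exponentially large state is exactly the resource entanglement lets a quantum computer exploit.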
If you've found that some of the technical terms in this review are unfamiliar, take that as a sign that this book is not an easy-going introduction to the subject. All of these concepts are clearly explained in the book – but some effort by the reader is still required. In addition, a good understanding requires some mathematical prerequisites, such as familiarity with complex numbers and a little elementary number theory. (But at least one doesn't need to know how to solve Schrödinger's equation.) It also helps a lot if a reader can follow logic diagrams and knows a bit about quantum concepts like entanglement and superposition.
So this book isn't a painless introduction to the subject. But it is still one of the few books accessible to non-specialists that actually gives the reader a good idea of what quantum computing is all about.