Quantum computing has become a hobby interest of mine. The idea of quantum computers was first introduced decades ago, but until recently they were entirely theoretical. Today, the public's understanding of quantum computers largely reduces to "magic," which is how quantum computers are typically treated in science fiction. It's assumed that something "quantum" happens at the subatomic level and presto, quantum computers can perform massive computations faster than any classical computer, communicate faster than light, and do all sorts of other ridiculous things.
Understanding how and why this is not the case, and what quantum computers actually can do, requires a lot of study. It also requires a lot of math.
I picked up Dancing with Qubits because it looked like a fairly heavy but approachable layman's introduction to quantum computing. It is that, but it's very heavy, and dives deeply into the math, with many chapters on sets and fields and rings, number theory, logic, vector spaces, and matrix operations before you even get into the quantum. This book is clearly intended to be used as a textbook for a class and is probably a lot more digestible in that environment. Reading it entirely as independent study, as I did, was a pretty hard slog.
My prior experience was playing around a little bit with the open source Qiskit API, and with IBM's Cloud Quantum Computing platform, which allows people to build quantum circuits and run them (for free!) on one of IBM's actual quantum computers.
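For flavor, here's roughly what a "hello world" circuit looks like in Qiskit: a Bell state, the standard two-qubit entanglement demo. (This is my own minimal sketch, not from the book; it assumes the qiskit and qiskit-aer packages, and the exact import paths have shifted between Qiskit versions.)

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Build a two-qubit Bell-state circuit: superposition plus entanglement
qc = QuantumCircuit(2, 2)
qc.h(0)                      # Hadamard puts qubit 0 into superposition
qc.cx(0, 1)                  # CNOT entangles qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # measure both qubits into classical bits

# Run on a local simulator; on IBM's cloud you'd target real hardware
sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)  # roughly half '00' and half '11' -- never '01' or '10'
```

On IBM's platform you point the same circuit at one of their hardware backends instead of the local simulator.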
My background is in software engineering, with just the minimal amount of math necessary for a CS degree: a few semesters of calculus and linear algebra, and some basic statistics and probability theory. This is barely enough to do a deep dive into the math behind quantum computing, and I found myself buying refresher linear algebra and calculus workbooks to keep up.
So I will freely admit that I did not actually work through all the problems, and at some points I skimmed through the math and the algorithmic proofs.
Past the math, this is a thorough introduction to quantum computing: you start with simple one-qubit operations, work your way up to Hadamard and Toffoli gates, then through ever-more-complex circuits, until you reach the chapter that explains Shor's Algorithm in detail.
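To give a sense of the level the book works at: a one-qubit gate is just a small matrix, and applying it is matrix-vector multiplication. The Hadamard gate, for example, is

```latex
H = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix},
\qquad
H\lvert 0\rangle = \frac{\lvert 0\rangle + \lvert 1\rangle}{\sqrt{2}},
```

which turns a qubit in the definite state |0⟩ into an equal superposition of |0⟩ and |1⟩. The Toffoli gate is the same idea scaled up to an 8×8 matrix acting on three qubits.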
Shor's Algorithm, if you know anything about quantum computing, is the famous algorithm that will someday break public-key cryptography and thereby much of our information security infrastructure. To simplify greatly, a lot of modern cryptography (including the "https" connections you rely on to do secure banking with your phone app) relies on the computational difficulty of factoring very, very large numbers: the products of two enormous primes. Even supercomputers can't do it in less than centuries, for sufficiently large numbers. Shor's Algorithm is a quantum algorithm that has been mathematically proven capable of factoring such numbers much, much faster than any classical computer can. So in theory, someday a quantum computer will be able to hack all banks everywhere and take over the Internet.
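The number theory at the heart of it is surprisingly compact, and a toy version fits in a few lines. Shor's insight reduces factoring N to finding the period of a^x mod N; the quantum hardware's only job is finding that period quickly. (This is my own sketch, not the book's presentation; for tiny N you can brute-force the period classically and watch the reduction work.)

```python
from math import gcd

# Toy illustration of the reduction behind Shor's Algorithm.
# The quantum speedup is entirely in finding the period r;
# here we brute-force it, which only works for tiny N.
N = 15   # the number to factor (a product of two primes)
a = 7    # a base chosen with gcd(a, N) == 1

# Find the period: the smallest r > 0 with a**r == 1 (mod N)
r = 1
while pow(a, r, N) != 1:
    r += 1

# When r is even, gcd(a**(r//2) +/- 1, N) yields nontrivial factors
if r % 2 == 0:
    x = pow(a, r // 2, N)
    print(gcd(x - 1, N), gcd(x + 1, N))  # prints: 3 5
```

The quantum Fourier transform is what does for enormous N what the while loop above never could.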
So why isn't everyone panicking? It's complicated, and fully understanding it requires a book like, well, this one. Which requires kind of understanding the math. But it boils down to this, and these were some of the most useful chapters: what quantum computers can do in theory is still quite a long way from what any real-world quantum computers can do in practice. For example, IBM and Microsoft and China are bragging every day about how they have built a 10-qubit, 20-qubit, or 60-qubit quantum computer, and expect to have a 100-qubit quantum computer by 2024, etc. The catch is that those are physical qubits, and physical qubits are very fragile. They have to be generated using cryogenic refrigeration or lasers or nuclear magnetic resonance - not anything that you'll be able to put on your desktop this decade. And right now we need hundreds of physical qubits, with error correction, to reliably produce one usable logical qubit. Performing a useful quantum operation (such as, for example, using Shor's Algorithm to break public-key cryptography) will require hundreds or thousands of logical qubits.
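To make that gap concrete with the rough numbers above (the specific constants here are illustrative assumptions, not measured figures):

```python
# Back-of-envelope with illustrative numbers (assumptions, not data!):
physical_per_logical = 500   # "hundreds" of physical qubits per logical qubit
logical_needed = 1000        # "hundreds or thousands" of logical qubits for Shor
print(physical_per_logical * logical_needed)  # -> 500000 physical qubits
# ...versus the tens of physical qubits in today's headline machines.
```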
So, it will happen eventually, but it's not going to happen tomorrow. And people are already working on post-quantum cryptography: encryption schemes designed to withstand quantum attacks. It's also important to note that the nature of quantum computers is such that they are never going to "replace" classical computers. It would be enormously inefficient to use a quantum computer as a simple calculator, for example; even though you theoretically could, most computing will always be cheaper and faster to do on classical computers. Quantum computing, when it "arrives," will be a hybrid model in which classical computers do their thing, occasionally hand off certain operations to quantum co-processors, and process the results.
None of this is quite as exciting as some quantum news sounds, is it? The bottom line is that although many companies and countries are investing billions in quantum research, there does not exist anywhere in the world today (2022) a quantum computer that can actually do anything useful and practical. Investors are starry-eyed about cryptography, pharmacology (quantum algorithms will eventually be able to model and test new drug molecules much faster than we currently can), information retrieval, and certain other specialty fields, but no quantum computer currently on the market actually does anything but toy problems.
That said, don't bet against a field where Microsoft, IBM, and the People's Republic of China are all investing billions of dollars. Quantum computing will be "real" someday, and it will make a major impact. Robert Sutor's book provides a grounding in the math and theory behind it, but not much in the way of hands-on practice. He talks about "quantum volume" as a somewhat ill-defined but more useful way to measure how powerful a quantum computer actually is: a single benchmark that grows with both how many qubits a machine has and how deep a circuit it can run before noise swamps the result. He points you to some popular software packages and quantum simulators, but actually experimenting with quantum circuits is a next step you'll have to take on your own. From a software engineering perspective, I've found you can, to a certain extent, do this already without knowing the deep background.
I recommend this book if you really want to know the math and theory behind quantum computing and want a good high-level introduction to the technology as it exists today. If you want to get into programming quantum algorithms with more foundational knowledge than just how to use a Python API, it's a great starter. But if you don't want to dive into linear algebra, complex numbers, and matrix operations, or if, going the other direction, what you're really after is the quantum physics, this is probably not the book to start with.