From the book's own description: "The objective of this book is to break down an extremely complex topic, neural networks, into small pieces, consumable by anyone wishing to embark on this journey. Beyond breaking down this topic, the hope is to dramatically demystify neural networks. As you will soon see, this subject, when explored from scratch, can be an educational and engaging experience. This book is for anyone willing to put in the time to sit down and work through it. In return, you will gain a far deeper understanding than most when it comes to neural networks and deep learning. This book will be easier to understand if you already have an understanding of Python or another programming language. Python is one of the most clear and understandable programming languages; we have no real interest in padding page counts and exhausting an…"
I think it accomplishes its goal of getting across a deeper understanding of what's happening under the hood of a neural network by tackling everything in raw Python before moving on to NumPy.
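To give a flavour of that progression, here is a rough sketch of a single dense-layer forward pass, first in plain Python and then with NumPy; the numbers are made up for illustration and this is not the book's exact code.

```python
# One sample with four features feeding a layer of three neurons.
# All values here are illustrative.
inputs = [1.0, 2.0, 3.0, 2.5]
weights = [[0.2, 0.8, -0.5, 1.0],
           [0.5, -0.91, 0.26, -0.5],
           [-0.26, -0.27, 0.17, 0.87]]
biases = [2.0, 3.0, 0.5]

# Raw Python: multiply and sum by hand for each neuron, then add its bias.
layer_outputs = []
for neuron_weights, neuron_bias in zip(weights, biases):
    output = sum(w * x for w, x in zip(neuron_weights, inputs)) + neuron_bias
    layer_outputs.append(output)
print(layer_outputs)

# NumPy: the same computation collapses into one matrix-vector product.
import numpy as np
print(np.dot(np.array(weights), np.array(inputs)) + np.array(biases))
```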
It never touches PyTorch, TensorFlow, etc., instead building an intuition for each puzzle piece before you reach for those libraries later.
Like learning basic car repair before committing to a Sydney-to-Perth drive: if something goes wrong along the way, you may not have to sit and wait for someone else to fix it.
The book starts by fitting basic nonlinear functions, then trains a classifier on a spiral dataset (shown here), then fits a sine wave, and finally classifies clothing images from the fashion_mnist dataset. The graphic representations were a big help.
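The spiral dataset is simple enough to generate in a few lines of NumPy; this is a sketch of the general idea (noisy, interleaved spiral arms, one arm per class), not necessarily the book's exact generator.

```python
import numpy as np

def spiral_data(samples, classes):
    """Toy 2-D spiral classification set: `samples` points per class."""
    X = np.zeros((samples * classes, 2))
    y = np.zeros(samples * classes, dtype=np.uint8)
    for class_number in range(classes):
        ix = range(samples * class_number, samples * (class_number + 1))
        r = np.linspace(0.0, 1.0, samples)                          # radius grows outward
        t = np.linspace(class_number * 4, (class_number + 1) * 4, samples)
        t += np.random.randn(samples) * 0.2                         # angular noise
        X[ix] = np.c_[r * np.sin(t * 2.5), r * np.cos(t * 2.5)]
        y[ix] = class_number                                        # class label
    return X, y

X, y = spiral_data(samples=100, classes=3)   # 300 points on 3 interleaved arms
```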
Having never touched vector calculus, I found the partial derivatives the most challenging part. If there's one thing I'd add to this behemoth, it's a set of exercises to practice on, though I'm sure I can find those online.
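One cheap way to practice the partial-derivative material is to invent a small function, differentiate it by hand, and check the result numerically with a central difference; a minimal sketch, using a made-up function:

```python
# f(x, y) = x**2 * y + y, so df/dx = 2*x*y (treating y as a constant).
def f(x, y):
    return x ** 2 * y + y

def analytical_dfdx(x, y):
    return 2 * x * y

def numerical_dfdx(x, y, h=1e-5):
    # Central-difference approximation of the partial derivative w.r.t. x.
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

x, y = 3.0, 2.0
print(analytical_dfdx(x, y))   # 12.0
print(numerical_dfdx(x, y))    # approximately 12.0
```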
I enjoyed writing the code from scratch and building on it in every chapter. I will definitely come back to read this every year and see what improvements I can make.