Chapter 1: Machine Learning Basics and Mathematical Foundation for Deep Learning
Introduces machine learning basics and the mathematical foundations associated with deep learning.
Pages: 70-90
Sub-topics:
1. Linear algebra basics.
2. Numerical stability and conditioning.
3. Probability.
4. Different types of cost functions, and an introduction to least squares and maximum likelihood methods.
5. Convex and non-convex functions.
6. Optimization techniques such as Gradient Descent and Stochastic Gradient Descent, as well as constrained optimization problems.
7. Regularization and early stopping.
8. Auto differentiators and symbolic differentiators.

Chapter 2: Introduction to Deep Learning Concepts and TensorFlow
Introduces deep learning concepts and their comparison with earlier neural networks, the reasons for deep learning's success and computational efficiency, and a start to TensorFlow development.
Pages: 60-70
Sub-topics:
1. Earlier neural networks and their shortcomings.
2. Introduction to the deep learning framework and its advantages.
3. Why TensorFlow for deep learning, and its comparison with other deep learning frameworks such as Theano, Caffe, Torch, etc.
4. Hands-on work in the TensorFlow development environment, and an introduction to dynamic computation graphs.
5. Linear and logistic regression in a TensorFlow environment.
6. Feed-forward networks through TensorFlow.
7. Leveraging GPUs for computational efficiency.

Chapter 3: Image and Audio Processing in TensorFlow through Convolutional Neural Networks
Learn to process image and audio data to solve classification, clustering, and recommendation problems using convolutional neural networks.
Pages: 70-80
Sub-topics:
1. Convolution and image processing through convolution.
2. Different kinds of image-processing filters, such as the Gaussian filter, the Sobel filter, and Canny's edge detection filter.
3. The different layers of a convolutional neural network: convolution layers, pooling layers, activation layers using ReLUs, dropout layers, and the fully connected layer. Intuition for the features learned in different layers. Concepts of strides, padding, and kernels.
4. Solving image classification, clustering, and recommendation problems through convolutional neural networks.
5. Feature transfer in convolutional neural networks.
6. Audio classification problems through convolutional neural networks.

Chapter 4: Restricted Boltzmann Deep Learning Architectures through TensorFlow for Various Problems
Leverage restricted Boltzmann machines (RBMs) for solving recommendation problems, for weight initialization in deep learning networks, and for layer-by-layer training of deep neural networks.
Sub-topics:
1. Introduction to restricted Boltzmann machines (RBMs) and their architecture.
2. Using RBMs to build recommendation engines.
3. RBMs for smart weight initialization of deep learning networks.
4. Training complex deep learning networks layer by layer (one layer at a time) through RBMs.
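As a flavor of the 2D convolution operation at the heart of Chapter 3, here is a minimal NumPy sketch (my own illustration, not code from the book; the toy image and the Sobel-style kernel are chosen only to show an edge response):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 'valid' 2D convolution: slide the flipped kernel over the image."""
    kh, kw = kernel.shape
    h, w = image.shape
    flipped = kernel[::-1, ::-1]  # true convolution flips the kernel
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * flipped)
    return out

# A vertical-edge image convolved with a Sobel-style kernel
# responds strongly along the edge.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1]], dtype=float)
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
response = conv2d(img, sobel_x)
```

A CNN's convolution layer applies many such kernels, but learns their values from data instead of fixing them, as with the hand-designed Gaussian or Sobel filters above.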
Chapter 5: Deep Learning for Natural Language Processing through TensorFlow
Leverage TensorFlow's deep learning capabilities for natural language processing.
Pages: 50-60
Sub-topics:
1. Text-processing basics such as the Word2Vec representation, and semantic and syntactic analysis.
2. Recurrent neural networks (RNNs) for language modeling through TensorFlow.
3. Backpropagation through time, and the problems of vanishing and exploding gradients.
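The vanishing/exploding gradient problem from Chapter 5 arises because backpropagation through time multiplies the gradient by the recurrent weight (times an activation derivative) at every step, so the product shrinks or grows geometrically. A toy sketch, assuming a constant stand-in value of 0.25 for the tanh derivative (my own illustration, not the book's):

```python
def gradient_norm_through_time(w, steps, deriv=0.25):
    """Track |gradient| as it is propagated back through `steps` time steps.

    Each step multiplies by w * deriv, a crude stand-in for the
    recurrent-weight-times-activation-derivative factor in BPTT.
    """
    g = 1.0
    norms = []
    for _ in range(steps):
        g *= w * deriv
        norms.append(abs(g))
    return norms

vanish = gradient_norm_through_time(w=2.0, steps=20)   # factor 0.5 per step
explode = gradient_norm_through_time(w=8.0, steps=20)  # factor 2.0 per step
```

After 20 steps the first gradient has shrunk by roughly a factor of a million while the second has grown by the same factor, which is why plain RNNs struggle with long dependencies and why LSTMs/GRUs (and gradient clipping) exist.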
A very good introductory book for deep learning beginners: the math is just right, there are plenty of intuitive explanations, and it tackles many fundamental concepts that might take months to understand if you studied purely from the original papers. I can assure you this is a high-quality book, though admittedly there are sometimes pages full of Python code, which may give a wrong impression. Note that the book was written in 2017, and three years is a big gap given the pace at which the community grows, so do not expect off-the-shelf coverage of recent models here, say, BERT or GAN variations.
Here are selected topics that I think may benefit readers:
- DL-relevant topics in linear algebra (eigenvectors), calculus (gradient computation, closed-form solutions for the maxima/minima of functions, Taylor series), and probability (distributions, MLE).
- Conventional optimization techniques: an excellent section on Gradient Descent and its variants/characteristics; on par with it is the treatment of how to define an objective function with inequality constraints.
- Formulating and deriving the solutions for PCA and for linear regression with regularization.
- The perceptron, and the details of applying backpropagation to a neural network. [Important]: the characteristics of cost functions in high-dimensional space, used to navigate model updates effectively.
- Convolutional neural networks: the 2D convolution operation, image-processing filters, translation equivariance, translation invariance, Dropout, BatchNorm, transfer learning, and architectures from LeNet up to Inception V3.
- Recurrent neural networks: vector representations of words (CBOW, Skip-Gram, GloVe), the vanishing and exploding gradient problem, RNNs, LSTMs, GRUs.
- Another excellent section on restricted Boltzmann machines, with a nice touch on MCMC methods. Other unsupervised methods covered are autoencoders, PCA, and ZCA whitening.
- An introduction to challenging CV tasks: image segmentation, image classification and localization, object detection, and generative adversarial networks.
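The gradient descent section mentioned above ultimately boils down to a few lines of code. A minimal sketch on a toy quadratic (my own example, not taken from the book; the learning rate and loss function are illustrative):

```python
# Plain gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Repeatedly step against the gradient of the objective."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move downhill
    return w

# grad f(w) = 2 * (w - 3); start far from the optimum.
w_star = gradient_descent(grad=lambda w: 2 * (w - 3), w0=0.0)
```

Stochastic gradient descent, covered in the same chapter, replaces the exact gradient with a noisy estimate computed on a mini-batch, which is what makes training on large datasets feasible.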