Khaula Nauman’s Reviews > Neural Networks and Deep Learning > Status Update

Khaula Nauman
Khaula Nauman is on page 50 of 224
The lesson to take away from this is that debugging a neural network is not trivial, and, just as for ordinary programming, there is an art to it. You need to learn that art of debugging in order to get good results from neural networks. More generally, we need to develop heuristics for choosing good hyper-parameters and a good architecture.
Sep 19, 2025 06:07AM
Neural Networks and Deep Learning


Khaula’s Previous Updates

Khaula Nauman
Khaula Nauman is on page 50 of 224
"In the early days of AI research people hoped that the effort to build an AI would also help us understand the principles behind intelligence and, maybe, the functioning of the human brain. But perhaps the outcome will be that we end up understanding neither the brain nor how artificial intelligence works!"
this cracked me up
Sep 19, 2025 06:09AM
Neural Networks and Deep Learning


Khaula Nauman
Khaula Nauman is on page 50 of 224
Usually, when programming we believe that solving a complicated problem like recognizing the MNIST digits requires a sophisticated algorithm. But even the neural networks in the Wan et al paper just mentioned involve quite simple algorithms, variations on the algorithm we've seen in this chapter.
Sep 19, 2025 06:08AM
Neural Networks and Deep Learning


Khaula Nauman
Khaula Nauman is on page 32 of 224
i now know what
- a neural network is
- perceptron neurons are
- sigmoid neurons are
- a sigmoid function is and why we use sigmoid
- feedforward and recurrent neural networks are
- a cost function is (i think)
- how gradient descent helps us minimize our cost function and get closer to the correct output
- stochastic gradient descent is and what on-line learning is
Sep 06, 2025 12:51PM
Neural Networks and Deep Learning
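A tiny sketch of the ideas in that list, assuming nothing from the book's own code: a sigmoid function and a mini-batch stochastic-gradient-descent step for a single sigmoid neuron on made-up toy data (the neuron, data, and learning rate here are illustrative choices, not from the book).

```python
import math
import random

def sigmoid(z):
    # squashes any real z into (0, 1); small changes in input give
    # small changes in output, which is why sigmoid neurons are
    # easier to train than step-function perceptrons
    return 1.0 / (1.0 + math.exp(-z))

def sgd_step(w, b, batch, eta):
    # one stochastic-gradient-descent step for a single-input neuron
    # with quadratic cost C = 1/(2n) * sum (sigmoid(w*x + b) - y)^2
    n = len(batch)
    gw = gb = 0.0
    for x, y in batch:
        a = sigmoid(w * x + b)
        delta = (a - y) * a * (1 - a)  # dC/dz for quadratic cost
        gw += delta * x
        gb += delta
    # move downhill along the averaged gradient
    return w - eta * gw / n, b - eta * gb / n

# toy data: the neuron should output 1 for positive x, 0 for negative x
random.seed(0)
data = [(x, 1.0 if x > 0 else 0.0)
        for x in (random.uniform(-1, 1) for _ in range(100))]

w, b = 0.0, 0.0
for epoch in range(200):
    random.shuffle(data)              # "stochastic": random mini-batches
    for i in range(0, len(data), 10):
        w, b = sgd_step(w, b, data[i:i + 10], eta=3.0)
```

After training, `sigmoid(w * x + b)` should be above 0.5 for clearly positive `x` and below 0.5 for clearly negative `x`; estimating the gradient from small random mini-batches rather than the full dataset is exactly the trick that makes SGD (and its extreme case, on-line learning with batch size 1) cheap.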
