Introduces the course structure, learning objectives, and the prerequisites needed to get started.
A comprehensive list of credits for the sources that inspired and informed the course material. Useful for further reading.
Introduces the fundamental building blocks of neural networks, from individual neurons and activation functions to the layered architecture of feedforward networks.
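The building blocks named above can be sketched in a few lines. This is a minimal illustration, not the course's own code: it assumes NumPy, and the network shape and weights are arbitrary placeholders.

```python
import numpy as np

def sigmoid(z):
    """Squash a weighted input into the (0, 1) range."""
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(x, weights, biases):
    """Propagate input x layer by layer: a <- sigmoid(W @ a + b)."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# A tiny 2-3-1 network with randomly chosen (hypothetical) parameters.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]
out = feedforward(np.array([0.5, -0.2]), weights, biases)
# out is a single activation between 0 and 1.
```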
A hands-on lab to implement the basic structure of a neural network, preparing it to classify handwritten digits.
Explores how neural networks learn using cost functions, gradient descent, and the backpropagation algorithm.
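The core update rule behind this chapter, stepping opposite the gradient of a cost function, can be sketched as follows. This is an illustrative toy (a one-dimensional quadratic cost), not the digit-classifier training loop; `gradient_descent` and its arguments are names chosen here for clarity.

```python
import numpy as np

def gradient_descent(grad, w0, eta=0.1, steps=100):
    """Repeatedly apply w <- w - eta * grad(w)."""
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - eta * grad(w)
    return w

# Minimize C(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# Each step shrinks the distance to the minimum by a factor (1 - 2*eta).
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=[0.0])
# w_min converges toward 3.
```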
This lab applies the concepts of backpropagation and stochastic gradient descent to train the digit classifier network.
Explores the intuition behind the training algorithms and discusses what a neural network actually learns, from simple features to complex patterns.
This chapter compares activation functions, cost functions, and gradient descent variants to clarify how each choice affects performance.
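One comparison from this chapter can be sketched numerically: the derivatives of sigmoid and ReLU behave very differently for large inputs, which is one reason the choice of activation affects training speed. A minimal sketch assuming NumPy; the function names are just descriptive labels:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of sigmoid: s * (1 - s); near 0 for large |z| (saturation).
    s = sigmoid(z)
    return s * (1 - s)

def relu(z):
    return np.maximum(0.0, z)

def relu_prime(z):
    # Derivative of ReLU: exactly 1 for positive z, so gradients don't shrink.
    return (z > 0).astype(float)

z = np.array([-10.0, 0.0, 10.0])
# sigmoid_prime(z) peaks at z = 0 (value 0.25) and vanishes at the extremes;
# relu_prime(z) stays at 1.0 for every positive input.
```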
Covers how the quality and quantity of training data and proper weight initialization influence learning, and introduces regularization techniques that combat overfitting and improve generalization.
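The most common regularization technique covered here, L2 weight decay, amounts to one extra factor in the update rule. A minimal sketch assuming NumPy; the function name and parameters (`lam` for the regularization strength, `n` for the training-set size) are labels chosen for this illustration:

```python
import numpy as np

def regularized_update(W, grad_W, eta, lam, n):
    """One gradient step with L2 regularization (weight decay):
    W <- (1 - eta * lam / n) * W - eta * grad_W.
    The (1 - eta * lam / n) factor shrinks every weight toward zero,
    penalizing the large weights associated with overfitting."""
    return (1.0 - eta * lam / n) * W - eta * grad_W

# With a zero gradient, the update purely decays the weights:
W = np.array([1.0, -2.0])
W_new = regularized_update(W, np.zeros(2), eta=0.5, lam=0.1, n=10)
# Each weight is multiplied by (1 - 0.5 * 0.1 / 10) = 0.995.
```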
Discusses often-overlooked parts of setting up a network, such as hyperparameter selection and practical engineering concerns, and addresses the trickiness of debugging a neural network.
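Hyperparameter selection is often done with a simple exhaustive sweep. This sketch shows the idea with a stand-in scoring function; `grid_search` and `train_and_score` are hypothetical names, and in the course's labs the score would come from training the actual network on a validation set.

```python
import itertools

def grid_search(train_and_score, etas, batch_sizes):
    """Try every (eta, batch_size) pair and keep the best-scoring one."""
    best = None
    for eta, bs in itertools.product(etas, batch_sizes):
        score = train_and_score(eta, bs)
        if best is None or score > best[0]:
            best = (score, eta, bs)
    return best

# Usage with a toy scoring function whose maximum is at eta=0.3, bs=32:
best = grid_search(
    lambda eta, bs: -(eta - 0.3) ** 2 - (bs - 32) ** 2 / 1000,
    etas=[0.1, 0.3, 1.0],
    batch_sizes=[16, 32, 64],
)
# best holds (score, eta, batch_size) for the winning combination.
```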