
Code Forward Propagation

Explore the process of forward propagation in neural networks by coding classification functions. Understand how inputs flow through layers using weighted sums and activation functions such as sigmoid and softmax. This lesson lays the groundwork for a working neural network that processes real data.
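As a quick primer, here is a minimal NumPy sketch of the two activation functions mentioned above. The function names and the subtract-the-max trick in softmax are implementation choices made for this sketch, not something fixed by the lesson itself.

```python
import numpy as np

def sigmoid(z):
    # Squash each weighted sum into the (0, 1) range, element-wise.
    return 1 / (1 + np.exp(-z))

def softmax(logits):
    # Turn each row of weighted sums into a probability distribution.
    # Subtracting the row-wise maximum is a common numerical-stability trick.
    exponentials = np.exp(logits - np.max(logits, axis=1, keepdims=True))
    return exponentials / np.sum(exponentials, axis=1, keepdims=True)
```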


Recap

In the previous chapter, we made a plan for building a neural network that classifies MNIST images. We started by concatenating two perceptrons, and we jotted down the number of rows and columns for all the matrices involved. It resulted in the following diagram:

Keep the sketch handy, because we are about to convert it to code. Coding will take two chapters. In this chapter, we’ll write the neural network’s classification functions—in other words, all the code in the network, except for the training code. In the next chapter, we’ll complete the network and run it on an unsuspecting MNIST dataset.
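To make that plan concrete, here is one possible sketch of the classification functions, assuming NumPy, a bias column prepended to each layer's input, and the usual MNIST shapes (784 pixel inputs, 10 output classes). The names `prepend_bias`, `forward`, and `classify`, and the weight-matrix layout, are illustrative assumptions, not the definitive implementation from the course.

```python
import numpy as np

def prepend_bias(X):
    # Add a column of 1s so each layer's bias can live inside its weight matrix.
    return np.insert(X, 0, 1, axis=1)

def forward(X, w1, w2):
    # First "perceptron": weighted sum of the inputs, squashed by the sigmoid.
    h = sigmoid(np.matmul(prepend_bias(X), w1))
    # Second "perceptron": weighted sum of the hidden values, converted to
    # per-class probabilities by the softmax.
    return softmax(np.matmul(prepend_bias(h), w2))

def classify(X, w1, w2):
    # The predicted class is the index of the highest probability in each row.
    y_hat = forward(X, w1, w2)
    return np.argmax(y_hat, axis=1).reshape(-1, 1)
```

With this layout, `w1` has one row per input pixel plus one for the bias, and `w2` has one row per hidden node plus one for the bias and ten columns, one per digit class.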

Let’s get started with the classification functions. For reference, let’s ...