Introduction: Teaching Networks to Generate Digits
Get an overview of the topics that will be covered in this chapter.
In this chapter, our first project will recreate one of the most groundbreaking models in the history of deep learning: the deep belief network (DBN). The DBN was one of the first multi-layer networks for which a feasible learning algorithm was developed. Besides being of historical interest, this model is connected to the topic of this course because its learning algorithm uses a generative model to pretrain the neural network weights into a reasonable configuration prior to backpropagation.
In this chapter, we’ll cover:
- How to load the Modified National Institute of Standards and Technology (MNIST) dataset and transform it using TensorFlow 2's Dataset API.
- How a Restricted Boltzmann Machine (RBM)—a simple neural network—is trained to generate images by minimizing an "energy" equation that resembles formulas from physics.
- How to stack several RBMs to make a DBN and apply forward and backward passes to pretrain this network to generate image data.
- How to implement an end-to-end classifier by combining this pretraining with backpropagation "fine-tuning" using the TensorFlow 2 API.
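To give a feel for the "energy" equation mentioned above before we dive into the details, here is a minimal NumPy sketch of the standard RBM energy function, E(v, h) = −b·v − c·h − vᵀWh, where v and h are binary visible and hidden unit vectors, W is the weight matrix, and b and c are the visible and hidden biases. The function name and toy values below are illustrative only, not taken from the course code:

```python
import numpy as np

def rbm_energy(v, h, W, b, c):
    """Energy of a joint (visible, hidden) configuration of an RBM:
    E(v, h) = -b.v - c.h - v^T W h
    Lower energy corresponds to higher probability under the model."""
    return -(b @ v) - (c @ h) - (v @ W @ h)

# Toy example: 2 visible units, 1 hidden unit (values chosen arbitrarily).
v = np.array([1.0, 0.0])        # visible configuration
h = np.array([1.0])             # hidden configuration
W = np.array([[0.5], [-0.25]])  # visible-to-hidden weights
b = np.array([0.1, 0.2])        # visible biases
c = np.array([0.3])             # hidden biases

print(rbm_energy(v, h, W, b, c))  # → -0.9
```

Training an RBM adjusts W, b, and c so that configurations resembling the training images (here, MNIST digits) are assigned low energy; we will see how this is done in the sections that follow.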