Model Implementation: Text-to-Image GAN Synthesis

Discover the generator, the discriminator, and the wrapper modules built from convolutional layers with batch normalization and ReLU activation.

Our discriminator and generator implementations are inspired by the authors’ torch implementation.

Let’s start by implementing the wrappers. We assume that the necessary imports are already in place.

Wrapper

We add a wrapper that combines a 2D convolution with batch normalization and an optional ReLU. This sequence of layers appears throughout the model, so using the wrapper makes the code more compact and easier to read:
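A minimal sketch of such a wrapper in PyTorch might look like the following. The class name `ConvBnRelu` and the default hyperparameters (kernel size 4, stride 2, padding 1, common in DCGAN-style discriminators) are assumptions for illustration, not the course's exact implementation:

```python
import torch
import torch.nn as nn

class ConvBnRelu(nn.Module):
    """2D convolution followed by batch normalization and an optional ReLU.

    Defaults (kernel 4, stride 2, padding 1) halve the spatial size,
    a common choice in DCGAN-style discriminators; these values are
    illustrative assumptions, not the course's exact settings.
    """

    def __init__(self, in_channels, out_channels, kernel_size=4,
                 stride=2, padding=1, relu=True):
        super().__init__()
        layers = [
            # bias=False: batch norm's learned shift makes the conv bias redundant
            nn.Conv2d(in_channels, out_channels, kernel_size,
                      stride=stride, padding=padding, bias=False),
            nn.BatchNorm2d(out_channels),
        ]
        if relu:
            layers.append(nn.ReLU(inplace=True))
        self.block = nn.Sequential(*layers)

    def forward(self, x):
        return self.block(x)
```

With these defaults, a 64x64 input is downsampled to 32x32 while the channel count changes from `in_channels` to `out_channels`, so stacking several of these blocks yields the usual shrinking feature-map pyramid of a convolutional discriminator.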
