Understanding Parameter Sharing, Weak Filters, and Equivariance

Learn about convolutional networks, exploring parameter sharing, weak filters, and translation equivariance.

A convolution sweeps the entire input with a filter that is smaller than the input. Because the same filter weights are reused at every position of the sweep, a convolutional layer in a deep learning network has the property of parameter sharing.
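The following minimal NumPy sketch (illustrative only, not a library API) shows the sweep: the same 3x3 filter weights are applied at every position of a toy input, which is what parameter sharing means in practice.

```python
import numpy as np

image = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 input
weights = np.ones((3, 3)) / 9.0                    # one shared set of 9 parameters

out = np.zeros((4, 4))                             # valid output: (6-3+1) x (6-3+1)
for i in range(4):
    for j in range(4):
        patch = image[i:i + 3, j:j + 3]
        out[i, j] = np.sum(patch * weights)        # same 9 weights at every location

print(out.shape)   # (4, 4): 16 outputs produced by only 9 shared weights
```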

Moreover, because the filters are small, they are called weak filters; each output connects to only a small patch of the input, which enables sparse interaction. Lastly, the sweep also preserves where a filter's pattern occurs in the input, which gives convolution its translation-equivariance property.
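Translation equivariance can be checked directly. The sketch below (assuming NumPy and SciPy are available) shifts a small pattern within an image and confirms that the convolution output shifts by the same amount, boundary rows and columns aside.

```python
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)
image = np.zeros((8, 8))
image[1:4, 1:4] = rng.random((3, 3))      # a small pattern near the top-left corner
kernel = rng.random((3, 3))               # a 3x3 "weak" filter

shifted = np.roll(image, shift=(2, 2), axis=(0, 1))   # move the pattern down and right

out = correlate2d(image, kernel, mode='valid')
out_shifted = correlate2d(shifted, kernel, mode='valid')

# The response to the shifted image equals the shifted response to the original,
# which is exactly translation equivariance.
print(np.allclose(out_shifted[2:, 2:], out[:-2, :-2]))   # True
```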

This section explains the benefits, downsides, and intentions behind these convolution properties.

Parameter sharing

The property of parameter sharing is best understood by contrasting a dense layer with a convolutional layer.

Suppose a dense layer were used for the image detection problem in the previous lesson. The dense layer would learn a filter of the same shape and size as the input image containing the letter 2, as shown in the illustration below.
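The contrast shows up directly in parameter counts. The sketch below (assuming TensorFlow/Keras and a hypothetical 28x28 grayscale input) compares a single dense unit, which needs one weight per pixel, with a single 3x3 convolutional filter, which shares its few weights across the whole image.

```python
import tensorflow as tf

height, width = 28, 28  # assumed image size for illustration

# Dense: one weight per input pixel, plus a bias
dense = tf.keras.Sequential([
    tf.keras.Input(shape=(height, width, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1),
])

# Convolution: a single 3x3 filter shared across all positions, plus a bias
conv = tf.keras.Sequential([
    tf.keras.Input(shape=(height, width, 1)),
    tf.keras.layers.Conv2D(1, kernel_size=3),
])

print("Dense parameters:", dense.count_params())  # 28*28 + 1 = 785
print("Conv parameters: ", conv.count_params())   # 3*3*1 + 1 = 10
```

Because the convolutional filter is reused at every location, its parameter count stays the same no matter how large the input image grows, whereas the dense layer's count grows with the number of pixels.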
