Quiz on the Basics of CNNs and their Architectures

Test your knowledge of image classification basics and popular CNN architectures.

1

Which of the following statements are true? (Select all that apply.)

A)

If we apply dropout with a rate of 0.3 to a layer with 10 neurons, only 3 of them will be randomly selected to carry the signal and have their weights updated in each iteration.

B)

Residual blocks, first introduced in ResNet, are especially effective at mitigating the vanishing gradient problem.

C)

Each kernel has its own weights, whereas a filter has only one bias.

D)

While an exploding gradient is about having weights that are too small, a vanishing gradient is about having weights that are too large, and in both cases, we have an underfitting problem.
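If you'd like to check the dropout behavior referenced in option A for yourself, here is a minimal PyTorch sketch; the framework choice and the all-ones input are assumptions for illustration, while the layer size and rate mirror the option:

```python
import torch
import torch.nn as nn

# Hypothetical 10-neuron activation vector, all ones so zeroed entries are easy to spot.
activations = torch.ones(10)

# Dropout with rate 0.3: in training mode, each unit is zeroed independently with probability 0.3.
dropout = nn.Dropout(p=0.3)
dropout.train()

out = dropout(activations)
print(out)                      # surviving units are scaled by 1 / (1 - 0.3)
print((out == 0).sum().item())  # number of dropped units this call; it varies from run to run
```

Running the sketch a few times and counting the zeroed entries is one way to evaluate the claim in option A.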

