Challenge: Flax
Test your understanding of Flax with a coding challenge.
Challenge: MLP
In this challenge, you will build a fully-connected network (MLP) with the following layers:
- Dense, 2048 neurons (activation function: ReLU).
- Dense, 256 neurons (activation function: ReLU).
- Dense, 64 neurons (activation function: ELU).
- Dense, 10 neurons (activation function: Softmax).
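Below is a minimal sketch of one way to define such a network, assuming Flax's Linen API (`flax.linen`) and MNIST-style 28x28 inputs; the input shape and the dummy-data check are assumptions, not part of the challenge statement.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class MLP(nn.Module):
    """Fully-connected network matching the challenge's layer spec."""

    @nn.compact
    def __call__(self, x):
        x = x.reshape((x.shape[0], -1))     # flatten each example into a vector
        x = nn.relu(nn.Dense(2048)(x))      # Dense, 2048 neurons, ReLU
        x = nn.relu(nn.Dense(256)(x))       # Dense, 256 neurons, ReLU
        x = nn.elu(nn.Dense(64)(x))         # Dense, 64 neurons, ELU
        x = nn.softmax(nn.Dense(10)(x))     # Dense, 10 neurons, Softmax
        return x


# Quick shape check with dummy data (assumed 28x28 inputs, e.g. MNIST).
model = MLP()
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 28, 28)))
out = model.apply(params, jnp.ones((4, 28, 28)))
print(out.shape)  # (4, 10)
```

Note that in practice the final `softmax` is often omitted from the model and folded into the loss (e.g. via a cross-entropy on logits) for numerical stability; it is applied here only to match the challenge's layer list.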