Keras dense layer


Keras stands out as a well-known high-level deep-learning library, offering a user-friendly interface to construct and train neural networks effectively. One of Keras's most commonly used layers is the Dense layer, which implements a fully connected layer of neurons.

This Answer explores Dense layers, their syntax, and their parameters, and provides code examples.

What are dense layers?

Dense layers are fundamental building blocks in neural networks. They consist of a set of neurons, each connected to every neuron in the previous layer; the term "dense" refers to this full connectivity. Each neuron computes a weighted sum of its inputs, adds a bias, and passes the result through an activation function, as the short sketch below illustrates.

Layers in a neural network

Note: To learn more about Keras input layers, refer to this answer.
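To make the computation concrete, here is a minimal sketch (assuming TensorFlow 2.x and NumPy are available; the layer size and data below are made up purely for illustration) that reproduces a Dense layer's output by hand as activation(inputs · kernel + bias):

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# One sample with 4 features (arbitrary illustrative values)
x = np.random.rand(1, 4).astype("float32")

# A Dense layer with 3 units and ReLU activation
dense = layers.Dense(3, activation="relu")
y_keras = dense(x)  # the first call builds the layer's weights and applies it

# The same computation done by hand: ReLU(x @ kernel + bias)
kernel, bias = dense.get_weights()
y_manual = np.maximum(x @ kernel + bias, 0)

print(np.allclose(y_keras.numpy(), y_manual))  # expected to print True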

Syntax

Keras provides a simple way to create dense layers using the Dense class. Let's examine the syntax required to define a dense layer in Keras:

from tensorflow import keras
dense_layer = keras.layers.Dense(units, activation=None, use_bias=True, ...)

Parameters

The following are the most commonly used parameters in the Dense layer; a short sketch after the list shows several of them in use.

  • units: This parameter specifies the number of neurons in the layer. It is a required parameter and must be a positive integer.

  • activation: This parameter specifies the activation function to be applied to the layer's output. If None is specified, no activation is applied.

  • use_bias: This parameter specifies whether to include a bias vector (an additional learnable offset added to the layer's output) in the layer. The default value is True.

  • kernel_initializer: This parameter specifies the initialization method for the weight matrix (the learnable parameters that determine the strength of the connections between neurons). The default is 'glorot_uniform'.

  • bias_initializer: This parameter specifies the initialization method for the bias vector. The default is 'zeros'.

  • kernel_regularizer: This parameter specifies the regularization method (a penalty added to the loss function to help prevent overfitting) for the weight matrix.

  • bias_regularizer: This parameter specifies the regularization method for the bias vector.

  • activity_regularizer: This parameter specifies the regularization method for the output of the layer.

  • kernel_constraint: This parameter specifies the constraint applied to the weight matrix.

  • bias_constraint: This parameter specifies the constraint applied to the bias vector.
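For reference, the sketch below (assuming TensorFlow 2.x; the particular initializer, regularizer, and constraint values are arbitrary choices for illustration) combines several of these optional parameters in a single layer:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

layer = layers.Dense(
    units=16,
    activation='relu',
    use_bias=True,
    kernel_initializer='he_normal',
    bias_initializer='zeros',
    kernel_regularizer=keras.regularizers.l2(1e-4),
    kernel_constraint=keras.constraints.MaxNorm(3.0),
)

# Calling the layer on dummy data builds its weights
_ = layer(tf.random.normal((2, 8)))
print(layer.kernel.shape, layer.bias.shape)  # (8, 16) (16,)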

Example 1: Single dense layer

Here's a simple code example to build a single dense layer:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Create a single dense layer with 32 units and sigmoid activation
input_dimension = 5
dense_layer = layers.Dense(32, activation='sigmoid', input_shape=(input_dimension,))

# Test with random input data
input_data = tf.random.normal((1, input_dimension))
output = dense_layer(input_data)
print(output)

Code explanation

  • Line 1: Import the TensorFlow library as tf.

  • Line 2: Import the keras module from TensorFlow.

  • Line 3: Import the layers module from TensorFlow’s Keras API.

  • Line 6: Define the dimensionality of the input data as input_dimension = 5.

  • Line 7: Create a single dense layer with 32 units and the sigmoid activation using the Dense class from the layers module. The input_shape parameter specifies the shape of the input data.

  • Line 10: Generate random input data using TensorFlow’s random.normal function. The shape of the input data is (1, input_dimension).

  • Line 11: Pass the input data through the dense layer by calling the dense_layer object as a function. This applies the layer’s transformation to the input data.

  • Line 12: Print the dense layer output.
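As an optional check, continuing directly from the code above (so it assumes that snippet has already been run), the first call builds the layer's weights, whose shapes can then be inspected:

print(output.shape)              # (1, 32): one sample, 32 units
print(dense_layer.kernel.shape)  # (5, 32): input_dimension x units
print(dense_layer.bias.shape)    # (32,)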

Example 2: Multi-layer neural network

Here's a simple code example to build a multi-layer neural network:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

input_dimension = 5
model = keras.models.Sequential([
    layers.Dense(64, activation='relu', input_shape=(input_dimension,)),
    layers.Dense(10, activation='sigmoid'),
    layers.Dense(128, activation='softmax', use_bias=False)
])

# Test with random input data
input_data = tf.random.normal((1, input_dimension))
output = model(input_data)
print(output)

Code explanation

  • Line 1: Import the TensorFlow library as tf.

  • Line 2: Import the keras module from TensorFlow.

  • Line 3: Import the layers module from TensorFlow’s Keras API.

  • Line 5: Define the dimensionality of the input data as input_dimension = 5.

  • Line 6: The Sequential class from keras.models is used to create the sequential model.

  • Lines 7–10: Add three dense layers to the model:

    • The first dense layer has 64 units and the relu activation function.

    • The second dense layer has 10 units and the sigmoid activation function.

    • The third dense layer has 128 units and the softmax activation function. Additionally, it does not use a bias term.

  • Line 13: Generate random input data using TensorFlow’s random.normal function. The shape of the input data is (1, input_dimension).

  • Line 14: Pass the input data through the model by calling the model object as a function. This performs the forward pass of the model, generating the output predictions.

  • Line 15: Print the output of the model.
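As an optional follow-up, continuing from the code above, model.summary() shows how each Dense layer's parameter count follows directly from its input and output sizes; the counts in the comments are plain arithmetic based on the shapes defined above:

# Parameter counts per Dense layer (kernel + optional bias):
#   Layer 1: 5 * 64 + 64   = 384
#   Layer 2: 64 * 10 + 10  = 650
#   Layer 3: 10 * 128      = 1280 (use_bias=False, so no bias vector)
model.summary()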

Conclusion

In this Answer, we explored the concept of dense layers in Keras, which play a crucial role in neural networks by capturing complex patterns and relationships in data. With the ability to configure the number of units, activation functions, and other parameters, dense layers provide flexibility and power in building deep learning models for various tasks.

Quick Quiz!

Q: Which parameter in the Keras Dense layer defines the number of neurons in that layer?

A) activation
B) input_shape
C) units
D) use_bias
