Different Types of Neural Network Architecture


The different types of neural network architectures are:

Single Layer Feed Forward Network

  1. In this type of network, there are only two layers, the input layer and the output layer, but the input layer is not counted because no computation is performed in it.
  2. The output layer is formed by applying different weights to the input nodes and taking the cumulative weighted sum at each output node.
  3. The output neurons then apply their activation to these sums to produce the output signals; a minimal code sketch follows the figure below.

Figure: Single Layer Feed Forward Network
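To make the computation concrete, here is a minimal NumPy sketch of a single layer feed forward network. The layer sizes, weights, input values, and sigmoid activation are illustrative assumptions, not anything specified above.

```python
import numpy as np

# Minimal sketch of a single layer feed forward network.
# The input layer only holds the data; all computation happens at the output layer.
rng = np.random.default_rng(0)

n_inputs, n_outputs = 3, 2
W = rng.normal(size=(n_outputs, n_inputs))   # one weight per input-to-output connection
b = np.zeros(n_outputs)                      # one bias per output node

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))          # assumed activation function

x = np.array([0.5, -1.0, 2.0])               # input signals (no computation here)
y = sigmoid(W @ x + b)                       # cumulative weighted sum per output node, then activation
print(y)                                     # output signals of the two output nodes
```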

Multilayer Feed Forward Network

  1. This network has a hidden layer that is internal to the network and has no direct contact with the external environment.
  2. The existence of one or more hidden layers makes the network computationally more powerful.
  3. There are no feedback connections, i.e. outputs of the model are never fed back into it (see the sketch after the figure below).

Figure: Multilayer Feed Forward Network
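A minimal NumPy sketch of a forward pass through such a network is given below; the layer sizes, tanh activation, and variable names are assumptions made only for illustration.

```python
import numpy as np

# Minimal sketch of a multilayer feed forward network with a single hidden layer.
rng = np.random.default_rng(0)

W_hidden = rng.normal(size=(4, 3))   # input (3 nodes) -> hidden (4 nodes)
b_hidden = np.zeros(4)
W_out = rng.normal(size=(2, 4))      # hidden (4 nodes) -> output (2 nodes)
b_out = np.zeros(2)

def forward(x):
    h = np.tanh(W_hidden @ x + b_hidden)   # hidden layer: no direct contact with the outside world
    return W_out @ h + b_out               # signals flow strictly forward; nothing is fed back

print(forward(np.array([0.5, -1.0, 2.0])))
```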

Single Node with Its Own Feedback

  1. When outputs can be directed back as inputs to nodes in the same layer or a preceding layer, the result is a feedback network.
  2. Recurrent networks are feedback networks with a closed loop. The figure below shows a single recurrent network with one neuron whose output is fed back to itself, and a minimal code sketch follows the figure.

Figure: Single Node with Its Own Feedback
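The loop below sketches this idea in NumPy; the weights, bias, input sequence, and tanh activation are assumed values chosen only for illustration.

```python
import numpy as np

# Minimal sketch of a single neuron whose output is fed back as one of its own inputs.
w_in, w_feedback, b = 0.8, 0.5, 0.0              # assumed weights and bias

y = 0.0                                          # previous output, initially zero
for x in [1.0, 0.5, -0.3]:                       # assumed external input at each step
    y = np.tanh(w_in * x + w_feedback * y + b)   # closed loop: y depends on its own past value
    print(y)
```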

Single Layer Recurrent Network

  1. This is a single-layer network with feedback connections, in which a processing element's output can be directed back to itself, to other processing elements, or to both.
  2. A recurrent neural network (RNN) is a class of artificial neural network in which the connections between nodes form a directed graph along a temporal sequence.
  3. This allows it to exhibit dynamic temporal behavior over a time sequence. Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequences of inputs; a minimal code sketch follows the figure below.

Figure: Single Layer Recurrent Network
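The following NumPy sketch shows one possible update step for such a network (an Elman-style recurrence); the sizes, tanh activation, and random input sequence are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a single layer recurrent network: one recurrent layer with feedback.
rng = np.random.default_rng(0)

n_inputs, n_hidden = 3, 4
W_x = rng.normal(size=(n_hidden, n_inputs))   # input -> processing elements
W_h = rng.normal(size=(n_hidden, n_hidden))   # feedback: elements -> themselves and each other
b = np.zeros(n_hidden)

h = np.zeros(n_hidden)                        # internal state (memory)
for x_t in [rng.normal(size=n_inputs) for _ in range(5)]:   # assumed input sequence
    h = np.tanh(W_x @ x_t + W_h @ h + b)      # the state at step t depends on the previous state
print(h)
```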

Multilayer Recurrent Network

  1. In this type of network, a processing element's output can be directed to processing elements in the same layer and in preceding layers, forming a multilayer recurrent network.
  2. The layers perform the same computation for every element of the sequence, with each output depending on the previous computations; an external input is not required at every time step.
  3. The main feature of a multilayer recurrent network is its hidden state, which captures information about the sequence seen so far (see the sketch after the figure below).

Figure: Multilayer Recurrent Network
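A minimal NumPy sketch of a stacked (two-layer) recurrent pass is shown below; the layer sizes, tanh activation, and random inputs are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch of a two-layer (stacked) recurrent network.
# Each recurrent layer keeps its own hidden state and also feeds the next layer.
rng = np.random.default_rng(0)

sizes = [3, 4, 4]                      # input size, then the size of each recurrent layer
W_x = [rng.normal(size=(sizes[i + 1], sizes[i])) for i in range(2)]       # layer-to-layer weights
W_h = [rng.normal(size=(sizes[i + 1], sizes[i + 1])) for i in range(2)]   # feedback weights
states = [np.zeros(sizes[i + 1]) for i in range(2)]                       # one hidden state per layer

for x_t in [rng.normal(size=3) for _ in range(5)]:   # assumed input sequence
    layer_input = x_t
    for i in range(2):
        states[i] = np.tanh(W_x[i] @ layer_input + W_h[i] @ states[i])    # same computation at every step
        layer_input = states[i]                                           # output feeds the next layer
print(states[-1])                      # final hidden state captures information about the sequence
```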

