Different Types of Neural Network Architecture
Posted On Aug. 23, 2020
The different types of neural network architectures are -
Single Layer Feed Forward Network
- In this type of network, there are only two layers, i.e. the input layer and the output layer, but the input layer is not counted because no computation is performed in it.
- The output layer is formed by applying different weights to the input nodes and taking the cumulative weighted effect per node.
- The neurons of the output layer then collectively compute the output signals (see the sketch below).
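The forward pass described above can be written in a few lines. The sketch below is a minimal NumPy illustration; the layer sizes, random weights, and sigmoid activation are assumptions chosen for demonstration, not part of the description above.

```python
# A minimal sketch of a single-layer feed-forward network (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 3, 2                    # layer sizes chosen for illustration
W = rng.normal(size=(n_outputs, n_inputs))    # one weight per input-output connection
b = np.zeros(n_outputs)                       # one bias per output node

def single_layer_forward(x):
    """Weighted sum per output node followed by a sigmoid activation."""
    z = W @ x + b                    # cumulative effect per output node
    return 1.0 / (1.0 + np.exp(-z))  # output signals

x = np.array([0.5, -1.0, 2.0])       # example input vector
print(single_layer_forward(x))
```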
Multilayer Feed Forward Network
- This network has one or more hidden layers that are internal to the network and have no direct contact with the external environment.
- The presence of one or more hidden layers makes the network computationally more powerful.
- There are no feedback connections, i.e. the outputs of the model are not fed back into it (see the sketch below).
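To make the hidden layer concrete, the NumPy sketch below stacks two weight matrices so that signals flow strictly forward; the layer sizes, tanh activation, and random weights are assumptions for demonstration.

```python
# A minimal sketch of a multilayer feed-forward network with one hidden layer (illustrative).
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden, n_outputs = 3, 4, 2       # sizes are illustrative assumptions
W1 = rng.normal(size=(n_hidden, n_inputs))    # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_outputs, n_hidden))   # hidden -> output weights
b2 = np.zeros(n_outputs)

def mlp_forward(x):
    """Signals flow strictly forward: input -> hidden -> output, with no feedback."""
    h = np.tanh(W1 @ x + b1)   # hidden layer, internal to the network
    return W2 @ h + b2         # output layer

x = np.array([0.5, -1.0, 2.0])
print(mlp_forward(x))
```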
Single node with its own feedback
- When outputs can be directed back as inputs to nodes in the same layer or a preceding layer, the result is a feedback network.
- Recurrent networks are feedback networks with a closed loop. The simplest case is a single neuron with feedback to itself (sketched below).
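A minimal sketch of this single-neuron feedback loop follows; the specific weights, tanh activation, and example input sequence are illustrative assumptions.

```python
# A minimal sketch of a single neuron feeding its own output back to itself (illustrative).
import numpy as np

w_in, w_fb, b = 0.8, 0.5, 0.0   # input weight, feedback weight, bias (assumed values)

def run_single_neuron(inputs):
    """At each step the neuron sees the current input plus its own previous output."""
    y = 0.0          # previous output, initially zero
    outputs = []
    for x in inputs:
        y = np.tanh(w_in * x + w_fb * y + b)  # closed loop: y feeds back into the neuron
        outputs.append(float(y))
    return outputs

# The output persists (and decays) after the input goes to zero, because of the feedback.
print(run_single_neuron([1.0, 0.0, 0.0, 0.0]))
```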
Single Layer Recurrent Network
- This network is a single-layer network with a feedback connection, in which a processing element's output can be directed back to itself, to other processing elements, or both.
- A recurrent neural network is a class of artificial neural network in which the connections between nodes form a directed graph along a temporal sequence.
- This allows it to exhibit dynamic temporal behavior for a time sequence. Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequences of inputs (see the sketch below).
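One way to read a single-layer recurrent network is as a vanilla RNN cell whose units feed back to themselves and to each other. The NumPy sketch below follows that reading; the sizes, tanh activation, and random weights are assumptions for illustration.

```python
# A minimal sketch of a single-layer recurrent network (vanilla RNN cell), illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_units = 2, 3                    # sizes are illustrative assumptions
W_x = rng.normal(size=(n_units, n_inputs))  # input -> units
W_h = rng.normal(size=(n_units, n_units))   # units -> units (feedback to self and peers)
b = np.zeros(n_units)

def rnn_layer(sequence):
    """The internal state h acts as memory carried across the time sequence."""
    h = np.zeros(n_units)
    states = []
    for x in sequence:
        h = np.tanh(W_x @ x + W_h @ h + b)  # each unit gets feedback from itself and the others
        states.append(h)
    return states

seq = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
for h in rnn_layer(seq):
    print(h)
```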
Multilayer Recurrent Network
- In this type of network, the output of a processing element can be directed to processing elements in the same layer and in the preceding layer, forming a multilayer recurrent network.
- They perform the same task for every element of the sequence, with the output depending on the previous computations. Inputs are not necessarily needed at each time step.
- The main feature of a multilayer recurrent network is its hidden state, which captures information about the sequence seen so far (see the sketch below).
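The sketch below illustrates one plausible form of a multilayer recurrent network as two stacked recurrent layers, each with its own hidden state; the sizes, tanh activation, and random weights are assumptions, and library implementations of stacked RNNs differ in detail.

```python
# A minimal sketch of a multilayer (two stacked recurrent layers) network, illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_units1, n_units2 = 2, 3, 3        # sizes are illustrative assumptions
W_x1 = rng.normal(size=(n_units1, n_inputs))
W_h1 = rng.normal(size=(n_units1, n_units1))  # recurrence within layer 1
W_x2 = rng.normal(size=(n_units2, n_units1))
W_h2 = rng.normal(size=(n_units2, n_units2))  # recurrence within layer 2

def stacked_rnn(sequence):
    """Two hidden states carry information about the sequence through time."""
    h1 = np.zeros(n_units1)
    h2 = np.zeros(n_units2)
    outputs = []
    for x in sequence:
        h1 = np.tanh(W_x1 @ x + W_h1 @ h1)   # layer 1: current input plus its own previous state
        h2 = np.tanh(W_x2 @ h1 + W_h2 @ h2)  # layer 2: layer-1 output plus its own previous state
        outputs.append(h2)
    return outputs

seq = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
for y in stacked_rnn(seq):
    print(y)
```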