
Dense (FFNN)

TODO

A dense layer, also known as a fully connected layer, is one of the fundamental building blocks of neural networks. In a dense layer, each neuron is connected to every neuron in the preceding layer.

This structure allows the layer to combine all input features through learned linear combinations, making it especially suitable for feature integration and final prediction stages.
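As a minimal sketch of this full connectivity (using NumPy, with illustrative layer sizes): a dense layer with 3 inputs and 2 outputs stores one weight per (input, output) pair, so every output neuron combines all input features.

```python
import numpy as np

rng = np.random.default_rng(0)

# 3 input features, 2 output neurons: the weight matrix has one
# entry per (input, output) pair, so every neuron sees every input.
W = rng.standard_normal((3, 2))
b = np.zeros(2)

x = np.array([1.0, 2.0, 3.0])  # one input sample
y = x @ W + b                  # each output is a learned combination of all inputs
print(y.shape)                 # (2,)
```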

Math Explanation

Let the input be a tensor $X \in \mathbb{R}^{B \times D_{in}}$ where:

  • $B$ is the batch size,

  • $D_{in}$ is the input feature dimension.

A dense layer performs a linear projection of the input to a new dimension $D_{out}$ and applies an activation function; the output is therefore $O \in \mathbb{R}^{B \times D_{out}}$.

Formally, the output is computed as

$$O = \varphi(XW + b)$$

where $W \in \mathbb{R}^{D_{in} \times D_{out}}$ is the weight matrix, $b \in \mathbb{R}^{D_{out}}$ is the bias vector, and $\varphi$ is the activation function (e.g. ReLU or tanh).
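The forward pass above can be sketched directly in NumPy (the function name `dense_forward`, the tanh activation, and the layer sizes are illustrative choices, not from the original):

```python
import numpy as np

def dense_forward(X, W, b, activation=np.tanh):
    """Dense layer forward pass: O = activation(X W + b)."""
    return activation(X @ W + b)

B, D_in, D_out = 4, 5, 3  # illustrative batch and layer sizes
rng = np.random.default_rng(42)
X = rng.standard_normal((B, D_in))    # input batch
W = rng.standard_normal((D_in, D_out))  # weight matrix
b = np.zeros(D_out)                   # bias vector

O = dense_forward(X, W, b)
print(O.shape)  # (4, 3)
```

Note that the batch dimension $B$ passes through unchanged: the same weights are applied to every sample in the batch.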
