Leaky ReLU

Leaky ReLU is a variant of the ReLU activation function that introduces a small, non-zero slope for negative inputs. It was introduced to address the dying ReLU problem, where neurons that receive negative inputs get a gradient of zero and therefore stop updating during training.

Leaky ReLU Formula

Leaky ReLU(x) = max(αx, x)

where α is a small positive constant (commonly 0.01) that sets the slope for negative inputs. Equivalently, Leaky ReLU(x) = x for x ≥ 0 and αx for x < 0, so the gradient for negative inputs is α rather than zero.
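A minimal sketch of the formula in NumPy; the default α = 0.01 is an assumed (though common) choice, not something fixed by the definition above.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # max(alpha * x, x): for x >= 0 this returns x,
    # for x < 0 it returns alpha * x (small but non-zero),
    # so the gradient never collapses to exactly zero.
    return np.maximum(alpha * x, x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.02  -0.005  0.  1.  3.]
```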


References


Related Notes