Leaky ReLU
Leaky ReLU is a variant of the ReLU activation function that introduces a small, non-zero slope for negative inputs. It was introduced to address the dying ReLU problem, where the gradient is zero for all negative inputs, so affected neurons stop updating and effectively stop learning.
Leaky ReLU Formula
$$
\mathrm{LeakyReLU}(x) =
\begin{cases}
x & \text{if } x > 0 \\
\alpha x & \text{otherwise}
\end{cases}
$$

- Here $\alpha$ is a hyperparameter which is usually a very small value in $[0, 1]$ (a common choice is 0.01).
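To make the formula concrete, here is a minimal NumPy sketch of Leaky ReLU. The function name `leaky_relu` and the default `alpha=0.01` are illustrative choices, not taken from the text above:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: passes positive inputs through unchanged,
    scales negative inputs by the small slope alpha."""
    return np.where(x > 0, x, alpha * x)

# Example usage
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.02  -0.005  0.     1.     3.   ]
```

Note that the negative values are scaled by `alpha` rather than clamped to zero, so a small gradient (`alpha`) still flows through negative inputs during backpropagation, which is exactly what prevents the dying ReLU problem.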