Second Order Derivative or Hessian Matrix

The second-order derivative is the derivative of the derivative. For a multivariate function, the matrix of all second-order partial derivatives is known as the Hessian matrix.
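Concretely, for a twice-differentiable function $f : \mathbb{R}^n \to \mathbb{R}$ (standard notation, not specific to this note), the Hessian is the $n \times n$ matrix of second-order partial derivatives:

$$
H_f(x)_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}, \qquad i, j = 1, \dots, n
$$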

Optimization algorithms that use second-order derivatives include Newton's method and quasi-Newton methods such as BFGS and L-BFGS. Their trade-offs are listed below, with a minimal Newton-step sketch after the lists.

Pros

  1. Better estimate of the local curvature of the loss function
  2. Faster convergence (quadratic near the optimum, versus the linear rate of gradient descent)
  3. Can reach a more accurate solution

Cons

  1. Computational cost is high, since forming and inverting the Hessian is expensive, though quasi-Newton methods aim to address this
  2. Memory is needed to store the Hessian matrix, which is n × n for n parameters
  3. Can be drawn to saddle points, since the Newton update seeks any stationary point
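A minimal sketch of a pure Newton step in Python, not taken from this note: the Rosenbrock test function with a hand-coded gradient and Hessian is used only for illustration (in practice these could come from automatic differentiation).

```python
import numpy as np

def f(x):
    # Rosenbrock function, a standard non-convex test problem with minimum at (1, 1)
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    # First-order derivatives (the gradient vector)
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

def hessian(x):
    # Second-order derivatives (the Hessian matrix)
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

x = np.array([-1.2, 1.0])  # classic starting point
for _ in range(20):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:
        break
    # Newton step: solve H @ step = -g rather than inverting H explicitly
    step = np.linalg.solve(hessian(x), -g)
    x = x + step

print(x, f(x))  # should be close to the minimum at (1, 1)
```

Quasi-Newton methods avoid forming this Hessian explicitly by building an approximation from gradient differences; for example, `scipy.optimize.minimize` with `method="BFGS"` or `method="L-BFGS-B"` only needs the function and its gradient.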

Related Notes