Linear Regression
Linear Regression Formula
- It is called Linear Regression because the prediction is a linear combination of the parameters (the intercept β₀ and the coefficients β₁, ..., βₙ): y = β₀ + β₁x₁ + ... + βₙxₙ + ε
- There are different types of Linear Regression (e.g., Simple Linear Regression with one feature, Multiple Linear Regression with several)
- The parameters β₀, β₁, ..., βₙ can be computed by setting the derivative of the loss (typically the sum of squared errors) to 0 - the closed-form solution (the normal equation) comes from exactly that step
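The normal-equation step above can be sketched in NumPy. This is a minimal illustration with made-up data values (not from the notes):

```python
import numpy as np

# Toy data roughly following y = 2x + 1 (illustrative values)
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])          # first column of ones is the intercept term
y = np.array([3.1, 4.9, 7.2, 8.8])

# Setting the derivative of the squared loss to 0 gives the normal equation:
#   (X^T X) beta = X^T y
# In practice np.linalg.lstsq or pinv is preferred for numerical stability.
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)  # intercept ~ 1.15, slope ~ 1.94
```

Solving the 2x2 system directly mirrors the derivation; library routines like `np.linalg.lstsq` do the same job more robustly on ill-conditioned data.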
Advantages of Linear Regression
- Simple
- Works well when the relationship between the features and the target is (approximately) linear
- Interpretable
Disadvantages of Linear Regression
- A lot of feature engineering is often needed, since it can only capture relationships you encode as features
- Handles missing data poorly
- Sensitive to outliers (the squared loss lets a single extreme point pull the fit)
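The outlier sensitivity can be demonstrated with a small sketch (synthetic data, all values made up): corrupting a single point visibly changes the fitted slope.

```python
import numpy as np

def fit_line(x, y):
    # Least-squares fit of y = b0 + b1*x via the normal equation
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.solve(X.T @ X, X.T @ y)

x = np.arange(10, dtype=float)
y = 2 * x + 1                     # perfectly linear data, slope exactly 2

clean = fit_line(x, y)

y_out = y.copy()
y_out[-1] = 100.0                 # corrupt just one observation
dirty = fit_line(x, y_out)

print(clean[1], dirty[1])         # slope jumps from 2 to ~6.4
```

Robust alternatives (e.g., Huber loss or RANSAC) are the usual remedies when outliers cannot be cleaned away.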
Basic Assumptions of Linear Regression
- Linearity: The relationship between the independent variables (X) and the dependent variable (Y) has to be linear.
- No or little Multicollinearity: The independent variables (X) should not be highly correlated with each other (pairwise).
- Normality of Residuals: The residuals should be normally distributed; this matters mainly for the confidence intervals and hypothesis tests on the coefficients.
- Independence: The observations (X, Y) are independent of each other.
- Homoscedasticity: The variance of the errors is constant across all levels of the independent variable.
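Some of these assumptions can be checked numerically. The sketch below, on synthetic data (all names and values are illustrative, not prescribed by the notes), looks at pairwise collinearity, the residual mean, and whether residual spread stays constant across fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                 # generated independently of x1 -> low collinearity
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Multicollinearity check: pairwise correlation between predictors should be small
corr = np.corrcoef(x1, x2)[0, 1]
print("corr(x1, x2) =", corr)

# Fit by least squares, then inspect the residuals
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# With an intercept in the model, residuals have mean ~0 by construction
print("residual mean =", resid.mean())

# Homoscedasticity check: residual spread should be similar for low vs high fitted values
fitted = X @ beta
lo = fitted < np.median(fitted)
print("resid std (low vs high fitted):", resid[lo].std(), resid[~lo].std())
```

For formal diagnostics (Q-Q plots, VIF, Breusch-Pagan test), libraries such as statsmodels provide ready-made tools; the checks above are only quick first-pass signals.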