Simple and Complete Tutorial on Simple Linear Regression

Linear regression is often the first stepping stone into machine learning. So, rather than being content with a simple explanation, use it as an opportunity to refresh concepts from linear algebra, statistics, and calculus.

Simple linear regression and multiple linear regression are important, widely used algorithms.

So, here are the steps needed to master simple linear regression.

Topics Covered In This Tutorial

  1. A simple explanation of linear regression
  2. How to prepare your data for linear regression
  3. Why and when you should use linear regression
  4. The least squares method for solving linear regression

Definition in less than 100 words

Simple linear regression assumes a linear relationship between a dependent variable Y and an independent variable X, such that the relationship between the two variables can be represented by a straight line (called the regression line):

Y = mX + c

where m is the slope of the line and c is the intercept (the value of Y when X = 0).
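For instance, with hypothetical values m = 2 and c = 1, each unit increase in X raises Y by 2. A minimal Python sketch:

```python
# Toy illustration of the line Y = mX + c with hypothetical values.
m = 2.0  # slope: how much Y changes per unit increase in X
c = 1.0  # intercept: the value of Y when X = 0

def predict(x):
    """Predict Y for a given X using the line Y = mX + c."""
    return m * x + c

print(predict(3.0))  # 2.0 * 3.0 + 1.0 = 7.0
```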

Preprocessing Of Data Before Getting Started With Linear Regression

1. Remove Noise. Linear regression assumes that your input and output variables are not noisy. Hence, check your data for outliers and consider removing or correcting them before fitting.
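A minimal sketch of one common screening approach, the IQR rule, on hypothetical toy data (the 1.5 × IQR cutoff is a convention, not part of the regression algorithm itself):

```python
import numpy as np

# Flag points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] as outliers.
x = np.array([1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 25.0])  # toy data; 25.0 is an outlier

q1, q3 = np.percentile(x, [25, 75])
iqr = q3 - q1
mask = (x >= q1 - 1.5 * iqr) & (x <= q3 + 1.5 * iqr)

print(x[mask])  # the extreme value 25.0 is dropped
```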

2. Remove Collinearity. Linear regression will overfit your data when you have highly correlated input variables. Consider calculating pairwise correlations for your input data and removing the most correlated.

If you are doing multiple linear regression, make sure the input variables are not strongly correlated with one another.
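A minimal sketch of checking pairwise correlations with pandas; the column names and the 0.9 threshold here are hypothetical:

```python
import pandas as pd

# Toy data frame of input variables.
df = pd.DataFrame({
    "x1": [1, 2, 3, 4, 5],
    "x2": [2, 4, 6, 8, 10],  # perfectly correlated with x1
    "x3": [5, 3, 8, 1, 9],
})

corr = df.corr()  # pairwise Pearson correlations

# Flag pairs whose absolute correlation exceeds the chosen threshold;
# one variable from each flagged pair is a candidate for removal.
threshold = 0.9
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        if abs(corr.loc[a, b]) > threshold:
            print(f"{a} and {b} are highly correlated: {corr.loc[a, b]:.2f}")
```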

3. Gaussian Distributions. Linear regression will make more reliable predictions if your input and output variables have a Gaussian distribution. 

You may get some benefit from using transforms (e.g. log or Box-Cox) on your variables to make their distribution look more Gaussian.
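A minimal sketch using SciPy on toy log-normal data; note that both transforms require strictly positive values:

```python
import numpy as np
from scipy import stats

# Toy right-skewed data drawn from a log-normal distribution.
rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)

x_log = np.log(x)                # simple log transform
x_boxcox, lam = stats.boxcox(x)  # Box-Cox picks lambda by maximum likelihood

print(f"skew before: {stats.skew(x):.2f}")
print(f"skew after log: {stats.skew(x_log):.2f}")
print(f"skew after Box-Cox (lambda={lam:.2f}): {stats.skew(x_boxcox):.2f}")
```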

4. Rescale Inputs: Linear regression will often make more reliable predictions if you rescale input variables using standardization or normalization.
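A minimal sketch of both options on a toy array (in practice, scikit-learn's StandardScaler and MinMaxScaler do the same job):

```python
import numpy as np

x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Standardization: rescale to zero mean and unit variance.
x_std = (x - x.mean()) / x.std()

# Normalization: squash values into the range [0, 1].
x_norm = (x - x.min()) / (x.max() - x.min())

print(x_std)
print(x_norm)
```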

Why you need to learn this algorithm

  • It is easy to use and computationally inexpensive.
  • It is easy to implement on a computer using commonly available algorithms from linear algebra.
  • Its implementation on modern computers is efficient, so it can be very quickly applied even to problems with hundreds of features and tens of thousands of data points.
  • It is easier to analyze mathematically than many other regression techniques.
  • It is not too difficult for non-mathematicians to understand at a basic level.
  • It produces solutions that are easily interpretable (i.e. we can interpret the constants that least squares regression solves for).
  • It is the optimal technique in a certain sense in certain special cases. In particular, if the system being studied truly is linear with additive uncorrelated normally distributed noise (of mean zero and constant variance) then the constants solved for by least-squares are in fact the most likely coefficients to have been used to generate the data.

Least Squares Method For Solving Linear Regression

For any simple linear regression model, there are infinitely many choices for the parameters m and c. How do we choose the most suitable pair?

This is where the notion of error comes in: we want the predictions of our model to be as close to the observed values as possible.

For the simple linear regression (SLR) model, the error is measured by the mean squared error (MSE): the average of the squared vertical distances between each observed point and the fitted line.

MSE = (1/n) Σ (Yi − (m·Xi + c))²

[Figure: the data points (red), the least-squares line of best fit (blue), and the residuals (green). Image from Wikipedia.]

We minimize the error by taking the partial derivatives of the MSE with respect to m and c, setting each equal to zero, and solving. This yields the well-known closed-form solutions:

m = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)²

c = Ȳ − m·X̄

where X̄ and Ȳ are the means of X and Y.
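Putting it together, a minimal NumPy sketch that fits m and c with these closed-form solutions on hypothetical toy data:

```python
import numpy as np

# Toy data, roughly following y = 2x + 1 with a little noise.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.8, 5.1, 7.2, 8.9, 11.1])

# Closed-form least-squares solutions derived above.
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - m * x.mean()

# Evaluate the fit with the mean squared error.
y_pred = m * x + c
mse = np.mean((y - y_pred) ** 2)

print(f"m = {m:.3f}, c = {c:.3f}, MSE = {mse:.4f}")
```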


