Introduction:
This blog covers the fundamentals of linear regression. It explains the mathematics behind linear regression using a simple linear equation such as Y = mX + c, and is split into two parts: (1) simple linear regression and (2) multivariate linear regression. Linear regression fits many everyday prediction problems, for example:
- Employees' salary prediction based on experience.
- Vehicles' top speed based on their market values.
- Power consumption based on the number of family members.
- House price prediction using carpet area, area code, and other features.
- Future population estimation using parameters of the current lifestyle.
- Company growth based on expenses, market conditions, and more.
1. Simple Linear Regression.
Simple linear regression predicts an output from a single input feature using two coefficients. The basic equation for linear regression is:
Y = mX + c
Here, m and c are learnable parameters that are updated over the course of training.
Simple linear regression boils down to two steps:
- Calculate the mean, variance, and covariance of the input feature.
- Estimate the coefficients.
Calculate the mean, variance, and covariance of the input feature:
- mean:
`mean = {\frac {1}{n}}\sum _{i=1}^{n}x_{i}`
- variance:
`variance = \sum_{i=1}^{n} (x_{i} - \overline{x})^{2}`
- covariance:
`covariance = \sum_{i=1}^{n} (x_{i} - \overline{x}) * (y_{i} - \overline{y})`
Note that the variance and covariance are left unnormalized here; the 1/n factors cancel when the slope is computed as their ratio.
The above equations can be written as Python functions as follows:
- mean:
mu = lambda x: sum(x) / float(len(x))
- variance:
var = lambda x, mean: sum([(val - mean)**2 for val in x])
- covariance:
def calculate_covariance(x, y, mean_x, mean_y):
    covariance = 0
    for xi, yi in zip(x, y):
        covariance += (xi - mean_x) * (yi - mean_y)
    return covariance
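As a quick sanity check (a made-up toy example, not from the original post), the helpers can be exercised on a tiny dataset:
x = [1, 2, 3]
y = [2, 4, 6]
mean_x, mean_y = mu(x), mu(y)
print(mean_x)                                      # 2.0
print(var(x, mean_x))                              # 2.0
print(calculate_covariance(x, y, mean_x, mean_y))  # 4.0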
Estimate coefficients:
We can find the coefficients m and c using the above functions. In line-equation terminology, m is the slope of the line and c is the bias, i.e. the intercept where the line crosses the y-axis.
The slope can be calculated as follows:
`Slope = \frac{covariance}{variance_{x}}`
The bias can be found very easily from the line equation Y = slope*X + bias. Solving for the bias gives bias = Y - slope*X, and substituting the means of X and Y yields the estimate bias = mean_y - slope*mean_x.
The Python implementation to estimate the slope and bias values can be written as follows:
slope = covariance/variance_x
bias = mean_y - (slope * mean_x)
After getting the slope and bias values, estimation can be performed with the same line equation (Y = mX + c) by plugging in the slope, the bias, and the feature value.
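Putting the pieces together, here is a minimal end-to-end sketch; it reuses the mu, var, and calculate_covariance helpers from above, and the function name fit_simple and the toy dataset are illustrative assumptions, not from the original post:
def fit_simple(x, y):
    # Means of the input feature and the target.
    mean_x, mean_y = mu(x), mu(y)
    # Slope = covariance / variance (the 1/n factors cancel).
    slope = calculate_covariance(x, y, mean_x, mean_y) / var(x, mean_x)
    # Bias = mean_y - slope * mean_x, from Y = slope*X + bias.
    bias = mean_y - slope * mean_x
    return slope, bias

# Toy data generated from y = 2x + 1.
x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]
slope, bias = fit_simple(x, y)
print(slope, bias)       # 2.0 1.0
print(slope * 6 + bias)  # prediction for x = 6 -> 13.0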
2. Multivariate Linear Regression.
Multivariate linear regression extends the same idea to multiple input features and involves two parts:
- Estimator.
- Stochastic gradient descent.
1. Estimator
The estimator is a simple linear function: each feature is multiplied by its respective weight, and these products are summed together with a bias value.
So, the estimator function can be written in mathematical form as follows:
`y = b + \sum_{i=1}^{n} w_{i}*x_{i}`
The above linear function can be written as a Python function as follows:
def estimate(x, w):
    return w[0] + sum([wi*xi for wi, xi in zip(w[1:], x)])
Here, x and w are 1D arrays (vectors) that contain the feature values and the feature weights respectively, with the bias stored in w[0].
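As a quick illustration (with made-up weights, purely hypothetical), calling the estimator on two features:
print(estimate([2.0, 3.0], [1.0, 0.5, 0.25]))  # 1.0 + 0.5*2.0 + 0.25*3.0 = 2.75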
2. Stochastic gradient descent
The stochastic gradient descent method can be broken down into 3 steps: (1) run the estimation, (2) calculate the error, and (3) update the weights according to the generated error.
The error can be calculated as the difference between the estimated value and the true value:
`Err = \hat{y} - y`
To update the weights, the error scaled by each feature value is subtracted from the current weight at a predefined learning rate (lr):
`w_{i} := w_{i} - lr * Err * x_{i}`
`b := b - lr * Err`
The above steps can be combined into a training function in Python.
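Here is a minimal sketch of such a training loop, assuming the estimate function defined above; the name train_sgd, the learning rate, and the epoch count are illustrative choices, not from the original post:
def train_sgd(X, Y, lr=0.02, epochs=2000):
    # One weight per feature, plus the bias stored in w[0].
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(X, Y):
            # (1) Run the estimation.
            y_hat = estimate(x, w)
            # (2) Calculate the error.
            err = y_hat - y
            # (3) Update the bias and each weight by the error scaled by lr.
            w[0] -= lr * err
            for i, xi in enumerate(x):
                w[i + 1] -= lr * err * xi
    return w

# Toy data generated from y = 1 + 2*x1 + 3*x2.
X = [[1, 2], [2, 1], [3, 3], [4, 2]]
Y = [9, 8, 16, 15]
print(train_sgd(X, Y))  # approaches [1.0, 2.0, 3.0]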