# Linear Regression — Detailed View

Linear regression is used to find a linear relationship between a target and one or more predictors. There are two types of linear regression: simple and multiple.

Regression analysis is commonly used for modeling the relationship between a single dependent variable Y and one or more predictors. When we have one predictor, we call this "simple" linear regression. It looks for a statistical relationship, not a deterministic one. The relationship between two variables is said to be deterministic if one variable can be accurately expressed by the other; for example, from a temperature in degrees Celsius it is possible to compute the temperature in kelvin exactly. A statistical relationship is not exact; consider, for example, the relationship between height and weight.

(figure: a real-life example)

### For a model with one predictor, the coefficient and intercept can be defined as

b1 = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²

b0 = ȳ − b1 · x̄

where x̄ and ȳ are the means of the predictor and target values.

### Generally,

• If b1 > 0, the relationship between the two continuous variables is positive: as one increases, the other also increases.
• If b1 < 0, the relationship between the two continuous variables is negative: as one increases, the other decreases.
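As a quick check of these rules, here is a minimal sketch (with made-up data, not values from the post) that computes b1 and b0 for a single predictor:

```python
import numpy as np

# Small illustrative data set (values assumed for demonstration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])

# b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# b0 = y_bar - b1 * x_bar
b0 = y.mean() - b1 * x.mean()

print(b1, b0)  # b1 > 0 here, so the relationship is positive
```

Since b1 comes out positive for this data, y increases as x increases, matching the first rule above.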

### To find the value of θ that minimizes the cost function, there is a closed-form solution, a mathematical equation that gives the result directly. This is called the Normal Equation.

θ = (XᵀX)⁻¹ Xᵀ y

• θ is the value that minimizes the cost function
• y is the vector of target values
• X is the matrix of feature values, with a column of 1s added for the intercept term

### Below is the Python implementation for theta_best

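The original code listing did not survive extraction; here is a minimal NumPy sketch of the Normal Equation. The data-generating line (y = 4 + 3x plus Gaussian noise) is an assumption for illustration, not from the post:

```python
import numpy as np

# Illustrative linear data: y = 4 + 3x + noise (assumed values).
np.random.seed(42)
m = 100
X = 2 * np.random.rand(m, 1)
y = 4 + 3 * X + np.random.randn(m, 1)

# Add x0 = 1 to each instance so the intercept is learned as theta[0].
X_b = np.c_[np.ones((m, 1)), X]

# theta_best = (X^T X)^{-1} X^T y
theta_best = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
print(theta_best)  # close to [[4.], [3.]]; the noise shifts it a bit
```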

>>> y = 0.5 * X**2 + X + 2 + np.random.rand(m, 1)

(figure: the generated nonlinear, noisy data set)

### It's clear that a straight line will never fit this data properly, so let's use PolynomialFeatures from scikit-learn to transform our data, adding the square of each feature in the training set as a new feature.

>>> from sklearn.preprocessing import PolynomialFeatures
>>> poly_features = PolynomialFeatures(degree=2, include_bias=False)
>>> X_poly = poly_features.fit_transform(X)
>>> X[0]
array([-0.75275929])
>>> X_poly[0]
array([-0.75275929, 0.56664654])
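To complete the example, a sketch that regenerates quadratic data of the same shape (the exact range of X is an assumption; the post does not show it) and fits a linear model to the transformed features:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Quadratic, noisy data: y = 0.5x^2 + x + 2 + uniform noise.
np.random.seed(42)
m = 100
X = 6 * np.random.rand(m, 1) - 3  # assumed range for illustration
y = 0.5 * X**2 + X + 2 + np.random.rand(m, 1)

# Add x^2 as a new feature, then fit an ordinary linear model.
poly_features = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly_features.fit_transform(X)  # columns: [x, x^2]

lin_reg = LinearRegression()
lin_reg.fit(X_poly, y)
print(lin_reg.intercept_, lin_reg.coef_)
# intercept near 2.5 (2 plus the 0.5 mean of the uniform noise),
# coefficients near [1, 0.5]
```

The model recovers the original quadratic coefficients because the transformed features make the relationship linear in the parameters.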

### This gives information about how far the estimated regression line is from the horizontal "no relationship" line (the average of the actual output).

SSR = Σ(ŷᵢ − ȳ)²

### How much the target values vary around the regression line (the predicted values).

SSE = Σ(yᵢ − ŷᵢ)²
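Putting the two quantities together shows how they combine into R². A minimal sketch with made-up data and a least-squares line fitted to it:

```python
import numpy as np

# Made-up data (assumed values) and a least-squares line fitted to it.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x  # predicted values on the regression line

ssr = np.sum((y_hat - y.mean()) ** 2)  # regression line vs. the "no relationship" line
sse = np.sum((y - y_hat) ** 2)         # targets vs. the regression line
r_squared = ssr / (ssr + sse)          # for a least-squares fit, SST = SSR + SSE
print(ssr, sse, r_squared)
```

An R² near 1 means most of the variation in y is explained by the regression line rather than left in the residuals.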

### Thank you!
