Intellipaat
+2 votes
in Machine Learning by (200 points)

We use logistic regression to predict the value of a categorical outcome, but I believe linear regression is used to predict a continuous output value given the input values.

Please explain the difference between these two methodologies.

2 Answers

+2 votes
by (10.9k points)

@varsha, the basic differences between logistic regression and linear regression are as follows:

1. Linear regression is a regression model, which means it gives a continuous output. Logistic regression is (in its basic form) a binary classification algorithm, which means it gives a discrete output.

2. In linear regression, residuals are assumed to be normally distributed, whereas in logistic regression, residuals need to be independent but not normally distributed.

3. Linear regression is used when the response variable is continuous; logistic regression is used when the response variable is categorical.

4. Linear regression fits a degree-1 equation: y = mx + c. Logistic regression passes that same linear score through the sigmoid function, giving an equation of the form: y = 1 / (1 + e^-(mx + c)).

5. In linear regression, coefficient interpretation is straightforward (a unit change in x shifts y by the coefficient), whereas in logistic regression coefficients are on the log-odds scale and must be exponentiated into odds ratios to be interpreted.
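The contrast in points 1 and 4 can be sketched directly. The coefficients m and c below are made up purely for illustration, not taken from any fitted model:

```python
import numpy as np

# Illustrative coefficients (not from any fitted model)
m, c = 2.0, 1.0
x = 0.5

# Linear regression: the degree-1 equation gives a continuous output
y_linear = m * x + c

# Logistic regression: the sigmoid squashes the same linear score
# into (0, 1), which is then thresholded into a discrete class
y_prob = 1 / (1 + np.exp(-(m * x + c)))
y_class = int(y_prob >= 0.5)

print(y_linear, y_prob, y_class)  # 2.0, ~0.88, 1
```

The same linear score mx + c feeds both models; only the final transformation (identity vs. sigmoid plus threshold) differs.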

0 votes
by (33.1k points)

Linear Regression:

Linear regression is useful for finding the relationship between two continuous variables: one is the predictor (independent) variable and the other is the response (dependent) variable.

The main task of linear regression is to obtain the line that best fits the data. The best-fit line is the one for which the total prediction error over all data points is as small as possible, where the error is the vertical distance from a point to the regression line. You can implement linear regression in scikit-learn using the LinearRegression class from the sklearn.linear_model module.


from sklearn.linear_model import LinearRegression

cls = LinearRegression()
cls.fit(X_train, y_train)
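As a self-contained sketch, here is the same call with invented toy data (X_train, y_train, and the test input are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data generated from y = 3x + 2, so the fit should recover
# a slope near 3 and an intercept near 2
X_train = np.array([[1.0], [2.0], [3.0], [4.0]])
y_train = np.array([5.0, 8.0, 11.0, 14.0])

cls = LinearRegression()
cls.fit(X_train, y_train)

# Continuous prediction for a new input
print(cls.coef_[0], cls.intercept_, cls.predict([[5.0]])[0])
```

Because the toy data lies exactly on a line, the learned slope, intercept, and prediction are exact: 3.0, 2.0, and 17.0.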


Logistic regression:

Logistic Regression is used when the dependent variable(target) is categorical.

For example,

  • To predict whether an email is spam (1) or not (0)

  • Whether the tumor is malignant (1) or not (0)


from sklearn.linear_model import LogisticRegression

clf = LogisticRegression()
clf.fit(X_train, y_train)
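A runnable sketch in the same spirit, with invented one-feature spam-style data (the feature values and labels are illustrative only):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: feature values below ~2.5 are class 0 (not spam),
# values above are class 1 (spam)
X_train = np.array([[0.5], [1.0], [2.0], [3.0], [4.0], [4.5]])
y_train = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression()
clf.fit(X_train, y_train)

# Discrete class labels, and the probabilities behind them
print(clf.predict([[0.5], [4.5]]))
print(clf.predict_proba([[4.5]]))
```

Unlike LinearRegression, predict here returns class labels (0 or 1), while predict_proba exposes the underlying sigmoid probabilities.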


Difference between error minimization techniques:

  • Linear regression is usually solved by minimizing the least-squares error of the model to the data, therefore large errors are penalized quadratically.

  • Logistic regression is just the opposite: using the logistic loss function causes large errors to be penalized at an asymptotically linear (rather than quadratic) rate, so extreme errors influence the fit far less.
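The two penalties can be compared numerically. This is a minimal sketch; the error magnitudes are arbitrary:

```python
import numpy as np

def squared_loss(error):
    # least-squares penalty: grows quadratically with the error
    return error ** 2

def logistic_loss(error):
    # log(1 + e^error): grows only linearly for large errors,
    # so its gradient flattens toward a constant
    return np.log1p(np.exp(error))

for e in [1.0, 5.0, 10.0]:
    print(e, squared_loss(e), logistic_loss(e))
```

Doubling the error from 5 to 10 quadruples the squared loss (25 to 100) but only roughly doubles the logistic loss, which is why outliers dominate least-squares fits much more strongly.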

