in Machine Learning

Can anyone explain feature scaling in Machine learning?

1 Answer


In Machine Learning, feature scaling is the technique of bringing all the features onto the same scale. If the features are not scaled, the model tends to give higher weight to features with larger numeric ranges and lower weight to features with smaller ranges, regardless of their units. In short, feature scaling brings continuous variables onto a common scale.

For example, suppose student A scored 60 out of 100 in subject 1, 120 out of 150 in subject 2, and 180 out of 200 in subject 3. After rescaling every subject to a scale of 10, student A has 6 out of 10 in subject 1, 8 out of 10 in subject 2, and 9 out of 10 in subject 3.

There are two common feature scaling techniques used in Machine Learning: Min-Max normalization and Standardization.

Min-Max Normalization scales all the continuous variables to the range between 0 and 1:

X_scaled = (X – min(X)) / (max(X) – min(X))
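
As a quick illustration, here is a minimal sketch of Min-Max normalization using scikit-learn's MinMaxScaler (the sample values below are made up for the example):

# Minimal sketch of Min-Max normalization (sample data is made up)
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[60.0], [120.0], [180.0]])   # one feature, three samples
scaler = MinMaxScaler()                    # scales each column to the range [0, 1]
X_scaled = scaler.fit_transform(X)
print(X_scaled)                            # [[0. ], [0.5], [1. ]]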

The Standardization technique rescales all the continuous variables so that they have mean 0 and standard deviation 1:

X_new = (X_i – mean(X)) / std(X)
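
Similarly, here is a minimal sketch of Standardization using scikit-learn's StandardScaler on the same made-up data:

# Minimal sketch of Standardization (sample data is made up)
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[60.0], [120.0], [180.0]])
scaler = StandardScaler()                  # centers to mean 0 and scales to standard deviation 1
X_std = scaler.fit_transform(X)
print(X_std.mean(), X_std.std())           # approximately 0.0 and 1.0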


