
0 votes
2 views
in Data Science by (18.4k points)

I am trying to write a function that calculates the mean squared error from y (true values) and y_pred (predicted values) without using sklearn or other existing implementations.

Here is what I have tried:

def mserror(y, y_pred):
    i = 0
    for i in range(len(y)):
        i += 1
        mse = ((y - y_pred) ** 2).mean(y)
        return mse

How to fix it?

1 Answer

0 votes
by (36.8k points)

You are modifying the loop index for no reason: the for loop increments it anyway. You are also never using the index (for example, there is no y[i] - y_pred[i] anywhere), so you don't need the loop at all. In addition, .mean(y) is wrong — mean takes an axis argument, not the data — and the return inside the loop exits after the first iteration.

Operate on the arrays directly instead:

import numpy as np

mse = np.mean((y - y_pred) ** 2)
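Putting it together, a minimal sketch of the corrected function (assuming y and y_pred are equal-length sequences or NumPy arrays):

```python
import numpy as np

def mserror(y, y_pred):
    """Mean squared error between true and predicted values."""
    y = np.asarray(y, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Vectorised: square the element-wise residuals, then average them.
    return np.mean((y - y_pred) ** 2)

print(mserror([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]))  # residuals (0, 0, -1) -> 1/3
```

np.asarray makes the function accept plain Python lists as well, which the original loop-based version would also have needed.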

Do check out the Data Science with Python course, which helps you learn the subject from scratch.
