
The terms *cost function* and *loss function* are often used interchangeably, but there is a subtle distinction. The loss function measures the error for a single training example, whereas the cost function measures the penalty over the entire training set (or a batch of examples). The loss function is also sometimes called an error function.

In short, the loss function is a building block of the cost function: the cost is typically computed as the average of the per-example losses. The loss is a value calculated for every individual instance.

So, in a single training pass the loss is calculated once per example (i.e., many times), but the cost is calculated only once for the whole set or batch.
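To make the distinction concrete, here is a minimal sketch in plain Python using squared error as the loss (the function names `loss` and `cost` are illustrative, not from any specific library):

```python
def loss(y_true, y_pred):
    # Loss: error for a SINGLE training example (squared error here).
    return (y_true - y_pred) ** 2

def cost(y_true_batch, y_pred_batch):
    # Cost: the AVERAGE of the per-example losses over the whole batch.
    losses = [loss(yt, yp) for yt, yp in zip(y_true_batch, y_pred_batch)]
    return sum(losses) / len(losses)

y_true = [1.0, 2.0, 3.0]
y_pred = [1.5, 2.0, 2.0]
# Per-example losses are [0.25, 0.0, 1.0]; their mean is about 0.4167.
print(cost(y_true, y_pred))
```

Here `loss` is evaluated three times (once per example), while `cost` is evaluated once, which is exactly the distinction described above.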

You can learn more about cost and loss functions by enrolling in an ML course.