
I am reading A Tutorial on Energy-Based Learning and I am trying to understand the difference between four terms (objective, loss, cost, and energy functions) in the context of SVMs. This link summarizes the differences between a loss, a cost, and an objective function. Based on my understanding:

Objective function: Something we want to minimize. For example ||w||^2 for SVM.

Loss function: Penalty between prediction and label which is also equivalent to the regularization term. An example is the hinge loss function in SVM.

Cost function: A general formulation that combines the objective and loss function.

Now, the 1st link states that the hinge loss is max(0, m + E(W, Yi, Xi) - E(W, Ȳi, Xi)), where Ȳi is the most offending incorrect answer, i.e. it is a function of the energy term. Does that mean that the energy function of the SVM is 1 - y(wx + b)? Are energy functions part of a loss function? And are the loss and objective functions part of the cost function?

A concise summary of the 4 terms would immensely help my understanding. Also, do correct me if my understanding is wrong. The terms sound so confusing. Thanks!

1 Answer


The objective function is, as the name suggests, the objective of the optimization. It can be something we want to minimize (like a cost function) or something we want to maximize (like a likelihood). In general, it is any function that measures how good our current solution is, usually by returning a real number.
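For instance, here is a minimal NumPy sketch (the weight vector is made up purely for illustration) of an objective as a function that returns a real-number score for a candidate solution, using the ||w||^2 term from the question:

    import numpy as np

    # An objective function maps a candidate solution (here, a weight
    # vector w) to a single real number that scores it.
    def objective(w):
        return np.dot(w, w)  # ||w||^2: smaller is better when minimizing

    w = np.array([0.5, -1.0, 2.0])
    print(objective(w))  # 5.25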

Regarding your definition of the loss function as "a penalty between prediction and label which is also equivalent to the regularization term", with the hinge loss in SVM as an example:

The loss is not equivalent to regularization, in any sense. The loss function is a penalty between a model and the truth. This can be a predicted class-conditional distribution vs. the true label, or a data distribution vs. an empirical sample, and many more.
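To make that concrete, here is a minimal sketch of the hinge loss as exactly such a penalty between a prediction and the truth; the linear scorer f(x) = w.x + b is an assumption made for the example, matching the SVM setting in the question:

    import numpy as np

    def hinge_loss(w, b, x, y):
        # Per-sample hinge loss max(0, 1 - y * f(x)) for a linear
        # scorer f(x) = w.x + b and a label y in {-1, +1}.
        margin = y * (np.dot(w, x) + b)
        return max(0.0, 1.0 - margin)

    w, b = np.array([1.0, -2.0]), 0.5
    x = np.array([2.0, 1.0])
    print(hinge_loss(w, b, x, 1))   # 0.5: correct side, but inside the margin
    print(hinge_loss(w, b, x, -1))  # 1.5: wrong side, larger penalty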

Regularization is a term, a penalty, whose purpose is to penalize a model that is too complex. In ML, or generally in statistics when dealing with estimators, you always try to balance two sources of error: variance (coming from models that are too complex, overfitting) and bias (coming from models that are too simple, or bad learning methods, underfitting). Regularization is a technique for penalizing high-variance models during optimization in order to obtain a less overfitted one.
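As a toy illustration of that penalty (the lambda value here is arbitrary), an L2 regularizer simply charges more for larger, higher-variance weight vectors:

    import numpy as np

    def l2_penalty(w, lam=0.1):
        # Regularization term: penalizes large (high-variance) weights.
        return lam * np.dot(w, w)

    print(l2_penalty(np.array([0.3, -0.2])))    # ~0.013
    print(l2_penalty(np.array([30.0, -45.0])))  # ~292.5, heavily penalized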

A cost function is just an objective function that one minimizes. It can be composed of an agglomeration of loss functions and regularizers.
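Putting it together, here is a minimal sketch of an SVM-style cost function as an agglomeration of the hinge loss and the L2 regularizer (the data and lambda are again made up for illustration):

    import numpy as np

    def svm_cost(w, b, X, Y, lam=0.1):
        # Cost = mean hinge loss over the data + L2 regularization term.
        losses = [max(0.0, 1.0 - y * (np.dot(w, x) + b))
                  for x, y in zip(X, Y)]
        return np.mean(losses) + lam * np.dot(w, w)

    X = np.array([[2.0, 1.0], [-1.0, 0.5]])
    Y = np.array([1, -1])
    print(svm_cost(np.array([1.0, -2.0]), 0.5, X, Y))  # 0.75

This is also where the energy term from your question fits in: as the hinge formula you quoted shows, the loss is a function of the energies E(W, Y, X), so the energy function is a building block inside the loss, and the loss (plus a regularizer) is in turn a building block inside the cost.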
