
I know XGBoost needs the first and second gradients of the objective, but has anybody used "mae" as the objective function?

1 Answer


MAE cannot be used directly as an XGBoost objective: its gradient is a constant ±1 and its second derivative is zero everywhere, so XGBoost has no usable Hessian to build trees from. A common workaround is to approximate MAE with the Pseudo-Huber loss, which is smooth and supplies both the gradient and the Hessian.
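For reference, this is the loss the code below implements, written for the residual d = preds − labels and smoothing parameter h (a standard Pseudo-Huber derivation, not anything XGBoost-specific):

L_h(d) = h^2 \left( \sqrt{1 + (d/h)^2} - 1 \right)
L_h'(d) = \frac{d}{\sqrt{1 + (d/h)^2}}            (the grad in the code)
L_h''(d) = \left( 1 + (d/h)^2 \right)^{-3/2}      (the hess in the code)

As h → 0 the loss behaves like h·|d|, i.e. a scaled MAE, but the Hessian also shrinks toward zero and training slows down; as h grows the loss approaches squared error. The h = 1 in the code is just a default worth tuning.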

For example:

import numpy as np
import xgboost as xgb

# x_train, y_train, x_test, y_test are assumed to be prepared beforehand
dtrain = xgb.DMatrix(x_train, label=y_train)
dtest = xgb.DMatrix(x_test, label=y_test)

param = {'max_depth': 5}
num_round = 10

def huber_approx_obj(preds, dtrain):
    """Pseudo-Huber objective: returns per-sample gradient and Hessian."""
    d = preds - dtrain.get_label()  # residuals; for the sklearn API the second argument is the label array itself, so drop .get_label()
    h = 1  # h is the delta (smoothing) parameter of the Pseudo-Huber loss
    scale = 1 + (d / h) ** 2
    scale_sqrt = np.sqrt(scale)
    grad = d / scale_sqrt           # first derivative of the loss w.r.t. preds
    hess = 1 / scale / scale_sqrt   # second derivative
    return grad, hess

bst = xgb.train(param, dtrain, num_round, obj=huber_approx_obj)
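Since the Pseudo-Huber objective is only a surrogate, it is worth measuring the actual MAE on held-out data. A minimal sanity check (a sketch, assuming x_test and y_test from the snippet above are plain numpy arrays):

preds = bst.predict(dtest)
test_mae = np.mean(np.abs(preds - y_test))  # actual mean absolute error on the test set
print("test MAE: %.4f" % test_mae)

You can also monitor MAE during training by adding 'eval_metric': 'mae' to param and passing evals=[(dtest, 'test')] to xgb.train; XGBoost still computes its built-in metrics when the objective is custom.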


Hope this answer helps. 
