
I know XGBoost needs the first and second gradients of the objective, but has anybody used "mae" (mean absolute error) as the objective function?

1 Answer


XGBoost cannot use MAE directly as a training objective: its first derivative is piecewise constant and its second derivative is zero, so the hessian carries no information. A common workaround is to supply a smooth approximation, the Pseudo-Huber loss, as a custom objective.
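The Pseudo-Huber loss on the residual d, with smoothing parameter \delta, is

L_\delta(d) = \delta^2 \left( \sqrt{1 + (d/\delta)^2} - 1 \right),

and its first and second derivatives, which the custom objective below returns as the gradient and hessian, are

\frac{\partial L}{\partial d} = \frac{d}{\sqrt{1 + (d/\delta)^2}}, \qquad \frac{\partial^2 L}{\partial d^2} = \frac{1}{\left( 1 + (d/\delta)^2 \right)^{3/2}}.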

For example:

import numpy as np
import xgboost as xgb

# x_train, y_train, x_test, y_test are assumed to be your existing data splits
dtrain = xgb.DMatrix(x_train, label=y_train)
dtest = xgb.DMatrix(x_test, label=y_test)

param = {'max_depth': 5}
num_round = 10

def huber_approx_obj(preds, dtrain):
    # Pseudo-Huber loss: a smooth, twice-differentiable approximation of MAE
    d = preds - dtrain.get_label()  # with the sklearn API, the second argument is the plain label array
    h = 1  # h is the delta parameter; smaller h tracks MAE more closely
    scale = 1 + (d / h) ** 2
    scale_sqrt = np.sqrt(scale)
    grad = d / scale_sqrt           # first derivative
    hess = 1 / scale / scale_sqrt   # second derivative
    return grad, hess

bst = xgb.train(param, dtrain, num_round, obj=huber_approx_obj)
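
Because the Pseudo-Huber objective only approximates MAE, it can help to monitor the exact MAE on a hold-out set while training. A minimal sketch, assuming an xgboost version whose xgb.train still accepts the feval argument (newer releases rename it custom_metric):

def mae_eval(preds, dtrain):
    # report the exact MAE alongside the approximate training objective
    return 'mae', float(np.mean(np.abs(preds - dtrain.get_label())))

bst = xgb.train(param, dtrain, num_round, obj=huber_approx_obj,
                evals=[(dtest, 'test')], feval=mae_eval)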


Hope this answer helps. 


