I know XGBoost needs the first and second order gradients (gradient and hessian) of the objective, but has anybody used "mae" as the objective function?
MAE is awkward as a native XGBoost objective because its second derivative is zero everywhere the loss is differentiable, which breaks the second-order approximation XGBoost relies on. A common workaround is to approximate MAE with the smooth Pseudo-Huber loss and pass it to XGBoost as a custom objective.
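For reference, the gradient and hessian returned in the code below come from the Pseudo-Huber loss with residual $d$ and slope parameter $h$ (the delta):

$$L_h(d) = h^2\left(\sqrt{1 + (d/h)^2} - 1\right),$$

whose first and second derivatives with respect to $d$ are

$$\frac{\partial L_h}{\partial d} = \frac{d}{\sqrt{1 + (d/h)^2}}, \qquad \frac{\partial^2 L_h}{\partial d^2} = \frac{1}{\left(1 + (d/h)^2\right)^{3/2}}.$$

These are exactly the grad and hess arrays the custom objective returns.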
For example:
import numpy as np
import xgboost as xgb

dtrain = xgb.DMatrix(x_train, label=y_train)
dtest = xgb.DMatrix(x_test, label=y_test)
param = {'max_depth': 5}
num_round = 10

def huber_approx_obj(preds, dtrain):
    # Custom objective: return the per-instance gradient and hessian
    # of the Pseudo-Huber loss.
    d = preds - dtrain.get_label()  # with the sklearn API the labels come in directly, not via a DMatrix
    h = 1  # h is the Pseudo-Huber slope parameter (delta)
    scale = 1 + (d / h) ** 2
    scale_sqrt = np.sqrt(scale)
    grad = d / scale_sqrt
    hess = 1 / scale / scale_sqrt
    return grad, hess

bst = xgb.train(param, dtrain, num_round, obj=huber_approx_obj)
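If you prefer the scikit-learn wrapper over the native API, here is a minimal sketch of the same idea. The name huber_approx_obj_sk and the synthetic data are illustrative, and it assumes an xgboost version where XGBRegressor accepts a callable objective taking (y_true, y_pred):

import numpy as np
import xgboost as xgb

# Synthetic stand-in data (the original snippet assumes x_train/y_train exist)
rng = np.random.default_rng(0)
x_train = rng.normal(size=(200, 4))
y_train = x_train @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(size=200)

def huber_approx_obj_sk(y_true, y_pred):
    # Same Pseudo-Huber gradient/hessian as above; the sklearn wrapper
    # hands over the raw label array, so there is no DMatrix to unpack.
    d = y_pred - y_true
    h = 1  # Pseudo-Huber slope parameter (delta)
    scale = 1 + (d / h) ** 2
    scale_sqrt = np.sqrt(scale)
    return d / scale_sqrt, 1 / (scale * scale_sqrt)

model = xgb.XGBRegressor(max_depth=5, n_estimators=10,
                         objective=huber_approx_obj_sk)
model.fit(x_train, y_train)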
Hope this answer helps.