I am new to GBM and xgboost, and am currently using xgboost_0.6-2 in R. The modeling runs well with the standard objective "objective" = "reg:linear". After reading this NIH paper, I wanted to run a quantile regression using a custom objective function, but the cross-validation stops after exactly 11 iterations and the evaluation metric never changes.
I simply switched out the 'pred' statement following the GitHub xgboost custom objective demo, but I am afraid it is more complicated than that, and I cannot find any other examples of using a custom objective function. Do I need to take it a step further and derive the 'grad' and 'hess' terms myself?
Or could it be a problem with xgboost (doubtful)?
    qntregobj <- function(preds, dtrain) {
      qr_alpha = .5
      labels <- getinfo(dtrain, "label")
      preds <- ifelse( preds - labels >= 0
                     , (1 - qr_alpha) * abs(preds - labels)
                     , qr_alpha * abs(preds - labels) )
      grad <- preds - labels
      hess <- preds * (1 - preds)
      return(list(grad = grad, hess = hess))
    }

    step1.param <- list( "objective"   = qntregobj
                       , "booster"     = "gbtree"
                       , "eval.metric" = "rmse"
                       , "nthread"     = 16 )

    set.seed(123)
    step1.xgbTreeCV <- xgb.cv(param = step1.param
                              , data = xgb.train
                              , nrounds = nrounds
                              , nfold = 10
                              , scale_pos_weight = 1
                              , stratified = T
                              , watchlist = watchlist
                              , verbose = F
                              , early_stopping_rounds = 10
                              , maximize = FALSE
                              ## set default parameters here - baseline
                              , max_depth = 6
                              , min_child_weight = 1
                              , gamma = 0
                              , subsample = 1
                              , colsample_bytree = 1
                              , lambda = 1
                              , alpha = 0
                              , eta = 0.3
    )
    print(Sys.time() - start.time)

    step1.dat <- step1.xgbTreeCV$evaluation_log
    step1.dat

Which produces:
        iter train_rmse_mean train_rmse_std test_rmse_mean test_rmse_std nround
     1:    1        122.6362     0.04268346       122.6354     0.3849658      1
     2:    2        122.6362     0.04268346       122.6354     0.3849658      2
     3:    3        122.6362     0.04268346       122.6354     0.3849658      3
     4:    4        122.6362     0.04268346       122.6354     0.3849658      4
     5:    5        122.6362     0.04268346       122.6354     0.3849658      5
     6:    6        122.6362     0.04268346       122.6354     0.3849658      6
     7:    7        122.6362     0.04268346       122.6354     0.3849658      7
     8:    8        122.6362     0.04268346       122.6354     0.3849658      8
     9:    9        122.6362     0.04268346       122.6354     0.3849658      9
    10:   10        122.6362     0.04268346       122.6354     0.3849658     10
    11:   11        122.6362     0.04268346       122.6354     0.3849658     11
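In case it clarifies what I mean by "taking it a step further": below is my current guess at what supplying the analytic derivatives of the check (pinball) loss might look like, using the same qr_alpha and the same sign convention (preds - labels) as my code above. The function name qntregobj_guess is just a placeholder, and the constant hess of 1 is my own assumption rather than a real second derivative, since the check loss is piecewise linear and its true second derivative is zero everywhere.

    library(xgboost)  # for getinfo()

    ## My guess at the derivatives of the check loss -- not verified.
    qntregobj_guess <- function(preds, dtrain) {
      qr_alpha <- 0.5
      labels <- getinfo(dtrain, "label")
      err <- preds - labels
      ## loss is (1 - qr_alpha) * err for err >= 0 and -qr_alpha * err otherwise,
      ## so its first derivative w.r.t. preds is a step function of the residual
      grad <- ifelse(err >= 0, 1 - qr_alpha, -qr_alpha)
      ## the true second derivative is 0; using a constant 1 as a placeholder
      hess <- rep(1, length(err))
      list(grad = grad, hess = hess)
    }

Is that the right idea, and if so, is a constant hessian acceptable, or does xgboost need something strictly derived from the loss?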