A gradient boosting tree is an iterative decision tree algorithm widely used for classification and regression. The model is composed of multiple decision trees. Unlike a random forest, which obtains its final result by voting over the outputs of many independently trained trees, a gradient boosting tree learns from the conclusions of all previous trees: at each iteration it builds a new weak learner to compensate for the shortcomings of the current model, which generally makes it more accurate. Note, however, that gradient boosting trees are sensitive to outliers, and the classification described here supports binary classification only.
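To make the contrast with voting concrete, the following sketch (an illustrative toy, not the platform's implementation) boosts shallow regression stumps by hand under a squared loss: each new tree fits the residual error left by the ensemble so far, so every iteration corrects the previous model's mistakes.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy binary data: the label depends only on the sign of x.
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = (X[:, 0] > 0).astype(float)

# Manual gradient boosting with squared loss, for illustration:
# each shallow tree is fitted to the residual of the current ensemble.
learning_rate = 0.5
pred = np.full_like(y, y.mean())  # initial constant model
for _ in range(10):
    residual = y - pred                       # negative gradient of squared loss
    stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
    pred += learning_rate * stump.predict(X)  # correct previous shortcomings

accuracy = ((pred > 0.5) == (y > 0.5)).mean()
```

Each stump alone is a weak learner, but the sequence of residual corrections drives the ensemble toward the true labels.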

This method runs the training procedure of the gradient boosting tree classification algorithm: a model is fitted according to the characteristics of the data, and the resulting model is then used for prediction.
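The train-then-predict workflow can be sketched with scikit-learn's `GradientBoostingClassifier` as a stand-in; the platform's actual API and parameter names may differ, and the dataset here is synthetic.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification dataset (placeholder for real data).
X, y = make_classification(n_samples=300, n_features=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Training procedure: fit a gradient boosting classifier to the data.
model = GradientBoostingClassifier(
    n_estimators=100,   # number of boosting iterations (trees)
    learning_rate=0.1,  # contribution of each tree
    max_depth=3,        # depth of each weak learner
    random_state=42,
)
model.fit(X_train, y_train)

# The fitted model is then used for prediction on unseen data.
predictions = model.predict(X_test)
test_accuracy = model.score(X_test, y_test)
```

The hyperparameter values shown are common defaults chosen for illustration, not values prescribed by this document.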


When creating a gradient boosting classification training task, you need to set the following parameters:


After the training task is executed, the following result parameters are output: