Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
8-2019
Abstract
Gradient Boosted Decision Trees (GBDT) is a highly successful ensemble learning algorithm widely used across a variety of applications. Recently, several variants of GBDT training algorithms and implementations have been designed and heavily optimized in popular open-source toolkits, including XGBoost, LightGBM, and CatBoost. In this paper, we show that both the accuracy and the efficiency of GBDT can be further enhanced by using more complex base learners. Specifically, we extend gradient boosting to use piecewise linear regression trees (PL Trees), instead of piecewise constant regression trees, as base learners. We show that PL Trees can accelerate the convergence of GBDT and improve its accuracy. We also propose optimization tricks that substantially reduce the training time of PL Trees with little sacrifice of accuracy. Moreover, we propose several implementation techniques to speed up our algorithm on modern computer architectures with powerful Single Instruction Multiple Data (SIMD) parallelism. Experimental results show that GBDT with PL Trees provides very competitive test accuracy with comparable or less training time.
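To make the core idea concrete, below is a minimal sketch, not the authors' implementation, of gradient boosting for squared loss in which each base learner is a shallow tree whose leaves hold linear models instead of constants. It omits the paper's optimization tricks and SIMD techniques entirely, and uses scikit-learn's DecisionTreeRegressor and LinearRegression as stand-in components; the function names and hyperparameters (n_rounds, learning_rate, max_depth) are illustrative assumptions, not taken from the paper.

```python
# Toy sketch of gradient boosting with piecewise linear leaves.
# Assumption: squared loss, so each round fits the residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

def fit_pl_boosting(X, y, n_rounds=50, learning_rate=0.1, max_depth=3):
    base = y.mean()                     # initial constant prediction
    pred = np.full(len(y), base)
    stages = []
    for _ in range(n_rounds):
        residual = y - pred             # negative gradient of squared loss
        # A shallow tree partitions the feature space into leaf regions.
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        leaf_ids = tree.apply(X)
        # Replace each leaf's constant with a linear fit on that leaf's
        # samples, turning the piecewise constant tree into a PL Tree.
        leaf_models = {}
        stage_pred = np.zeros(len(y))
        for leaf in np.unique(leaf_ids):
            mask = leaf_ids == leaf
            lm = LinearRegression().fit(X[mask], residual[mask])
            leaf_models[leaf] = lm
            stage_pred[mask] = lm.predict(X[mask])
        pred += learning_rate * stage_pred
        stages.append((tree, leaf_models))
    return base, stages, learning_rate

def predict_pl_boosting(model, X):
    base, stages, learning_rate = model
    pred = np.full(X.shape[0], base)
    for tree, leaf_models in stages:
        leaf_ids = tree.apply(X)
        for leaf, lm in leaf_models.items():
            mask = leaf_ids == leaf
            if mask.any():
                pred[mask] += learning_rate * lm.predict(X[mask])
    return pred
```

Fitting a leaf-local linear model in place of each leaf constant is what changes the additive model from piecewise constant to piecewise linear; the paper's contribution is doing this accurately and efficiently, which this toy loop does not attempt.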
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering; Intelligent Systems and Optimization
Publication
Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI 2019), Macao, August 10-16
First Page
1
Last Page
9
Publisher
International Joint Conferences on Artificial Intelligence
City or Country
Macao, China
Citation
SHI, Yu; LI, Jian; and LI, Zhize.
Gradient boosting with piece-wise linear regression trees. (2019). Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI 2019), Macao, August 10-16. 1-9.
Available at: https://ink.library.smu.edu.sg/sis_research/8675
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://arxiv.org/abs/1802.05640