Title: Practical federated gradient boosting decision trees
Authors: Li, Q; Wen, Z; He, B
Keywords: cs.LG
Issue Date: 1-Jan-2020
Publisher: Association for the Advancement of Artificial Intelligence
Citation: Li, Q, Wen, Z, He, B (2020-01-01). Practical federated gradient boosting decision trees. AAAI 2020 - 34th AAAI Conference on Artificial Intelligence 34 (04) : 4642-4649. ScholarBank@NUS Repository.
Abstract: Gradient Boosting Decision Trees (GBDTs) have become very successful in recent years, with many awards in machine learning and data mining competitions. There have been several recent studies on how to train GBDTs in the federated learning setting. In this paper, we focus on horizontal federated learning, where data samples with the same features are distributed among multiple parties. However, existing studies are not efficient or effective enough for practical use. They suffer either from inefficiency, due to costly data transformations such as secret sharing and homomorphic encryption, or from low model accuracy, due to differential privacy designs. In this paper, we study a practical federated environment with relaxed privacy constraints. In this environment, a dishonest party might obtain some information about the other parties' data, but it is still impossible for the dishonest party to derive the actual raw data of other parties. Specifically, each party boosts a number of trees by exploiting similarity information based on locality-sensitive hashing. We prove that our framework is secure without exposing the original records to other parties, while the computation overhead in the training process is kept low. Our experimental studies show that, compared with normal training using only the local data of each party, our approach can significantly improve the predictive accuracy, and it achieves accuracy comparable to the original GBDT trained on the data from all parties.
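The similarity information the abstract mentions can be illustrated with a random-hyperplane locality-sensitive hashing sketch. This is a minimal, hypothetical Python illustration of the general LSH idea (records with similar feature vectors get similar hash signatures, so similarity can be estimated without exchanging raw data) — it is not the paper's actual protocol, and the dimensions, hash counts, and names are made up:

```python
import random

random.seed(0)

DIM = 8        # feature dimension (illustrative)
N_HASHES = 32  # number of random hyperplanes per signature (illustrative)

# Random-hyperplane LSH: each signature bit is the sign of a dot product
# with a random Gaussian vector; similar records agree on more bits.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_HASHES)]

def signature(x):
    """Bit i is 1 iff x lies on the positive side of hyperplane i."""
    return [1 if sum(p * v for p, v in zip(plane, x)) >= 0 else 0
            for plane in planes]

def similarity(a, b):
    """Fraction of matching signature bits; estimates angular similarity."""
    sa, sb = signature(a), signature(b)
    return sum(u == v for u, v in zip(sa, sb)) / N_HASHES

record = [random.gauss(0, 1) for _ in range(DIM)]
near = [v + random.gauss(0, 0.05) for v in record]  # a near-duplicate record
far = [random.gauss(0, 1) for _ in range(DIM)]      # an unrelated record

sim_near = similarity(record, near)
sim_far = similarity(record, far)
print(sim_near, sim_far)
```

In a federated setting, parties would exchange only such signatures (plus the gradient-related statistics the protocol requires), never the raw feature vectors, which is what keeps the original records unexposed.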
Source Title: AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
ISBN: 9781577358350
DOI: 10.1609/aaai.v34i04.5895
Appears in Collections:Staff Publications

Files in This Item:
1911.04206v2.pdf (685.84 kB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.