VF2Boost: Very Fast Vertical Federated Gradient Boosting for Cross-Enterprise Learning
Authors: Fangcheng Fu, Bin Cui, Jiawei Jiang, Yingxia Shao, Yangyu Tao, Lele Yu, Huanran Xue
Year of publication: 2021
Subject: Speedup; Gradient boosting decision tree; Business; Computer science; Distributed computing; Privacy protection; 020207 software engineering; Cryptography; 02 engineering and technology; Set (abstract data type); 020204 information systems; 0202 electrical engineering, electronic engineering, information engineering; Gradient boosting; Implementation; Protocol (object-oriented programming)
Source: SIGMOD Conference
DOI: 10.1145/3448016.3457241
Description: With the ever-evolving concerns about privacy protection, vertical federated learning (FL), where participants own non-overlapping features for the same set of instances, is becoming a hot topic since it enables multiple enterprises to strengthen their machine learning models collaboratively with privacy guarantees. Nevertheless, to achieve privacy preservation, vertical FL algorithms involve complicated training routines and time-consuming cryptography operations, leading to slow training speed. This paper explores the efficiency of the gradient boosting decision tree (GBDT) algorithm under the vertical FL setting. Specifically, we introduce VF2Boost, a novel and efficient vertical federated GBDT system. Significant solutions are developed to tackle the major bottlenecks. First, to handle the inefficiency caused by frequent mutual waiting during federated training, we propose a concurrent training protocol to reduce the idle periods. Second, to speed up the cryptography operations, we analyze the characteristics of the algorithm and propose customized operations. Empirical results show that our system can be 12.8-18.9 times faster than the existing vertical federated implementations and support much larger datasets.
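For intuition, the sketch below illustrates, in Python, the general vertical federated GBDT split-finding pattern that the abstract refers to: the label-holding ("active") party encrypts per-instance gradients and Hessians, a feature-holding ("passive") party aggregates the ciphertexts into per-bin histograms using only homomorphic addition, and the active party decrypts just the bin sums to score candidate splits. All names here are hypothetical, and ToyAHE is an insecure masking stand-in for a real additively homomorphic scheme such as Paillier; this is a minimal illustration of the setting, not VF2Boost's actual protocol, cryptography, or code.

```python
import random
from dataclasses import dataclass

# --- Toy additively homomorphic "encryption" (insecure stand-in for Paillier) ---
@dataclass
class Ciphertext:
    value: float   # plaintext plus `count` copies of the secret mask
    count: int     # how many plaintexts this ciphertext aggregates

class ToyAHE:
    def __init__(self, seed: int = 0):
        self._mask = random.Random(seed).uniform(1e6, 2e6)  # "secret key"

    def encrypt(self, x: float) -> Ciphertext:
        return Ciphertext(x + self._mask, 1)

    @staticmethod
    def add(a: Ciphertext, b: Ciphertext) -> Ciphertext:
        # Homomorphic addition: a party without the key can sum ciphertexts.
        return Ciphertext(a.value + b.value, a.count + b.count)

    def decrypt(self, c: Ciphertext) -> float:
        return c.value - c.count * self._mask

# --- Passive party: holds a (binned) feature column but no labels ---
def build_encrypted_histogram(enc_grad, enc_hess, bins, n_bins):
    hist_g = [None] * n_bins
    hist_h = [None] * n_bins
    for g, h, b in zip(enc_grad, enc_hess, bins):
        hist_g[b] = g if hist_g[b] is None else ToyAHE.add(hist_g[b], g)
        hist_h[b] = h if hist_h[b] is None else ToyAHE.add(hist_h[b], h)
    return hist_g, hist_h

# --- Active party: holds the labels, decrypts only bin sums and scores splits ---
def best_split(hist_g, hist_h, ahe, lam=1.0):
    G = [ahe.decrypt(c) if c else 0.0 for c in hist_g]
    H = [ahe.decrypt(c) if c else 0.0 for c in hist_h]
    total_g, total_h = sum(G), sum(H)
    best_gain, best_bin = float("-inf"), -1
    gl = hl = 0.0
    for b in range(len(G) - 1):                 # candidate split after bin b
        gl, hl = gl + G[b], hl + H[b]
        gr, hr = total_g - gl, total_h - hl
        gain = gl * gl / (hl + lam) + gr * gr / (hr + lam) - total_g ** 2 / (total_h + lam)
        if gain > best_gain:
            best_gain, best_bin = gain, b
    return best_gain, best_bin

if __name__ == "__main__":
    ahe = ToyAHE()
    grad = [0.3, -0.8, 0.5, -0.1, 0.9, -0.4]    # active party's per-instance gradients
    hess = [1.0] * len(grad)
    enc_g = [ahe.encrypt(g) for g in grad]      # ciphertexts shipped to the passive party
    enc_h = [ahe.encrypt(h) for h in hess]
    bins = [0, 1, 2, 0, 2, 1]                   # passive party's binned feature values
    hg, hh = build_encrypted_histogram(enc_g, enc_h, bins, n_bins=3)
    print(best_split(hg, hh, ahe))              # (gain, split-after bin) for this feature
```

The two contributions highlighted in the abstract, the concurrent training protocol that overlaps the parties' work and the customized cryptography operations, are optimizations on top of this basic exchange and are not depicted in the sketch.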
Database: OpenAIRE
External link: