BOLT: An Automated Deep Learning Framework for Training and Deploying Large-Scale Search and Recommendation Models on Commodity CPU Hardware

Authors: Meisburger, Nicholas; Lakshman, Vihan; Geordie, Benito; Engels, Joshua; Ramos, David Torres; Pranav, Pratik; Coleman, Benjamin; Meisburger, Benjamin; Gupta, Shubh; Adunukota, Yashwanth; Medini, Tharun; Shrivastava, Anshumali
Year of Publication: 2023
Subject:
Document Type: Working Paper
DOI: 10.1145/3583780.3615458
Description: Efficient large-scale neural network training and inference on commodity CPU hardware is of immense practical significance in democratizing deep learning (DL) capabilities. Presently, training massive models with hundreds of millions to billions of parameters requires extensive use of specialized hardware accelerators, such as GPUs, which are accessible only to a limited number of institutions with considerable financial resources. Moreover, there is often an alarming carbon footprint associated with training and deploying these models. In this paper, we take a step towards addressing these challenges by introducing BOLT, a sparse deep learning library for training large-scale search and recommendation models on standard CPU hardware. BOLT provides a flexible, high-level API for constructing models that will be familiar to users of existing popular DL frameworks. By automatically tuning specialized hyperparameters, BOLT also abstracts away the algorithmic details of sparse network training. We evaluate BOLT on a number of information retrieval tasks, including product recommendations, text classification, graph neural networks, and personalization. We find that our proposed system achieves performance competitive with state-of-the-art techniques at a fraction of the cost and energy consumption, with an order of magnitude faster inference time. BOLT has also been successfully deployed by multiple businesses to address critical problems, and we highlight one customer case study in the field of e-commerce.
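
Note: The abstract states that BOLT abstracts away the algorithmic details of sparse network training but does not spell them out. The following is a minimal, self-contained Python sketch of the general idea behind hash-based sparse layers in this line of work (e.g., the authors' earlier SLIDE system): per input, only the neurons that collide with the input under a locality-sensitive hash are computed. All class, function, and parameter names here are illustrative assumptions, not BOLT's actual API.

import numpy as np

class LSHSparseLayer:
    """Toy fully connected layer that activates only the neurons whose weight
    vectors fall in the same SimHash bucket as the input (illustrative sketch,
    not BOLT's implementation)."""

    def __init__(self, in_dim, out_dim, num_bits=10, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, size=(out_dim, in_dim))
        self.b = np.zeros(out_dim)
        # Random hyperplanes shared by inputs and neuron weight vectors (SimHash).
        self.planes = rng.normal(size=(num_bits, in_dim))
        # Pre-bucket every neuron by the sign pattern of its projections.
        codes = (self.W @ self.planes.T > 0) @ (1 << np.arange(num_bits))
        self.buckets = {}
        for neuron, code in enumerate(codes):
            self.buckets.setdefault(int(code), []).append(neuron)

    def forward(self, x):
        # Hash the input with the same planes; only the colliding bucket of
        # neurons is computed, so cost scales with the active set, not out_dim.
        code = int((self.planes @ x > 0) @ (1 << np.arange(self.planes.shape[0])))
        active = np.array(self.buckets.get(code, []), dtype=int)
        out = np.zeros(self.W.shape[0])
        if active.size:
            out[active] = self.W[active] @ x + self.b[active]
        return out, active

layer = LSHSparseLayer(in_dim=64, out_dim=50_000)
x = np.random.default_rng(1).normal(size=64)
y, active = layer.forward(x)
print(f"computed {active.size} of {layer.W.shape[0]} output neurons")

In the sketch, roughly out_dim / 2^num_bits neurons are touched per forward pass; BOLT's actual selection, rehashing schedule, and the "specialized hyperparameters" it auto-tunes are not described in this record.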
Comment: 6 pages, 5 tables, 3 figures. CIKM 2023 (Applied Research Track)
Database: arXiv