Author: |
Fan, Qilin, Li, Xiuhua, Li, Jian, He, Qiang, Wang, Kai, Wen, Junhao |
Year of Publication: |
2020 |
Subject: |
|
Document Type: |
Working Paper |
Description: |
As ubiquitous and personalized services proliferate, massive numbers of mobile devices generate an increasingly large volume of network traffic. As a result, content caching is gradually extending to network edges to provide low-latency services, improve quality of service, and reduce redundant data traffic. Compared to conventional content delivery networks, caches in edge networks have smaller sizes and usually have to accommodate more bursty requests. In this paper, we propose an evolving learning-based content caching policy for edge networks, named PA-Cache. It adaptively learns time-varying content popularity and determines which contents should be replaced when the cache is full. Unlike conventional deep neural networks (DNNs), which learn a fine-tuned but possibly outdated or biased prediction model from the entire training dataset at high computational cost, PA-Cache weighs a large set of content features and trains a multi-layer recurrent neural network from shallow to deep as more requests arrive over time. We extensively evaluate the performance of our proposed PA-Cache on real-world traces from a large online video-on-demand service provider. The results show that PA-Cache outperforms existing popular caching algorithms and approximates the optimal algorithm with only a 3.8% performance gap when the cache percentage is 1.0%. PA-Cache also significantly reduces the computational cost compared to conventional DNN-based approaches. |
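The core idea described above — evict the cached content with the lowest predicted popularity when the cache is full — can be sketched as follows. This is a minimal illustration, not the paper's PA-Cache: the `PopularityCache` class and its recency-count popularity proxy are hypothetical stand-ins for the learned recurrent-network predictor.

```python
# Illustrative sketch of popularity-based cache eviction (assumption:
# a simple request-count proxy replaces PA-Cache's learned RNN predictor).
from collections import defaultdict

class PopularityCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}                    # content_id -> cached payload
        self.requests = defaultdict(int)   # crude popularity estimate

    def request(self, content_id, payload=None):
        """Serve a request; return True on a cache hit, False on a miss."""
        self.requests[content_id] += 1
        if content_id in self.store:
            return True
        # Cache miss: if full, evict the entry with the lowest
        # estimated popularity, then admit the new content.
        if len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda c: self.requests[c])
            del self.store[victim]
        self.store[content_id] = payload
        return False
```

In PA-Cache the popularity estimate would instead come from a recurrent neural network over content features, grown from shallow to deep as requests accumulate; the eviction loop itself stays the same.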
Database: |
arXiv |
External Link: |
|