Author: |
Abdullahi Mohammad, Christos Masouros, Yiannis Andreopoulos |
Language: |
English |
Year of publication: |
2023 |
Subject: |
|
Source: |
IEEE Open Journal of the Communications Society, Vol 4, Pp 1334-1349 (2023) |
Document type: |
article |
ISSN: |
2644-125X |
DOI: |
10.1109/OJCOMS.2023.3285790 |
Description: |
This paper proposes a memory-efficient deep neural network (DNN) framework for symbol-level precoding (SLP). We focus on a DNN with realistic finite-precision weights and adopt an unsupervised deep-learning (DL) based SLP model (SLP-DNet). We apply a stochastic quantization (SQ) technique to obtain its quantized counterpart, SLP-SQDNet. The proposed scheme offers a scalable performance-versus-memory trade-off by quantizing a scalable percentage of the DNN weights, and we explore both binary and ternary quantizations. Our results show that while SLP-DNet provides near-optimal performance, its SQ-quantized versions yield $\sim 3.46\times$ and $\sim 2.64\times$ model compression for the binary- and ternary-based SLP-SQDNets, respectively. We also find that our proposals offer $\sim 20\times$ and $\sim 10\times$ computational complexity reductions compared to the optimization-based SLP and SLP-DNet, respectively. |
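The abstract describes stochastically quantizing a scalable fraction of the DNN weights to binary values. A minimal NumPy sketch of that idea is below; the function names (`stochastic_binarize`, `partial_quantize`), the per-tensor scale `alpha`, and the choice to quantize the smallest-magnitude weights first are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def stochastic_binarize(w, rng=None):
    # Stochastic binary quantization (illustrative): map each weight to
    # {-alpha, +alpha}, where alpha is the mean absolute weight, drawing
    # the sign with probability proportional to the weight's position
    # in [-alpha, alpha]. The paper's exact SQ scheme may differ.
    rng = np.random.default_rng() if rng is None else rng
    alpha = np.mean(np.abs(w))                       # per-tensor scale (assumed)
    p = np.clip((w / alpha + 1.0) / 2.0, 0.0, 1.0)   # P(sign = +1)
    signs = np.where(rng.random(w.shape) < p, 1.0, -1.0)
    return alpha * signs

def partial_quantize(w, fraction, rng=None):
    # Quantize only a scalable fraction of the weights, keeping the rest
    # in full precision, mirroring the scalable performance/memory
    # trade-off described in the abstract. Selecting the smallest-
    # magnitude weights for quantization is a hypothetical heuristic.
    rng = np.random.default_rng() if rng is None else rng
    k = int(fraction * w.size)
    flat = w.ravel().copy()
    idx = np.argsort(np.abs(flat))[:k]               # smallest-magnitude weights
    flat[idx] = stochastic_binarize(flat[idx], rng)
    return flat.reshape(w.shape)

w = np.random.default_rng(0).normal(size=(4, 4))
wq = partial_quantize(w, fraction=0.5)
```

With `fraction=1.0` every weight collapses to one of two values, which is what enables the multi-fold model compression reported for the binary case; ternary quantization would add a zero level.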
Database: |
Directory of Open Access Journals |
External link: |
|