Machine Learning-Based Queueing Time Analysis in XGPON.

Author: Ismail, N. A., Idrus, S. M., Iqbal, F., Zin, A. M., Atan, F., Ali, N.
Subject:
Source: International Journal of Nanoelectronics & Materials; 2021 Special Issue, Vol. 14, p157-163, 7p
Abstract: Machine learning has become a popular approach for predicting future demand. In optical access networks, machine learning can predict bandwidth demand and thereby reduce delays. This paper presents a machine learning approach to learning queueing time in XGPON given the traffic load, number of frames, and packet size. Queueing time contributes to upstream delay, so reducing it improves network performance. The regression value R obtained from the trained ANN is close to 1. The trained ANN also yields a significantly low mean squared error (MSE), which shows that machine learning-based queueing time analysis offers another dimension of delay analysis on top of numerical analysis. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
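
The abstract does not disclose the authors' dataset, ANN architecture, or training procedure, so the following is only a minimal sketch of the workflow it describes: train a neural network to map traffic load, number of frames, and packet size to queueing time, then report R and MSE. All data here are synthetic, and the feature ranges, the ~2.5 Gb/s upstream line rate, and the queueing-time formula are assumptions made for illustration.

```python
# Minimal sketch of the workflow described in the abstract, NOT the authors'
# actual model: all data are synthetic, and the feature ranges, line rate,
# and queueing-time formula below are assumptions made for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000

# Hypothetical inputs named in the abstract (ranges are assumed).
traffic_load = rng.uniform(0.1, 0.9, n)      # offered load, fraction of capacity
num_frames   = rng.integers(1, 50, n)        # queued frames per polling cycle
packet_size  = rng.uniform(64.0, 1518.0, n)  # packet size in bytes

# Synthetic queueing-time target: an assumed load-dependent waiting term plus
# serialization time at XGPON's ~2.5 Gb/s upstream rate, with small noise.
queue_time = (traffic_load / (1.0 - traffic_load)) * 1e-4 \
    + num_frames * packet_size * 8 / 2.5e9 \
    + rng.normal(0.0, 1e-6, n)

X = np.column_stack([traffic_load, num_frames, packet_size])
X_train, X_test, y_train, y_test = train_test_split(X, queue_time, random_state=0)

# Small feed-forward ANN; scaling the inputs helps MLP training converge.
ann = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
ann.fit(X_train, y_train)

# Report the two quality measures the abstract cites: R (correlation between
# predicted and actual queueing time, close to 1 for a good fit) and MSE.
pred = ann.predict(X_test)
R = np.corrcoef(y_test, pred)[0, 1]
print(f"R   = {R:.4f}")
print(f"MSE = {mean_squared_error(y_test, pred):.3e}")
```

Under these assumptions, R near 1 and a low MSE would reproduce the qualitative result the abstract reports; the actual figures depend entirely on the authors' data and network, which are not given here.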