The deployment of 5G networks supports a large number of devices and requires high data rates, at the cost of substantial energy consumption. The need to manage this energy efficiently is the principal motivation for this work, which focuses on improving the energy efficiency of a 5G network using machine learning. A 5G production dataset generated with G-NetTrack Pro was analyzed using the Python programming language. Using the key features identified by the significance indicator, the results showed that, to avoid over-fitting and achieve optimal model performance, the number of estimators should not exceed 25 and the maximum tree depth of the gradient boosting model should not exceed 9. Five algorithms were developed: random forest, gradient boosting, XGBoost, lasso, and a ridge stacking ensemble. Among the individual algorithms, XGBoost performed best, with a root mean square error (RMSE) of 1.943 and an R^2 error of 0.114. The ridge stacking ensemble outperformed all the individual algorithms, with an RMSE of 1.931 and an R^2 error of 1.321, R^2 being a measure of how well a regression model fits the data.
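The modeling pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the original G-NetTrack Pro dataset is not available here, so a synthetic regression dataset stands in for it, and the XGBoost base learner is shown as a commented-out line since it requires the third-party `xgboost` package. The hyperparameter caps (25 estimators, maximum depth 9) follow the finding stated in the abstract.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import (RandomForestRegressor, GradientBoostingRegressor,
                              StackingRegressor)
from sklearn.linear_model import Lasso, Ridge
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 5G production dataset (hypothetical features/target).
X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Base learners, capped per the abstract: n_estimators <= 25, max_depth <= 9.
base_learners = [
    ("rf", RandomForestRegressor(n_estimators=25, max_depth=9, random_state=0)),
    ("gb", GradientBoostingRegressor(n_estimators=25, max_depth=9, random_state=0)),
    ("lasso", Lasso(alpha=0.1)),
    # ("xgb", xgboost.XGBRegressor(n_estimators=25, max_depth=9)),  # if installed
]

# Ridge stacking ensemble: a ridge regressor combines the base predictions.
stack = StackingRegressor(estimators=base_learners, final_estimator=Ridge())
stack.fit(X_train, y_train)

pred = stack.predict(X_test)
rmse = mean_squared_error(y_test, pred) ** 0.5
r2 = r2_score(y_test, pred)
print(f"RMSE={rmse:.3f}, R^2={r2:.3f}")
```

On the synthetic data the reported RMSE and R^2 will of course differ from the paper's figures; the sketch only shows how the five-model stacking comparison is structured.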
Keywords: 5G, Energy Efficiency, Artificial Intelligence, Machine Learning.
Chinedu Reginald Okpara , Victor E Idigo , Ogbonna P. Ngwu "Improving the Energy Efficiency of a 5G Network: The Machine Learning Approach" Iconic Research And Engineering Journals Volume 6 Issue 8 2023 Page 142-145