An Energy-Efficient Scheduling and Routing Protocol Based on Q-Learning for WSN

Authors

  • Nour Al-Huda Salih Abd, Al-Furat Al-Awsat Technical University
  • Asaad S. Daghal, Al-Furat Al-Awsat Technical University

DOI:

https://doi.org/10.47604/ajcet.3541

Keywords:

WSNs, RL, K-Means Clustering, Sleep Scheduling, PDR, Network Lifetime

Abstract

Purpose: This article introduces an energy-efficient routing protocol for wireless sensor networks (WSNs) that integrates a dynamic K-means clustering algorithm with Q-Learning and adaptive sleep scheduling. The proposed model aims to extend the network’s lifetime, reduce energy consumption, and maintain high data delivery reliability on resource-constrained nodes.

Methodology: Each sensor node autonomously makes optimal forwarding decisions based on local parameters such as residual energy, distance, hop count, link quality, and variation in the sensed data. To increase adaptability, the network periodically re-forms its clusters according to node energy and position. In addition, sensor nodes enter sleep mode when no significant change in the sensed data is detected, reducing idle communication.
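
The abstract describes the forwarding step only at a high level. As a rough illustration, a Q-learning next-hop selection over the local parameters listed above might look like the sketch below; the reward weights, the epsilon-greedy policy, and names such as residual_energy, link_quality, and choose_next_hop are assumptions made for illustration, not the authors' published design.

```python
import random

# Illustrative sketch only: the reward weights and feature names below are
# assumptions, not the parameters reported in the paper.
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1   # learning rate, discount factor, exploration rate

def reward(neighbor):
    """Favor next hops with high residual energy and link quality,
    and low distance to the sink and hop count (all normalized to 0..1)."""
    return (0.4 * neighbor["residual_energy"]
            - 0.3 * neighbor["distance"]
            - 0.2 * neighbor["hops"]
            + 0.1 * neighbor["link_quality"])

def choose_next_hop(q_table, node_id, neighbors):
    """Epsilon-greedy choice among the node's current neighbors."""
    if random.random() < EPSILON:
        return random.choice(list(neighbors))
    return max(neighbors, key=lambda n: q_table.get((node_id, n), 0.0))

def update_q(q_table, node_id, next_hop, neighbor_state, next_hop_neighbors):
    """Standard one-step Q-learning update after forwarding via next_hop."""
    best_future = max((q_table.get((next_hop, n), 0.0)
                       for n in next_hop_neighbors), default=0.0)
    old = q_table.get((node_id, next_hop), 0.0)
    q_table[(node_id, next_hop)] = old + ALPHA * (
        reward(neighbor_state) + GAMMA * best_future - old)
```

Under this kind of reward shaping, nodes that repeatedly forward through low-energy or distant neighbors accumulate lower Q-values for those hops, which is consistent with the adaptive forwarding behaviour described in the abstract.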

Findings: The model was evaluated under varying network densities and simulation settings across 25 scenarios. The best-performing scenario achieved a packet delivery ratio (PDR) of 94.73 %, delayed First Node Death (FND) to 5080.1 episodes, and reduced the average energy consumption by up to 0.0111189 J per episode.
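
For reference, the reported quantities are standard WSN simulation metrics; a minimal sketch of how they are typically computed is shown below, assuming per-episode counters logged by the simulator (the function and variable names are illustrative, not taken from the paper).

```python
def packet_delivery_ratio(received_at_sink, sent_by_nodes):
    """PDR (%): packets delivered to the sink over packets transmitted."""
    return 100.0 * received_at_sink / sent_by_nodes

def first_node_death(energy_per_node_per_episode):
    """FND: the first episode in which any node's residual energy reaches zero."""
    for episode, energies in enumerate(energy_per_node_per_episode):
        if min(energies) <= 0.0:
            return episode
    return None  # no node died during the simulation

def average_energy_per_episode(total_energy_consumed, num_episodes):
    """Average energy consumption per episode, in joules."""
    return total_energy_consumed / num_episodes
```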

Unique Contribution to Theory, Practice, and Policy: Compared with standard protocols such as LEACH and RLBEEP, the proposed method outperforms them across all performance metrics. These results demonstrate the effectiveness of combining reinforcement learning with adaptive clustering and transmission control to achieve durable and intelligent WSN operation.


References

Anastasi, G., Conti, M., Di Francesco, M., & Passarella, A. (2009). Energy conservation in wireless sensor networks: A survey. Ad Hoc Networks, 7(3), 537-568.

Barto, A. G. (2021). Reinforcement learning: An introduction, by Richard S. Sutton. SIAM Review, 6(2), 423.

Behera, T. M., Samal, U. C., Mohapatra, S. K., Khan, M. S., Appasani, B., Bizon, N., & Thounthong, P. (2022). Energy-efficient routing protocols for wireless sensor networks: Architectures, strategies, and performance. Electronics, 11(15), 2282.

Boyan, J., & Littman, M. (1993). Packet routing in dynamically changing networks: A reinforcement learning approach. Advances in neural information processing systems, 6.

Chandel, A., Chouhan, V. S., & Sharma, S. (2020). A survey on routing protocols for wireless sensor networks. In Advances in Information Communication Technology and Computing: Proceedings of AICTC 2019 (pp. 143-164). Singapore: Springer Singapore.

Donta, P. K., Amgoth, T., & Annavarapu, C. S. R. (2022). Delay-aware data fusion in duty-cycled wireless sensor networks: A Q-learning approach. Sustainable Computing: Informatics and Systems, 33, 100642.

Guo, W., Yan, C., & Lu, T. (2019). Optimizing the lifetime of wireless sensor networks via reinforcement-learning-based routing. International Journal of Distributed Sensor Networks, 15(2), 1550147719833541.

Heinzelman, W. R., Chandrakasan, A., & Balakrishnan, H. (2000, January). Energy-efficient communication protocol for wireless microsensor networks. In Proceedings of the 33rd Annual Hawaii International Conference on System Sciences (10 pp.). IEEE.

Hu, T., & Fei, Y. (2010). QELAR: A machine-learning-based adaptive routing protocol for energy-efficient and lifetime-extended underwater sensor networks. IEEE Transactions on Mobile Computing, 9(6), 796-809.

Kandris, D., Nakas, C., Vomvas, D., & Koulouras, G. (2020). Applications of wireless sensor networks: An up-to-date survey. Applied System Innovation, 3(1), 14.

Kiani, F., Amiri, E., Zamani, M., Khodadadi, T., & Abdul Manaf, A. (2015). Efficient intelligent energy routing protocol in wireless sensor networks. International Journal of Distributed Sensor Networks, 11(3), 618072.

Pateria, S., Subagdja, B., Tan, A. H., & Quek, C. (2021). Hierarchical reinforcement learning: A comprehensive survey. ACM Computing Surveys (CSUR), 54(5), 1-35.

Polastre, J., Szewczyk, R., & Culler, D. (2005, April). Telos: Enabling ultra-low power wireless research. In Fourth International Symposium on Information Processing in Sensor Networks (IPSN 2005) (pp. 364-369). IEEE.

Raghunandan, K. (2022). Introduction to wireless communications and networks: A practical perspective. Springer Nature.

Raghunathan, V., Schurgers, C., Park, S., & Srivastava, M. B. (2002). Energy-aware wireless microsensor networks. IEEE Signal Processing Magazine, 19(2), 40-50.

Renold, A. P., & Chandrakala, S. (2017). MRL-SCSO: multi-agent reinforcement learning-based self-configuration and self-optimization protocol for unattended wireless sensor networks. Wireless Personal Communications, 96(4), 5061-5079.

Saranya, V., Shankar, S., & Kanagachidambaresan, G. R. (2018). Energy efficient clustering scheme (EECS) for wireless sensor network with mobile sink. Wireless Personal Communications, 100(4), 1553-1567.

Shafiq, M., Ashraf, H., Ullah, A., & Tahira, S. (2020). Systematic literature review on energy efficient routing schemes in WSN–a survey. Mobile Networks and Applications, 25(3), 882-895.

Taheri, H., Neamatollahi, P., Naghibzadeh, M., & Yaghmaee, M. H. (2010, December). Improving on HEED protocol of wireless sensor networks using non-probabilistic approach and fuzzy logic (HEED-NPF). In 2010 5th International Symposium on Telecommunications (pp. 193-198). IEEE.

Younis, O., & Fahmy, S. (2004). HEED: A hybrid, energy-efficient, distributed clustering approach for ad hoc sensor networks. IEEE Transactions on Mobile Computing, 3(4), 366-379.

Published

2025-10-17

How to Cite

Abd, N., & Daghal, A. (2025). An Energy-Efficient Scheduling and Routing Protocol Based on Q-Learning for WSN. Asian Journal of Computing and Engineering Technology, 6(1), 48–62. https://doi.org/10.47604/ajcet.3541

Section

Articles