Segment routing optimization algorithm fusing deep reinforcement learning and load centrality theory
Affiliation:

1. College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China; 2. School of Computer, Changsha University of Science and Technology, Changsha 410114, China; 3. School of Information Science and Engineering, Yunnan University, Kunming 650500, China

Clc Number: TP393


Abstract:

Combining software defined networking and SR (segment routing) can optimize network performance, but in large-scale dynamic networks, excessive link utilization at key nodes can lead to a surge in queuing delays. To address this, SROD-LC, a segment routing optimization algorithm based on deep reinforcement learning and load centrality theory, is proposed. Load centrality theory is used to quantify the importance of network nodes, identify key nodes, and monitor their link load states. Within a multi-agent reinforcement learning framework, distributed deep reinforcement learning agents are deployed at the key nodes and coordinate routing decisions through a shared reward mechanism, achieving proactive optimization of link loads. At the same time, leveraging the flexibility of SR, segment identifier lists are dynamically adjusted to quickly reroute part of the traffic, reducing local link utilization and avoiding potential congestion. Simulation experiments based on real network topologies show that when the proportion of SR key nodes is in the range of 0.3 to 0.5, the SROD-LC algorithm exhibits significant optimization effects, reducing the network's maximum link utilization by 21% to 35% compared with baseline algorithms.
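
As a rough illustration of the key-node selection step described above, the sketch below ranks nodes by load centrality and keeps the top fraction as SR key nodes. This is not the paper's implementation: the example topology, the 0.4 key-node proportion, and the helper name select_key_nodes are assumptions introduced only for illustration.

# Minimal sketch (assumed, not the paper's code): select SR key nodes by load centrality.
import networkx as nx


def select_key_nodes(graph: nx.Graph, key_fraction: float = 0.4) -> list:
    """Return the top `key_fraction` of nodes ranked by load centrality."""
    # Load centrality: the fraction of shortest paths that pass through a node,
    # computed here with NetworkX's built-in load_centrality.
    centrality = nx.load_centrality(graph, normalized=True)
    ranked = sorted(centrality, key=centrality.get, reverse=True)
    k = max(1, int(round(key_fraction * graph.number_of_nodes())))
    return ranked[:k]


if __name__ == "__main__":
    # Small example graph standing in for a real network topology.
    g = nx.Graph()
    g.add_edges_from([(0, 1), (1, 2), (2, 3), (1, 4), (4, 3), (3, 5)])
    # With a key-node proportion in the 0.3 to 0.5 range studied in the paper,
    # these would be the candidate nodes for deploying the distributed DRL agents.
    print(select_key_nodes(g, key_fraction=0.4))

On the example graph, the returned nodes are those carrying the most shortest paths, i.e. the positions where the abstract places the distributed agents and monitors link load.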

Citation:

CAO Jijun, WU Zongming, TANG Qiang, et al. Segment routing optimization algorithm fusing deep reinforcement learning and load centrality theory[J]. Journal of National University of Defense Technology, 2025, 47(6): 46-59.

History
  • Received: June 16, 2025
  • Online: December 02, 2025