Hybrid Learning for Cold-Start-Aware Microservice Scheduling in Dynamic Edge Environments
Authors

Abstract
With the rapid growth of IoT devices and their diverse workloads, container-based microservices deployed at edge nodes have emerged as a lightweight, scalable solution. However, existing microservice scheduling algorithms often assume static resource availability, which is unrealistic when multiple containers are assigned to a single edge node. Moreover, currently popular reinforcement learning (RL) schedulers suffer from container cold-start inefficiencies during early-stage training. In this paper, we propose a hybrid learning framework that combines offline imitation learning (IL) with online Soft Actor-Critic (SAC) optimization to enable cold-start-aware microservice scheduling with dynamic resource allocation. We first formulate a delay-and-energy-aware scheduling problem and construct a rule-based expert to generate demonstration data for behavior cloning. We then design a GRU-enhanced policy network that captures correlations among multiple decisions by separately encoding slow-evolving node states and fast-changing microservice features, and introduce an action selection mechanism that accelerates convergence. Extensive experiments show that our method significantly accelerates convergence and achieves superior final performance. Compared with baselines, our algorithm improves the total objective by 50% and convergence speed by 70%, and demonstrates the highest stability and robustness across various edge configurations.
Journal: IEEE Transactions on Mobile Computing