A 2-Level Distributed PPO Scheduling Approach For Real-time Heterogeneous Mobile Edge Computing
Publication year: 1404 (Solar Hijri)
Document type: Conference paper
Language: English
Views: 110
The full text of this paper is available as a 6-page PDF.
National scientific document ID:
IOTCONF09_001
Indexing date: 4 Azar 1404 (Solar Hijri)
Abstract:
Mobile Edge Computing (MEC) provides low-latency computation for mobile devices, but efficient task scheduling remains a significant challenge due to user mobility and dynamic resource heterogeneity. Existing Deep Reinforcement Learning (DRL) schedulers often lack a hierarchical structure that can adapt to both local and global system states. To bridge this gap, this paper introduces a novel 2-level hierarchical DRL framework using Proximal Policy Optimization (PPO). At the first level, a lightweight Actor on each client device decides whether to execute a task locally or offload it, guided by a Critic on the nearest edge node for rapid, localized adaptation. At the second level, an Actor on each edge node, guided by a global Critic in the cloud, manages inter-edge load balancing. For inter-edge offloading, a Pareto-optimal selection mechanism is used to choose the destination. Comprehensive simulation results demonstrate that our proposed framework significantly outperforms baseline methods, reducing average task latency by up to 60% and decreasing task failure rates by over 30%, providing a robust and scalable solution for dynamic scheduling in real-world MEC environments.
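To illustrate the Pareto-optimal destination selection mentioned in the abstract, the sketch below shows one possible way to filter candidate edge nodes to a Pareto front and pick a destination. This is a minimal, hypothetical example, not the authors' implementation: the criteria (`est_latency_ms`, `load`), the tie-breaking rule, and all identifiers are assumptions made only for illustration.

```python
# Minimal sketch (assumed, not from the paper): Pareto-optimal selection of an
# inter-edge offloading destination over two hypothetical criteria,
# estimated latency and current load, both to be minimized.
from dataclasses import dataclass
from typing import List


@dataclass
class EdgeCandidate:
    node_id: str
    est_latency_ms: float  # hypothetical predicted transfer + queueing + execution time
    load: float            # hypothetical current utilization in [0, 1]


def dominates(a: EdgeCandidate, b: EdgeCandidate) -> bool:
    """True if `a` is no worse than `b` on every criterion and strictly better on one."""
    no_worse = a.est_latency_ms <= b.est_latency_ms and a.load <= b.load
    strictly_better = a.est_latency_ms < b.est_latency_ms or a.load < b.load
    return no_worse and strictly_better


def pareto_front(candidates: List[EdgeCandidate]) -> List[EdgeCandidate]:
    """Keep only candidates that no other candidate dominates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates)]


def select_destination(candidates: List[EdgeCandidate]) -> EdgeCandidate:
    """Pick a node from the Pareto front; ties broken here by lower latency
    (an illustrative rule, not necessarily the paper's)."""
    return min(pareto_front(candidates), key=lambda c: c.est_latency_ms)


if __name__ == "__main__":
    nodes = [
        EdgeCandidate("edge-1", est_latency_ms=12.0, load=0.7),
        EdgeCandidate("edge-2", est_latency_ms=20.0, load=0.3),
        EdgeCandidate("edge-3", est_latency_ms=25.0, load=0.8),  # dominated by edge-1
    ]
    print(select_destination(nodes).node_id)  # prints "edge-1"
```

In this sketch, dominated nodes (worse on every criterion) are discarded first, so the final choice is always made among trade-off candidates; the level-2 Actor described in the abstract could use such a front as its candidate set for inter-edge load balancing.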
Authors
Armin Mohammadi Ghale
K. N. Toosi University of Technology