Project number: 2025C-642
Date: 2025/10/16
Research title: Applying Large Language Model in Designing and Optimizing Self-adaptive systems
| | Affiliation (at the time) | Position | Name |
|---|---|---|---|
| (Principal Investigator) | 高等研究所 | Lecturer | 李 家隆 |
- Summary of research results
- This research focuses on leveraging Large Language Models (LLMs) to automate and streamline the design and optimization of self-adaptive systems. Traditionally, creating effective adaptation rules has been a significant challenge, relying on costly, manual, trial-and-error work by domain experts. In addition, a critical gap remains: modern LLM-driven adaptation methods cannot be directly applied to, or evaluated against, the large body of existing research artifacts (testbeds), which hinders progress. To address these challenges, this year's research produced two primary contributions.

  First, we proposed LLM-ARG, a novel framework for the automatic generation and optimization of adaptation rules. LLM-ARG operates in an iterative loop: it analyzes system execution history to identify performance issues in the current rules, proposes semantic improvements to the underlying state-space design, and automatically generates the revised source code (a sketch of this loop is given below). Through experiments on a server load-balancing scenario, we demonstrated that LLM-ARG achieves performance comparable to human-designed rules while significantly reducing design time and cost.

  Second, to bridge the gap between LLMs and existing research infrastructure, we developed a customizable and well-informed wrapper. This platform addresses two fundamental issues: (1) domain-grounding, by establishing an explicit domain knowledge base that prevents LLMs from generating unrealistic or impractical commands; and (2) well-informing, by creating a runtime observation base that provides LLMs with comprehensive, real-time system data (also sketched below). The wrapper facilitates the integration of LLMs with legacy artifacts, enabling researchers to leverage existing assets when developing and validating new adaptation strategies. We implemented and released this wrapper for three well-known artifacts (Znn.com, DeltaIoT, and TAS), providing a practical tool; this work has been accepted and published at an international conference.
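
The following is a minimal Python sketch of the iterative loop described above, not the released LLM-ARG implementation. All names here (`AdaptationRules`, `run_episode`, `llm_complete`, `target_utility`) are hypothetical placeholders standing in for the framework's actual components, and the LLM is abstracted behind a single text-completion call.

```python
"""Minimal sketch of the LLM-ARG iterative loop (illustrative only)."""
from dataclasses import dataclass


@dataclass
class AdaptationRules:
    state_space_spec: str   # semantic description of the state-space design
    source_code: str        # executable adaptation rules generated from it


def llm_complete(prompt: str) -> str:
    """Placeholder for a call to any LLM text-completion API."""
    raise NotImplementedError


def run_episode(rules: AdaptationRules) -> dict:
    """Placeholder: run the managed system (e.g. a server load balancer)
    under the given rules and return its execution history and metrics."""
    raise NotImplementedError


def llm_arg_loop(rules: AdaptationRules, max_iters: int = 5,
                 target_utility: float = 0.9) -> AdaptationRules:
    """Iteratively analyze, redesign, and regenerate adaptation rules."""
    for _ in range(max_iters):
        history = run_episode(rules)                        # 1. observe
        if history.get("utility", 0.0) >= target_utility:   # stop if good enough
            break
        issues = llm_complete(                              # 2. diagnose
            "Given this execution history and the current rules, "
            f"identify performance issues.\nHistory: {history}\n"
            f"Rules: {rules.source_code}")
        rules.state_space_spec = llm_complete(              # 3. redesign
            "Propose semantic improvements to this state-space design "
            f"to address the issues.\nIssues: {issues}\n"
            f"Current design: {rules.state_space_spec}")
        rules.source_code = llm_complete(                   # 4. regenerate
            "Generate adaptation-rule source code implementing this "
            f"state-space design:\n{rules.state_space_spec}")
    return rules
```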
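
Likewise, the wrapper's two mechanisms can be pictured with the sketch below. The class `ArtifactWrapper`, its methods, and the example commands are illustrative assumptions and do not reflect the published interface for Znn.com, DeltaIoT, or TAS.

```python
"""Illustrative sketch of the wrapper's domain-grounding and well-informing
mechanisms, assuming a generic legacy artifact with an effector callback."""
from typing import Callable


class ArtifactWrapper:
    def __init__(self, execute_effector: Callable[[str], None]):
        # Domain knowledge base: the explicit set of commands the artifact
        # actually supports, with their valid parameters (example entries).
        self.domain_knowledge = {
            "set_fidelity": {"levels": ["low", "medium", "high"]},
            "add_server":   {"max_servers": 4},
        }
        # Runtime observation base: continuously updated system data
        # that the LLM can consult before deciding.
        self.observations: dict = {}
        self._execute = execute_effector

    def record(self, metric: str, value) -> None:
        """Well-informing: push fresh runtime data into the observation base."""
        self.observations[metric] = value

    def describe(self) -> str:
        """Build the grounded context handed to the LLM: what the system
        can do (domain knowledge) and how it is doing (observations)."""
        return (f"Available commands: {self.domain_knowledge}\n"
                f"Current state: {self.observations}")

    def apply(self, command: str) -> bool:
        """Domain-grounding: reject LLM output naming a command the
        artifact does not support, instead of forwarding it blindly."""
        name = command.split("(")[0].strip()
        if name not in self.domain_knowledge:
            return False          # unrealistic/impractical command filtered out
        self._execute(command)
        return True
```

In this sketch, the LLM would be prompted with `describe()` so that its decisions are grounded in both the supported command set and live observations, and any command it proposes is only forwarded to the legacy artifact if `apply()` accepts it.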