Authors:
- Nguyen Tu
- Sukhyun Nam
- James Won‐Ki Hong
Abstract
The increasing scale and complexity of network infrastructure pose a significant challenge for network operators and administrators performing network configuration and management tasks. Intent‐based networking has emerged as a solution to simplify the configuration and management of networks. However, one of the most difficult tasks in intent‐based networking is correctly translating high‐level natural language intents into low‐level network configurations. In this paper, we propose a general and effective approach to the network intent translation task using large language models with fine‐tuning, dynamic in‐context learning, and continuous learning. Fine‐tuning allows a pretrained large language model to perform better on a specific task. In‐context learning enables large language models to learn from examples provided along with the actual intent. Continuous learning allows the system to improve over time as new user intents arrive. To demonstrate the feasibility of our approach, we present and evaluate it on two use cases: network formal specification translation and network function virtualization configuration. Our evaluation shows that the proposed approach achieves high intent translation accuracy as well as fast processing times using small large language models that can run on a single consumer‐grade GPU.
Suggested Citation
Nguyen Tu & Sukhyun Nam & James Won‐Ki Hong, 2025.
"Intent‐Based Network Configuration Using Large Language Models,"
International Journal of Network Management, John Wiley & Sons, vol. 35(1), January.
Handle:
RePEc:wly:intnem:v:35:y:2025:i:1:n:e2313
DOI: 10.1002/nem.2313