Printed from https://ideas.repec.org/a/eee/energy/v210y2020ics0360544220317059.html

Energy emergency supply chain collaboration optimization with group consensus through reinforcement learning considering non-cooperative behaviours

Author

Listed:
  • Xiang, Liu

Abstract

In response to emergency events that occur frequently around the world, energy emergency supply chain collaboration has become a business imperative, with multiple energy trading organizations responding through group consensus. However, managing an agile energy emergency supply chain collaboration, with minimum energy recovery time under an energy supply shortage driven by urgent events such as earthquakes, faces a difficult task: governing the non-cooperative behaviours within the collaboration, i.e. identifying the irrational causes underlying deviations from neoclassical utility-maximizing economic decisions. This paper develops a smart model for energy emergency supply chain collaboration that bridges the divide between emergency supply chain collaboration optimization with group consensus and reinforcement learning. It establishes collaboration consensus with a scenario-learning algorithm, driven by the satisfaction-level combination of generalising past experience and future scenarios to new local energy supply shortage emergencies, in order to govern non-cooperative irrational behaviours, yielding a response process with minimum energy recovery time, cost and CO2 emissions. Simulation results show that the proposed model lowers running time by 40%, and reduces the minimised cost of energy restoration by 7% and minimised CO2 emissions by 10.8% on average.
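The core mechanism the abstract describes, a reinforcement-learning agent that scores candidate supply allocation plans by the participating organizations' satisfaction levels and penalises plans that break group consensus, can be illustrated with a minimal sketch. Everything below (the demand figures, the `satisfaction` and `reward` functions, the single-state Q-learning loop) is an illustrative assumption, not the paper's actual formulation:

```python
import random

# Hypothetical scenario: three organizations share an insufficient emergency
# supply (24 units against a total demand of 30).
DEMANDS = (10.0, 8.0, 12.0)   # per-agent energy demand in the shortage scenario
SUPPLY = 24.0                  # total emergency supply available

# Candidate allocation plans (fraction of SUPPLY per agent) form the action set.
PLANS = [
    (0.34, 0.33, 0.33),        # equal split
    (0.33, 0.27, 0.40),        # roughly demand-proportional split
    (0.50, 0.25, 0.25),        # split favouring the first agent
]

def satisfaction(plan):
    """Per-agent satisfaction: fraction of demand met, capped at 1."""
    return [min(SUPPLY * f / d, 1.0) for f, d in zip(plan, DEMANDS)]

def reward(plan, threshold=0.75):
    """Group reward: mean satisfaction, with a penalty whenever any agent's
    satisfaction falls below the consensus threshold (a stand-in for a
    non-cooperative defection that would delay recovery)."""
    sats = satisfaction(plan)
    r = sum(sats) / len(sats)
    if any(s < threshold for s in sats):
        r -= 0.5               # defection penalty breaks consensus
    return r

def train(episodes=2000, alpha=0.1, eps=0.2, seed=0):
    """Single-state Q-learning (a bandit) over the allocation plans."""
    rng = random.Random(seed)
    q = [0.0] * len(PLANS)
    for _ in range(episodes):
        if rng.random() < eps:                       # explore
            a = rng.randrange(len(PLANS))
        else:                                        # exploit
            a = max(range(len(PLANS)), key=q.__getitem__)
        r = reward(PLANS[a]) + rng.gauss(0, 0.01)    # noisy observed reward
        q[a] += alpha * (r - q[a])                   # incremental update
    return q

q = train()
best = max(range(len(PLANS)), key=q.__getitem__)
```

Under these assumed numbers the learner settles on the demand-proportional plan, the only one keeping every agent at or above the satisfaction threshold, which is the consensus-seeking behaviour the abstract attributes to the model.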

Suggested Citation

  • Xiang, Liu, 2020. "Energy emergency supply chain collaboration optimization with group consensus through reinforcement learning considering non-cooperative behaviours," Energy, Elsevier, vol. 210(C).
  • Handle: RePEc:eee:energy:v:210:y:2020:i:c:s0360544220317059
    DOI: 10.1016/j.energy.2020.118597

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544220317059
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2020.118597?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yang, Chao & Li, Liang & You, Sixiong & Yan, Bingjie & Du, Xian, 2017. "Cloud computing-based energy optimization control framework for plug-in hybrid electric bus," Energy, Elsevier, vol. 125(C), pages 11-26.
    2. Yang, Lei & Nagy, Zoltan & Goffin, Philippe & Schlueter, Arno, 2015. "Reinforcement learning for optimal control of low exergy buildings," Applied Energy, Elsevier, vol. 156(C), pages 577-586.
    3. Bahrami, Shahab & Amini, M. Hadi, 2018. "A decentralized trading algorithm for an electricity market with generation uncertainty," Applied Energy, Elsevier, vol. 218(C), pages 520-532.
    4. Jiuping Xu & Jiuzhou Dai & Renqiao Rao & Huaidong Xie & Yi Lu, 2016. "Critical Systems Thinking on the Inefficiency in Post-Earthquake Relief: A Practice in Longmen Shan Fault Area," Systemic Practice and Action Research, Springer, vol. 29(5), pages 425-448, October.
    5. Rajeev, T. & Ashok, S., 2015. "Dynamic load-shifting program based on a cloud computing framework to support the integration of renewable energy sources," Applied Energy, Elsevier, vol. 146(C), pages 141-149.
    6. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    7. Kuznetsova, Elizaveta & Li, Yan-Fu & Ruiz, Carlos & Zio, Enrico & Ault, Graham & Bell, Keith, 2013. "Reinforcement learning for microgrid energy management," Energy, Elsevier, vol. 59(C), pages 133-146.
    8. Steven Way & Yufei Yuan, 2014. "Transitioning From Dynamic Decision Support to Context-Aware Multi-Party Coordination: A Case for Emergency Response," Group Decision and Negotiation, Springer, vol. 23(4), pages 649-672, July.
    9. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    10. Li, Yinan & Yang, Wentao & He, Ping & Chen, Chang & Wang, Xiaonan, 2019. "Design and management of a distributed hybrid energy system through smart contract and blockchain," Applied Energy, Elsevier, vol. 248(C), pages 390-405.
    11. Xiang, Liu, 2017. "Energy network dispatch optimization under emergency of local energy shortage with web tool for automatic large group decision-making," Energy, Elsevier, vol. 120(C), pages 740-750.
    12. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    13. Xydas, Erotokritos & Marmaras, Charalampos & Cipcigan, Liana M., 2016. "A multi-agent based scheduling algorithm for adaptive electric vehicles charging," Applied Energy, Elsevier, vol. 177(C), pages 354-365.
    14. Lund, H. & Mathiesen, B.V., 2009. "Energy system analysis of 100% renewable energy systems—The case of Denmark in years 2030 and 2050," Energy, Elsevier, vol. 34(5), pages 524-531.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Jiguang Wang & Yushang Hu & Weihua Qu & Liuxin Ma, 2022. "Research on Emergency Supply Chain Collaboration Based on Tripartite Evolutionary Game," Sustainability, MDPI, vol. 14(19), pages 1-25, September.
    2. Xiang, Liu, 2022. "A large-scale equilibrium model of energy emergency production: Embedding social choice rules into Nash Q-learning automatically achieving consensus of urgent recovery behaviors," Energy, Elsevier, vol. 259(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Paiho, Satu & Kiljander, Jussi & Sarala, Roope & Siikavirta, Hanne & Kilkki, Olli & Bajpai, Arpit & Duchon, Markus & Pahl, Marc-Oliver & Wüstrich, Lars & Lübben, Christian & Kirdan, Erkin & Schindler, 2021. "Towards cross-commodity energy-sharing communities – A review of the market, regulatory, and technical situation," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    2. Correa-Jullian, Camila & López Droguett, Enrique & Cardemil, José Miguel, 2020. "Operation scheduling in a solar thermal system: A reinforcement learning-based framework," Applied Energy, Elsevier, vol. 268(C).
    3. Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
    4. Guo, Yurun & Wang, Shugang & Wang, Jihong & Zhang, Tengfei & Ma, Zhenjun & Jiang, Shuang, 2024. "Key district heating technologies for building energy flexibility: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    5. Gokhale, Gargya & Claessens, Bert & Develder, Chris, 2022. "Physics informed neural networks for control oriented thermal modeling of buildings," Applied Energy, Elsevier, vol. 314(C).
    6. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    7. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    8. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Data-driven battery operation for energy arbitrage using rainbow deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    9. Pinto, Giuseppe & Deltetto, Davide & Capozzoli, Alfonso, 2021. "Data-driven district energy management with surrogate models and deep reinforcement learning," Applied Energy, Elsevier, vol. 304(C).
    10. Zheng, Lingwei & Wu, Hao & Guo, Siqi & Sun, Xinyu, 2023. "Real-time dispatch of an integrated energy system based on multi-stage reinforcement learning with an improved action-choosing strategy," Energy, Elsevier, vol. 277(C).
    11. Charbonnier, Flora & Morstyn, Thomas & McCulloch, Malcolm D., 2022. "Coordination of resources at the edge of the electricity grid: Systematic review and taxonomy," Applied Energy, Elsevier, vol. 318(C).
    12. Juan D. Velásquez & Lorena Cadavid & Carlos J. Franco, 2023. "Intelligence Techniques in Sustainable Energy: Analysis of a Decade of Advances," Energies, MDPI, vol. 16(19), pages 1-45, October.
    13. Wang, Zhe & Hong, Tianzhen, 2020. "Reinforcement learning for building controls: The opportunities and challenges," Applied Energy, Elsevier, vol. 269(C).
    14. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    15. Qicheng Xue & Xin Zhang & Teng Teng & Jibao Zhang & Zhiyuan Feng & Qinyang Lv, 2020. "A Comprehensive Review on Classification, Energy Management Strategy, and Control Algorithm for Hybrid Electric Vehicles," Energies, MDPI, vol. 13(20), pages 1-30, October.
    16. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    17. Dong, Chaoyu & Gao, Qingbin & Xiao, Qian & Yu, Xiaodan & Pekař, Libor & Jia, Hongjie, 2018. "Time-delay stability switching boundary determination for DC microgrid clusters with the distributed control framework," Applied Energy, Elsevier, vol. 228(C), pages 189-204.
    18. Chen, Kaixuan & Lin, Jin & Song, Yonghua, 2019. "Trading strategy optimization for a prosumer in continuous double auction-based peer-to-peer market: A prediction-integration model," Applied Energy, Elsevier, vol. 242(C), pages 1121-1133.
    19. Tushar, Wayes & Yuen, Chau & Saha, Tapan K. & Morstyn, Thomas & Chapman, Archie C. & Alam, M. Jan E. & Hanif, Sarmad & Poor, H. Vincent, 2021. "Peer-to-peer energy systems for connected communities: A review of recent advances and emerging challenges," Applied Energy, Elsevier, vol. 282(PA).
    20. Liu, Teng & Wang, Bo & Yang, Chenglang, 2018. "Online Markov Chain-based energy management for a hybrid tracked vehicle with speedy Q-learning," Energy, Elsevier, vol. 160(C), pages 544-555.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:210:y:2020:i:c:s0360544220317059. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.