Printed from https://ideas.repec.org/a/eee/energy/v286y2024ics0360544223028669.html

Expert-demonstration-augmented reinforcement learning for lane-change-aware eco-driving traversing consecutive traffic lights

Author

Listed:
  • Zhang, Chuntao
  • Huang, Wenhui
  • Zhou, Xingyu
  • Lv, Chen
  • Sun, Chao

Abstract

Eco-driving methods incorporating lateral motion exhibit enhanced energy-saving prospects in multi-lane traffic contexts, yet randomly distributed obstructing vehicles and sparse traffic lights make it difficult to assess the long-term value of instantaneous actions, impeding further improvement in energy efficiency. In response to this issue, a deep reinforcement learning (DRL)-based eco-driving method is proposed and augmented with an expert demonstration mechanism. Specifically, a Markov decision process matched to the target eco-driving scenario is systematically constructed, with which the formulated DRL algorithm, parametrized soft actor-critic (PSAC), is trained to realize the integrated optimization of speed planning and lane-changing maneuvers. To improve the training performance of PSAC under sparse rewards associated with traffic lights, an expert eco-driving model and an adaptive sampling approach are incorporated to constitute the expert demonstration mechanism. Simulation results highlight the superior performance of the proposed DRL-based eco-driving method and its training mechanism. Compared with a PSAC trained by pure exploration, the expert demonstration mechanism improves the training efficiency and cumulative rewards of PSAC by about 60% and 21.89%, respectively, in the training phase, while in the test phase a further reduction of 4.23% in fuel consumption is achieved relative to a rule-based benchmark.
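The abstract describes mixing expert demonstrations into the replay data with an adaptive sampling approach, so that PSAC learns under sparse traffic-light rewards. The paper does not spell out the sampling rule here, so the following is a minimal, hypothetical sketch: a replay buffer that draws a batch partly from a fixed expert-demonstration set and partly from agent exploration, with the expert fraction decaying as training proceeds. Class and parameter names (`MixedReplayBuffer`, `initial_expert_frac`, `decay`) are illustrative assumptions, not the authors' implementation.

```python
import random

class MixedReplayBuffer:
    """Replay buffer mixing expert demonstrations with agent transitions.

    Hypothetical sketch: the adaptive sampling approach is assumed to be
    a simple schedule that decays the expert fraction over training.
    """

    def __init__(self, expert_transitions, initial_expert_frac=0.5, decay=0.999):
        self.expert = list(expert_transitions)  # fixed demonstration set
        self.agent = []                         # filled during exploration
        self.expert_frac = initial_expert_frac
        self.decay = decay

    def add(self, transition):
        """Store a transition collected by the learning agent."""
        self.agent.append(transition)

    def sample(self, batch_size):
        """Draw a mixed batch; decay the expert share after each call."""
        n_expert = min(int(round(batch_size * self.expert_frac)), len(self.expert))
        n_agent = min(batch_size - n_expert, len(self.agent))
        batch = random.sample(self.expert, n_expert) + random.sample(self.agent, n_agent)
        self.expert_frac *= self.decay  # shift toward pure exploration
        return batch
```

Early batches are dominated by demonstrations that already traverse the traffic lights, giving the critic dense learning signal; as the agent's own trajectories improve, the schedule hands sampling back to exploration data.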

Suggested Citation

  • Zhang, Chuntao & Huang, Wenhui & Zhou, Xingyu & Lv, Chen & Sun, Chao, 2024. "Expert-demonstration-augmented reinforcement learning for lane-change-aware eco-driving traversing consecutive traffic lights," Energy, Elsevier, vol. 286(C).
  • Handle: RePEc:eee:energy:v:286:y:2024:i:c:s0360544223028669
    DOI: 10.1016/j.energy.2023.129472

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544223028669
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.129472?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Guo, Lingxiong & Zhang, Xudong & Zou, Yuan & Han, Lijin & Du, Guodong & Guo, Ningyuan & Xiang, Changle, 2022. "Co-optimization strategy of unmanned hybrid electric tracked vehicle combining eco-driving and simultaneous energy management," Energy, Elsevier, vol. 246(C).
    2. Tran, Dai-Duong & Vafaeipour, Majid & El Baghdadi, Mohamed & Barrero, Ricardo & Van Mierlo, Joeri & Hegazy, Omar, 2020. "Thorough state-of-the-art analysis of electric and hybrid vehicle powertrains: Topologies and integrated energy management strategies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 119(C).
    3. Sun, Chao & Zhang, Chuntao & Sun, Fengchun & Zhou, Xingyu, 2022. "Stochastic co-optimization of speed planning and powertrain control with dynamic probabilistic constraints for safe and ecological driving," Applied Energy, Elsevier, vol. 325(C).
    4. Li, Jie & Wu, Xiaodong & Xu, Min & Liu, Yonggang, 2022. "Deep reinforcement learning and reward shaping based eco-driving control for automated HEVs among signalized intersections," Energy, Elsevier, vol. 251(C).
    5. Qu, Xiaobo & Yu, Yang & Zhou, Mofan & Lin, Chin-Teng & Wang, Xiangyu, 2020. "Jointly dampening traffic oscillations and improving energy consumption with electric, connected and automated vehicles: A reinforcement learning based approach," Applied Energy, Elsevier, vol. 257(C).
    6. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    7. Huang, Yuhan & Ng, Elvin C.Y. & Zhou, John L. & Surawski, Nic C. & Chan, Edward F.C. & Hong, Guang, 2018. "Eco-driving technology for sustainable road transport: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 93(C), pages 596-609.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Li, Jie & Fotouhi, Abbas & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2024. "Review on eco-driving control for connected and automated vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    2. Chen, Zheng & Wu, Simin & Shen, Shiquan & Liu, Yonggang & Guo, Fengxiang & Zhang, Yuanjian, 2023. "Co-optimization of velocity planning and energy management for autonomous plug-in hybrid electric vehicles in urban driving scenarios," Energy, Elsevier, vol. 263(PF).
    3. Zhang, Hanyu & Du, Lili, 2023. "Platoon-centered control for eco-driving at signalized intersection built upon hybrid MPC system, online learning and distributed optimization part I: Modeling and solution algorithm design," Transportation Research Part B: Methodological, Elsevier, vol. 172(C), pages 174-198.
    4. Zhang, Yahui & Wei, Zeyi & Wang, Zhong & Tian, Yang & Wang, Jizhe & Tian, Zhikun & Xu, Fuguo & Jiao, Xiaohong & Li, Liang & Wen, Guilin, 2024. "Hierarchical eco-driving control strategy for connected automated fuel cell hybrid vehicles and scenario-/hardware-in-the loop validation," Energy, Elsevier, vol. 292(C).
    5. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    6. Wang, Yong & Wu, Yuankai & Tang, Yingjuan & Li, Qin & He, Hongwen, 2023. "Cooperative energy management and eco-driving of plug-in hybrid electric vehicle via multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 332(C).
    7. Xie, Yunkun & Li, Yangyang & Zhao, Zhichao & Dong, Hao & Wang, Shuqian & Liu, Jingping & Guan, Jinhuan & Duan, Xiongbo, 2020. "Microsimulation of electric vehicle energy consumption and driving range," Applied Energy, Elsevier, vol. 267(C).
    8. Shi, Dehua & Liu, Sheng & Cai, Yingfeng & Wang, Shaohua & Li, Haoran & Chen, Long, 2021. "Pontryagin’s minimum principle based fuzzy adaptive energy management for hybrid electric vehicle using real-time traffic information," Applied Energy, Elsevier, vol. 286(C).
    9. Santos, Alberto & Maia, Pedro & Jacob, Rodrigo & Wei, Huang & Callegari, Camila & Oliveira Fiorini, Ana Carolina & Schaeffer, Roberto & Szklo, Alexandre, 2024. "Road conditions and driving patterns on fuel usage: Lessons from an emerging economy," Energy, Elsevier, vol. 295(C).
    10. Yi Zhang & Qiang Guo & Jie Song, 2023. "Internet-Distributed Hardware-in-the-Loop Simulation Platform for Plug-In Fuel Cell Hybrid Vehicles," Energies, MDPI, vol. 16(18), pages 1-17, September.
    11. Yang Wang & Alessandra Boggio-Marzet, 2018. "Evaluation of Eco-Driving Training for Fuel Efficiency and Emissions Reduction According to Road Type," Sustainability, MDPI, vol. 10(11), pages 1-16, October.
    12. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    13. Robaina, Margarita & Neves, Ana, 2021. "Complete decomposition analysis of CO2 emissions intensity in the transport sector in Europe," Research in Transportation Economics, Elsevier, vol. 90(C).
    14. Wojciech Adamski & Krzysztof Brzozowski & Jacek Nowakowski & Tomasz Praszkiewicz & Tomasz Knefel, 2021. "Excess Fuel Consumption Due to Selection of a Lower Than Optimal Gear—Case Study Based on Data Obtained in Real Traffic Conditions," Energies, MDPI, vol. 14(23), pages 1-15, November.
    15. Chen, Z. & Liu, Y. & Ye, M. & Zhang, Y. & Chen, Z. & Li, G., 2021. "A survey on key techniques and development perspectives of equivalent consumption minimisation strategy for hybrid electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    16. Juan Francisco Coloma & Marta García & Gonzalo Fernández & Andrés Monzón, 2021. "Environmental Effects of Eco-Driving on Courier Delivery," Sustainability, MDPI, vol. 13(3), pages 1-21, January.
    17. Chen, Jiaxin & Shu, Hong & Tang, Xiaolin & Liu, Teng & Wang, Weida, 2022. "Deep reinforcement learning-based multi-objective control of hybrid power system combined with road recognition under time-varying environment," Energy, Elsevier, vol. 239(PC).
    18. Li, Jie & Fotouhi, Abbas & Pan, Wenjun & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2023. "Deep reinforcement learning-based eco-driving control for connected electric vehicles at signalized intersections considering traffic uncertainties," Energy, Elsevier, vol. 279(C).
    19. Matthieu Matignon & Toufik Azib & Mehdi Mcharek & Ahmed Chaibet & Adriano Ceschia, 2023. "Real-Time Integrated Energy Management Strategy Applied to Fuel Cell Hybrid Systems," Energies, MDPI, vol. 16(6), pages 1-21, March.
    20. Weiqi Zhou & Nanchi Wu & Qingchao Liu & Chaofeng Pan & Long Chen, 2023. "Research on Ecological Driving Following Strategy Based on Deep Reinforcement Learning," Sustainability, MDPI, vol. 15(18), pages 1-14, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:286:y:2024:i:c:s0360544223028669. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.