
Deep reinforcement learning and reward shaping based eco-driving control for automated HEVs among signalized intersections

Author

Listed:
  • Li, Jie
  • Wu, Xiaodong
  • Xu, Min
  • Liu, Yonggang

Abstract

In a connected traffic environment with signalized intersections, eco-driving control needs to co-optimize the fuel economy (fuel consumption), driving safety (collisions and red-light violations), and travel efficiency (total travel time) of automated hybrid electric vehicles. We therefore propose a deep reinforcement learning based eco-driving control strategy that co-optimizes fuel economy, driving safety, and travel efficiency. A twin-delayed deep deterministic policy gradient agent is implemented to plan vehicle speed in real time. The multi-objective optimization function of the eco-driving control problem is transformed into the value function of the deep reinforcement learning algorithm by designing fuel, traffic-light, and safety reward functions. Specifically, we design potential-based shaping functions to address the sparse and delayed rewards of the traffic environment, which would otherwise prevent the agent from learning an optimal policy; the shaping steers the agent toward an optimal policy while guaranteeing policy invariance. Finally, the proposed method is verified in a real road traffic environment with signalized intersections. The results demonstrate that the proposed method substantially reduces fuel consumption while satisfying the constraints of traffic lights and safety rules. Meanwhile, the proposed strategy demonstrates the feasibility of real-time application.
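
The potential-based reward shaping mentioned in the abstract can be made concrete with a short sketch. The snippet below is a minimal Python illustration, assuming a toy state of (distance to the stop line, time until the next green phase, current speed); the potential function, its weights, and the discount factor are illustrative assumptions rather than the formulation used by the authors. The key property is that adding the shaping term F(s, s') = gamma * Phi(s') - Phi(s) to the sparse environment reward densifies the learning signal without changing the optimal policy, which is the policy-invariance guarantee referred to above.

    GAMMA = 0.99  # discount factor of the RL agent (assumed value)

    def potential(distance_to_stopline_m, time_to_green_s, speed_mps):
        """Hypothetical potential Phi(s): larger (less negative) when the current
        speed would carry the vehicle to the stop line just as the signal turns
        green, so it can pass without stopping."""
        if time_to_green_s <= 0.0:  # signal already green: favour closing the gap
            return -0.01 * abs(distance_to_stopline_m)
        target_speed = distance_to_stopline_m / max(time_to_green_s, 1e-3)
        return -0.1 * abs(speed_mps - target_speed)

    def shaped_reward(base_reward, state, next_state):
        """Potential-based shaping F(s, s') = GAMMA * Phi(s') - Phi(s), added to
        the environment reward (fuel, safety, and traffic-light terms)."""
        return base_reward + GAMMA * potential(*next_state) - potential(*state)

    # Example transition: the vehicle approaches a red light 120 m ahead
    s_now = (120.0, 8.0, 14.0)    # (distance [m], time to green [s], speed [m/s])
    s_next = (106.0, 7.0, 13.5)
    print(shaped_reward(0.0, s_now, s_next))  # informative signal despite a zero base reward

Because the shaping term telescopes along a trajectory, an agent trained with it shares its optimal policy with one trained on the unshaped reward alone; the shaping only makes the intermediate feedback denser for the learning agent.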

Suggested Citation

  • Li, Jie & Wu, Xiaodong & Xu, Min & Liu, Yonggang, 2022. "Deep reinforcement learning and reward shaping based eco-driving control for automated HEVs among signalized intersections," Energy, Elsevier, vol. 251(C).
  • Handle: RePEc:eee:energy:v:251:y:2022:i:c:s0360544222008271
    DOI: 10.1016/j.energy.2022.123924

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544222008271
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2022.123924?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Xu, Bin & Rathod, Dhruvang & Zhang, Darui & Yebi, Adamu & Zhang, Xueyu & Li, Xiaoya & Filipi, Zoran, 2020. "Parametric study on reinforcement learning optimized energy management strategy for a hybrid electric vehicle," Applied Energy, Elsevier, vol. 259(C).
    2. He, Hongwen & Wang, Yunlong & Han, Ruoyan & Han, Mo & Bai, Yunfei & Liu, Qingwu, 2021. "An improved MPC-based energy management strategy for hybrid vehicles using V2V and V2I communications," Energy, Elsevier, vol. 225(C).
    3. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    4. Xie, Shaobo & Hu, Xiaosong & Xin, Zongke & Brighton, James, 2019. "Pontryagin’s Minimum Principle based model predictive control of energy management for a plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 236(C), pages 893-905.
    5. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    6. Qu, Xiaobo & Yu, Yang & Zhou, Mofan & Lin, Chin-Teng & Wang, Xiangyu, 2020. "Jointly dampening traffic oscillations and improving energy consumption with electric, connected and automated vehicles: A reinforcement learning based approach," Applied Energy, Elsevier, vol. 257(C).
    7. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    8. Guo, Ningyuan & Zhang, Xudong & Zou, Yuan & Guo, Lingxiong & Du, Guodong, 2021. "Real-time predictive energy management of plug-in hybrid electric vehicles for coordination of fuel economy and battery degradation," Energy, Elsevier, vol. 214(C).
    9. Wang, Siyang & Lin, Xianke, 2020. "Eco-driving control of connected and automated hybrid vehicles in mixed driving scenarios," Applied Energy, Elsevier, vol. 271(C).
    10. Guo, Hongqiang & Lu, Silong & Hui, Hongzhong & Bao, Chunjiang & Shangguan, Jinyong, 2019. "Receding horizon control-based energy management for plug-in hybrid electric buses using a predictive model of terminal SOC constraint in consideration of stochastic vehicle mass," Energy, Elsevier, vol. 176(C), pages 292-308.
    11. David Silver & Julian Schrittwieser & Karen Simonyan & Ioannis Antonoglou & Aja Huang & Arthur Guez & Thomas Hubert & Lucas Baker & Matthew Lai & Adrian Bolton & Yutian Chen & Timothy Lillicrap & Fan Hui et al., 2017. "Mastering the game of Go without human knowledge," Nature, Nature, vol. 550(7676), pages 354-359, October.
    12. Alshehry, Atef Saad & Belloumi, Mounir, 2017. "Study of the environmental Kuznets curve for transport carbon dioxide emissions in Saudi Arabia," Renewable and Sustainable Energy Reviews, Elsevier, vol. 75(C), pages 1339-1347.
    13. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charles Beattie et al., 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    14. Cao, Jianfei & He, Hongwen & Wei, Dong, 2021. "Intelligent SOC-consumption allocation of commercial plug-in hybrid electric vehicles in variable scenario," Applied Energy, Elsevier, vol. 281(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Li, Jie & Wu, Xiaodong & Fan, Jiawei & Liu, Yonggang & Xu, Min, 2023. "Overcoming driving challenges in complex urban traffic: A multi-objective eco-driving strategy via safety model based reinforcement learning," Energy, Elsevier, vol. 284(C).
    2. Li, Jie & Fotouhi, Abbas & Pan, Wenjun & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2023. "Deep reinforcement learning-based eco-driving control for connected electric vehicles at signalized intersections considering traffic uncertainties," Energy, Elsevier, vol. 279(C).
    3. Li, Jie & Fotouhi, Abbas & Liu, Yonggang & Zhang, Yuanjian & Chen, Zheng, 2024. "Review on eco-driving control for connected and automated vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    4. Cui, Wei & Cui, Naxin & Li, Tao & Cui, Zhongrui & Du, Yi & Zhang, Chenghui, 2022. "An efficient multi-objective hierarchical energy management strategy for plug-in hybrid electric vehicle in connected scenario," Energy, Elsevier, vol. 257(C).
    5. Chen, Zheng & Wu, Simin & Shen, Shiquan & Liu, Yonggang & Guo, Fengxiang & Zhang, Yuanjian, 2023. "Co-optimization of velocity planning and energy management for autonomous plug-in hybrid electric vehicles in urban driving scenarios," Energy, Elsevier, vol. 263(PF).
    6. Zhang, Chuntao & Huang, Wenhui & Zhou, Xingyu & Lv, Chen & Sun, Chao, 2024. "Expert-demonstration-augmented reinforcement learning for lane-change-aware eco-driving traversing consecutive traffic lights," Energy, Elsevier, vol. 286(C).
    7. Liu, Chunyu & Sheng, Zihao & Chen, Sikai & Shi, Haotian & Ran, Bin, 2023. "Longitudinal control of connected and automated vehicles among signalized intersections in mixed traffic flow with deep reinforcement learning approach," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 629(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    2. Qi, Chunyang & Song, Chuanxue & Xiao, Feng & Song, Shixin, 2022. "Generalization ability of hybrid electric vehicle energy management strategy based on reinforcement learning method," Energy, Elsevier, vol. 250(C).
    3. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    4. Penghui Qiang & Peng Wu & Tao Pan & Huaiquan Zang, 2021. "Real-Time Approximate Equivalent Consumption Minimization Strategy Based on the Single-Shaft Parallel Hybrid Powertrain," Energies, MDPI, vol. 14(23), pages 1-22, November.
    5. Chen, Jiaxin & Shu, Hong & Tang, Xiaolin & Liu, Teng & Wang, Weida, 2022. "Deep reinforcement learning-based multi-objective control of hybrid power system combined with road recognition under time-varying environment," Energy, Elsevier, vol. 239(PC).
    6. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    7. Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).
    8. Zhu, Tao & Wills, Richard G.A. & Lot, Roberto & Ruan, Haijun & Jiang, Zhihao, 2021. "Adaptive energy management of a battery-supercapacitor energy storage system for electric vehicles based on flexible perception and neural network fitting," Applied Energy, Elsevier, vol. 292(C).
    9. Chen, Zheng & Gu, Hongji & Shen, Shiquan & Shen, Jiangwei, 2022. "Energy management strategy for power-split plug-in hybrid electric vehicle based on MPC and double Q-learning," Energy, Elsevier, vol. 245(C).
    10. Zhengyu Yao & Hwan-Sik Yoon & Yang-Ki Hong, 2023. "Control of Hybrid Electric Vehicle Powertrain Using Offline-Online Hybrid Reinforcement Learning," Energies, MDPI, vol. 16(2), pages 1-18, January.
    11. Guo, Ningyuan & Zhang, Xudong & Zou, Yuan & Guo, Lingxiong & Du, Guodong, 2021. "Real-time predictive energy management of plug-in hybrid electric vehicles for coordination of fuel economy and battery degradation," Energy, Elsevier, vol. 214(C).
    12. Marouane Adnane & Ahmed Khoumsi & João Pedro F. Trovão, 2023. "Efficient Management of Energy Consumption of Electric Vehicles Using Machine Learning—A Systematic and Comprehensive Survey," Energies, MDPI, vol. 16(13), pages 1-39, June.
    13. Cui, Wei & Cui, Naxin & Li, Tao & Cui, Zhongrui & Du, Yi & Zhang, Chenghui, 2022. "An efficient multi-objective hierarchical energy management strategy for plug-in hybrid electric vehicle in connected scenario," Energy, Elsevier, vol. 257(C).
    14. Connor Scott & Mominul Ahsan & Alhussein Albarbar, 2021. "Machine Learning Based Vehicle to Grid Strategy for Improving the Energy Performance of Public Buildings," Sustainability, MDPI, vol. 13(7), pages 1-22, April.
    15. Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    16. Chen, Zheng & Wu, Simin & Shen, Shiquan & Liu, Yonggang & Guo, Fengxiang & Zhang, Yuanjian, 2023. "Co-optimization of velocity planning and energy management for autonomous plug-in hybrid electric vehicles in urban driving scenarios," Energy, Elsevier, vol. 263(PF).
    17. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    18. Jiang, Yue & Meng, Hao & Chen, Guanpeng & Yang, Congnan & Xu, Xiaojun & Zhang, Lei & Xu, Haijun, 2022. "Differential-steering based path tracking control and energy-saving torque distribution strategy of 6WID unmanned ground vehicle," Energy, Elsevier, vol. 254(PA).
    19. Ju, Fei & Zhuang, Weichao & Wang, Liangmo & Zhang, Zhe, 2020. "Comparison of four-wheel-drive hybrid powertrain configurations," Energy, Elsevier, vol. 209(C).
    20. Yang, Dongpo & Liu, Tong & Song, Dafeng & Zhang, Xuanming & Zeng, Xiaohua, 2023. "A real time multi-objective optimization Guided-MPC strategy for power-split hybrid electric bus based on velocity prediction," Energy, Elsevier, vol. 276(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:251:y:2022:i:c:s0360544222008271. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.