
Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus

Author

Listed:
  • Wu, Yuankai
  • Tan, Huachun
  • Peng, Jiankun
  • Zhang, Hailong
  • He, Hongwen

Abstract

Hybrid electric vehicles offer an immediate solution for emissions reduction and fuel displacement at the current level of technology. Energy management strategies are critical for improving the fuel economy of hybrid electric vehicles. In this paper we propose an energy management strategy for a series-parallel plug-in hybrid electric bus based on deep deterministic policy gradients. Specifically, deep deterministic policy gradients is an actor-critic, model-free reinforcement learning algorithm that can assign the optimal energy split of the bus over continuous spaces. We consider buses driving on a fixed bus line, where the driving cycle is constrained by traffic. Traffic information and the number of passengers are also incorporated into the energy management system. The deep reinforcement learning based energy management agent is trained on a large number of driving cycles generated from traffic simulation. Experiments on the traffic-simulation driving cycles show that the proposed approach outperforms a conventional reinforcement learning approach and performs close to the globally optimal dynamic programming benchmark. Moreover, it generalizes well to standard driving cycles that differ significantly from those it was trained on. We also illustrate some interesting attributes of the learned energy management strategies through visualizations of the actor and critic. The main contribution of this study is to explore the incorporation of traffic information into hybrid electric vehicle energy management through advanced intelligent algorithms.
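For readers unfamiliar with the method, the sketch below illustrates the kind of actor-critic update that deep deterministic policy gradients performs for a continuous control variable such as an energy split. It is a minimal illustration under stated assumptions, not the paper's implementation: the state layout (battery state of charge, speed, acceleration, passenger count, a traffic feature), the single power-split action, the reward shape, and the omission of target networks, exploration noise, and replay-buffer bookkeeping are all simplifications introduced here.

# Minimal actor-critic sketch in the spirit of deep deterministic policy
# gradients (PyTorch). State, action, and reward definitions are illustrative
# assumptions; target networks, exploration noise, and the replay buffer are
# omitted for brevity.
import torch
import torch.nn as nn

STATE_DIM = 5    # assumed state: [SOC, speed, acceleration, passengers, traffic feature]
ACTION_DIM = 1   # assumed action: continuous engine/battery power split in [0, 1]

class Actor(nn.Module):
    """Deterministic policy: maps a driving state to a power-split action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Sigmoid(),  # keeps the split in [0, 1]
        )

    def forward(self, state):
        return self.net(state)

class Critic(nn.Module):
    """Action-value function: estimates Q(state, action)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, state, action):
        return self.net(torch.cat([state, action], dim=-1))

actor, critic = Actor(), Critic()
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
gamma = 0.99  # discount factor

def ddpg_update(state, action, reward, next_state):
    """One gradient step on a batch of transitions sampled from a replay buffer.
    Here the reward is assumed to penalize instantaneous fuel use and battery
    state-of-charge deviation (an illustrative choice, not the paper's cost)."""
    # Critic: regress Q(s, a) toward the one-step bootstrapped target.
    with torch.no_grad():
        target = reward + gamma * critic(next_state, actor(next_state))
    critic_loss = nn.functional.mse_loss(critic(state, action), target)
    critic_opt.zero_grad()
    critic_loss.backward()
    critic_opt.step()
    # Actor: increase the critic's estimated value of the actor's own actions.
    actor_loss = -critic(state, actor(state)).mean()
    actor_opt.zero_grad()
    actor_loss.backward()
    actor_opt.step()

In training, such an agent would step through simulated driving cycles one time step at a time, store (state, action, reward, next state) transitions, and apply an update like the one above to sampled batches.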

Suggested Citation

  • Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
  • Handle: RePEc:eee:appene:v:247:y:2019:i:c:p:454-466
    DOI: 10.1016/j.apenergy.2019.04.021

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S030626191930652X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2019.04.021?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Song, Ziyou & Hofmann, Heath & Li, Jianqiu & Han, Xuebing & Ouyang, Minggao, 2015. "Optimization for a hybrid energy storage system in electric vehicles using dynamic programing approach," Applied Energy, Elsevier, vol. 139(C), pages 151-162.
    2. Hua, Haochen & Qin, Yuchao & Hao, Chuantong & Cao, Junwei, 2019. "Optimal energy management strategies for energy Internet via deep reinforcement learning approach," Applied Energy, Elsevier, vol. 239(C), pages 598-609.
    3. Trovão, João P. & Pereirinha, Paulo G. & Jorge, Humberto M. & Antunes, Carlos Henggeler, 2013. "A multi-level energy management system for multi-source electric vehicles – An integrated rule-based meta-heuristic approach," Applied Energy, Elsevier, vol. 105(C), pages 304-318.
    4. Hu, Xiaosong & Murgovski, Nikolce & Johannesson, Lars & Egardt, Bo, 2013. "Energy efficiency analysis of a series plug-in hybrid electric bus with different energy management strategies and battery sizes," Applied Energy, Elsevier, vol. 111(C), pages 1001-1009.
    5. Rahman, Imran & Vasant, Pandian M. & Singh, Balbir Singh Mahinder & Abdullah-Al-Wadud, M. & Adnan, Nadia, 2016. "Review of recent trends in optimization techniques for plug-in hybrid, and electric vehicle charging infrastructures," Renewable and Sustainable Energy Reviews, Elsevier, vol. 58(C), pages 1039-1047.
    6. M. Sabri, M.F. & Danapalasingam, K.A. & Rahmat, M.F., 2016. "A review on hybrid electric vehicles architecture and energy management strategies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 53(C), pages 1433-1442.
    7. Shuxian Li & Minghui Hu & Changchao Gong & Sen Zhan & Datong Qin, 2018. "Energy Management Strategy for Hybrid Electric Vehicle Based on Driving Condition Identification Using KGA-Means," Energies, MDPI, vol. 11(6), pages 1-16, June.
    8. Peng, Jiankun & He, Hongwen & Xiong, Rui, 2017. "Rule based energy management strategy for a series–parallel plug-in hybrid electric bus optimized by dynamic programming," Applied Energy, Elsevier, vol. 185(P2), pages 1633-1643.
    9. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    10. He, Hongwen & Guo, Jinquan & Peng, Jiankun & Tan, Huachun & Sun, Chao, 2018. "Real-time global driving cycle construction and the application to economy driving pro system in plug-in hybrid electric vehicles," Energy, Elsevier, vol. 152(C), pages 95-107.
    11. Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
    12. Shen, Peihong & Zhao, Zhiguo & Zhan, Xiaowen & Li, Jingwei & Guo, Qiuyi, 2018. "Optimal energy management strategy for a plug-in hybrid electric commercial vehicle based on velocity prediction," Energy, Elsevier, vol. 155(C), pages 838-852.
    13. Xiong, Rui & Cao, Jiayi & Yu, Quanqing, 2018. "Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle," Applied Energy, Elsevier, vol. 211(C), pages 538-548.
    14. Zou, Yuan & Liu, Teng & Sun, Fengchun & Peng, Huei, 2013. "Comparative Study of Dynamic Programming and Pontryagin's Minimum Principle on Energy Management for a Parallel Hybrid Electric Vehicle," Energies, MDPI, vol. 6(4), pages 1-14, April.
    15. Xie, Shanshan & He, Hongwen & Peng, Jiankun, 2017. "An energy management strategy based on stochastic model predictive control for plug-in hybrid electric buses," Applied Energy, Elsevier, vol. 196(C), pages 279-288.
    16. David Silver & Aja Huang & Chris J. Maddison & Arthur Guez & Laurent Sifre & George van den Driessche & Julian Schrittwieser & Ioannis Antonoglou & Veda Panneershelvam & Marc Lanctot & Sander Dieleman, 2016. "Mastering the game of Go with deep neural networks and tree search," Nature, Nature, vol. 529(7587), pages 484-489, January.
    17. Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
    18. Torres, J.L. & Gonzalez, R. & Gimenez, A. & Lopez, J., 2014. "Energy management strategy for plug-in hybrid electric vehicles. A comparative study," Applied Energy, Elsevier, vol. 113(C), pages 816-824.
    19. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    20. Chen, Zeyu & Xiong, Rui & Wang, Chun & Cao, Jiayi, 2017. "An on-line predictive energy management strategy for plug-in hybrid electric vehicles to counter the uncertain prediction of the driving cycle," Applied Energy, Elsevier, vol. 185(P2), pages 1663-1672.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    2. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    3. Zhuang, Weichao & Li (Eben), Shengbo & Zhang, Xiaowu & Kum, Dongsuk & Song, Ziyou & Yin, Guodong & Ju, Fei, 2020. "A survey of powertrain configuration studies on hybrid electric vehicles," Applied Energy, Elsevier, vol. 262(C).
    4. Shi, Wenzhuo & Huangfu, Yigeng & Xu, Liangcai & Pang, Shengzhao, 2022. "Online energy management strategy considering fuel cell fault for multi-stack fuel cell hybrid vehicle based on multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 328(C).
    5. Zhou, Jianhao & Xue, Yuan & Xu, Da & Li, Chaoxiong & Zhao, Wanzhong, 2022. "Self-learning energy management strategy for hybrid electric vehicle via curiosity-inspired asynchronous deep reinforcement learning," Energy, Elsevier, vol. 242(C).
    6. Xiao, B. & Ruan, J. & Yang, W. & Walker, P.D. & Zhang, N., 2021. "A review of pivotal energy management strategies for extended range electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 149(C).
    7. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    8. Nie, Zhigen & Jia, Yuan & Wang, Wanqiong & Chen, Zheng & Outbib, Rachid, 2022. "Co-optimization of speed planning and energy management for intelligent fuel cell hybrid vehicle considering complex traffic conditions," Energy, Elsevier, vol. 247(C).
    9. Chen, Zheng & Hu, Hengjie & Wu, Yitao & Zhang, Yuanjian & Li, Guang & Liu, Yonggang, 2020. "Stochastic model predictive control for energy management of power-split plug-in hybrid electric vehicles based on reinforcement learning," Energy, Elsevier, vol. 211(C).
    10. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    11. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    12. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    13. Zhu, Tao & Wills, Richard G.A. & Lot, Roberto & Ruan, Haijun & Jiang, Zhihao, 2021. "Adaptive energy management of a battery-supercapacitor energy storage system for electric vehicles based on flexible perception and neural network fitting," Applied Energy, Elsevier, vol. 292(C).
    14. Du, Jiuyu & Chen, Jingfu & Song, Ziyou & Gao, Mingming & Ouyang, Minggao, 2017. "Design method of a power management strategy for variable battery capacities range-extended electric vehicles to improve energy efficiency and cost-effectiveness," Energy, Elsevier, vol. 121(C), pages 32-42.
    15. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    16. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    17. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    18. López-Ibarra, Jon Ander & Gaztañaga, Haizea & Saez-de-Ibarra, Andoni & Camblong, Haritza, 2020. "Plug-in hybrid electric buses total cost of ownership optimization at fleet level based on battery aging," Applied Energy, Elsevier, vol. 280(C).
    19. Wang, Yue & Zeng, Xiaohua & Song, Dafeng, 2020. "Hierarchical optimal intelligent energy management strategy for a power-split hybrid electric bus based on driving information," Energy, Elsevier, vol. 199(C).
    20. Chen, Zheng & Gu, Hongji & Shen, Shiquan & Shen, Jiangwei, 2022. "Energy management strategy for power-split plug-in hybrid electric vehicle based on MPC and double Q-learning," Energy, Elsevier, vol. 245(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:247:y:2019:i:c:p:454-466. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.