LearningEMS: A Unified Framework and Open-Source Benchmark for Learning-Based Energy Management of Electric Vehicles

Yong Wang , Hongwen He , Yuankai Wu , Pei Wang , Haoyu Wang , Renzong Lian , Jingda Wu , Qin Li , Xiangfei Meng , Yingjuan Tang , Fengchun Sun , Amir Khajepour

Engineering ›› 2025, Vol. 54 ›› Issue (11): 370–387. DOI: 10.1016/j.eng.2024.10.021
Abstract

An effective energy management strategy (EMS) is essential for optimizing the energy efficiency of electric vehicles (EVs). With the advent of advanced machine learning techniques, interest in developing sophisticated EMSs for EVs is growing. Here, we introduce LearningEMS, a unified framework and open-source benchmark designed to facilitate the rapid development and assessment of EMSs. LearningEMS is distinguished by its support for a variety of EV configurations, including hybrid EVs, fuel cell EVs, and plug-in EVs, offering a general platform for EMS development. The framework enables detailed comparisons of several EMS algorithms, encompassing imitation learning, deep reinforcement learning (RL), offline RL, model predictive control, and dynamic programming. We rigorously evaluated these algorithms from multiple perspectives: energy efficiency, consistency, adaptability, and practicability. Furthermore, we discuss state, reward, and action settings for RL in EV energy management, introduce a policy extraction and reconstruction method for deploying learning-based EMSs, and conduct hardware-in-the-loop experiments. In summary, we offer a unified and comprehensive framework that comes with three distinct EV platforms, an EMS policy dataset covering over 10 000 km of driving, ten state-of-the-art algorithms, over 160 benchmark tasks, and three learning libraries. Its flexible design allows easy extension to additional tasks and applications. The open-source algorithms, models, datasets, and deployment processes foster further research and innovation in EVs and broader engineering domains.
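To make the abstract's mention of state, reward, and action settings concrete, the sketch below shows how an RL energy-management problem for a hybrid EV is typically cast as a Gym-style environment: the state pairs battery state of charge (SOC) with driving conditions, the action sets the engine's share of demanded power, and the reward penalizes fuel use and SOC drift. This is a minimal illustrative sketch under simplified dynamics; the class name, coefficients, and demand model are hypothetical and are not the LearningEMS API.

```python
import random

class SimpleHEVEnv:
    """Minimal Gym-style energy-management environment for a hybrid EV.

    Illustrative only: dynamics and coefficients are hypothetical
    simplifications, not the LearningEMS framework's actual models.
    """

    def __init__(self, battery_kwh=10.0, dt_s=1.0):
        self.battery_kwh = battery_kwh  # usable battery capacity (kWh)
        self.dt_h = dt_s / 3600.0       # timestep in hours
        self.soc = 0.6
        self.t = 0

    def reset(self):
        self.soc = 0.6
        self.t = 0
        return self._obs()

    def _obs(self):
        # State: (battery SOC, vehicle speed in m/s, demanded power in kW)
        self.speed = 10.0 + 5.0 * random.random()  # crude drive-cycle stand-in
        self.p_demand = 0.5 * self.speed           # crude power-demand model
        return (self.soc, self.speed, self.p_demand)

    def step(self, action):
        # Action: engine power share in [0, 1]; the battery supplies the rest
        share = min(max(action, 0.0), 1.0)
        p_engine = share * self.p_demand
        p_batt = self.p_demand - p_engine
        # Battery dynamics: discharge lowers SOC proportionally to energy drawn
        self.soc -= p_batt * self.dt_h / self.battery_kwh
        self.soc = min(max(self.soc, 0.0), 1.0)
        # Reward: penalize fuel consumption and deviation from the SOC target
        fuel_cost = 0.05 * p_engine
        soc_penalty = 10.0 * (self.soc - 0.6) ** 2
        reward = -(fuel_cost + soc_penalty)
        self.t += 1
        done = self.t >= 100  # fixed-length episode
        return self._obs(), reward, done
```

A charge-sustaining policy would learn to balance the two reward terms: always drawing from the battery minimizes fuel cost but incurs a growing SOC penalty, which is the core trade-off RL-based EMSs optimize over a drive cycle.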

Keywords

Energy management / Electric vehicles / Reinforcement learning / Machine learning / Open-source benchmark

Cite this article

Yong Wang, Hongwen He, Yuankai Wu, Pei Wang, Haoyu Wang, Renzong Lian, Jingda Wu, Qin Li, Xiangfei Meng, Yingjuan Tang, Fengchun Sun, Amir Khajepour. LearningEMS: A Unified Framework and Open-Source Benchmark for Learning-Based Energy Management of Electric Vehicles. Engineering, 2025, 54(11): 370–387. DOI: 10.1016/j.eng.2024.10.021


Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (52172377). We would like to thank Editage (www.editage.cn) for English language editing.

Compliance with ethics guidelines

Yong Wang, Hongwen He, Yuankai Wu, Pei Wang, Haoyu Wang, Renzong Lian, Jingda Wu, Qin Li, Xiangfei Meng, Yingjuan Tang, Fengchun Sun, and Amir Khajepour declare that they have no conflict of interest or financial conflicts to disclose.


Supplementary files

Appendix A

2030

Accesses

0

Citation

Detail

Sections
Recommended

AI思维导图

/