
A Systematic Mapping Study on Machine Learning Techniques Applied for Condition Monitoring and Predictive Maintenance in the Manufacturing Sector

Author

Listed:
  • Thuy Linh Jenny Phan

    (Faculty of Informatics/Mathematics, Dresden University of Applied Sciences, 01069 Dresden, Germany)

  • Ingolf Gehrhardt

    (Faculty of Informatics/Mathematics, Dresden University of Applied Sciences, 01069 Dresden, Germany)

  • David Heik

    (Faculty of Informatics/Mathematics, Dresden University of Applied Sciences, 01069 Dresden, Germany)

  • Fouad Bahrpeyma

    (Faculty of Informatics/Mathematics, Dresden University of Applied Sciences, 01069 Dresden, Germany)

  • Dirk Reichelt

    (Faculty of Informatics/Mathematics, Dresden University of Applied Sciences, 01069 Dresden, Germany)

Abstract

Background: Today’s production facilities must be efficient in both manufacturing and maintenance. Efficiency enables a company to maintain the required output while reducing production effort and costs. With the increasing interest in process automation and the Internet of Things since the introduction of Industry 4.0, shop floors are growing in complexity. Every component of production needs to be continuously monitored, which is the basis for predictive maintenance (PdM). To predict when maintenance is needed, the components’ conditions are tracked with the help of a condition monitoring (CM) system. This task is difficult for human employees, however, because the monitoring and analysis are very demanding. To overcome this, machine learning (ML) can be applied to ensure more efficient production. Methods: This paper investigates the application of ML techniques for CM and PdM in the manufacturing sector. To this end, a systematic mapping study (SMS) is conducted in order to structure and classify the current state of research and to identify potential gaps for future investigation. Relevant literature published between January 2011 and May 2021 was considered. Results: Based on the guidelines for SMSs and previously defined research questions, existing publications are examined and a systematic overview of the current state of the research domain is provided. Conclusions: Techniques such as reinforcement learning and transfer learning are underrepresented but are attracting increasing attention. The findings of this study suggest that the most promising results come from applications of hybrid ML methods, in which a set of methods is combined to build a more powerful model.
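The article is a mapping study and does not prescribe a particular implementation. Purely as an illustration of the kind of hybrid ML pipeline the abstract refers to (several methods combined for condition monitoring and maintenance prediction), the sketch below chains unsupervised feature extraction with a supervised fault classifier in Python. All data, labels, and parameter choices are synthetic assumptions for demonstration only, not results or methods from the surveyed literature.

# Illustrative sketch only: a "hybrid" condition-monitoring pipeline in the
# sense described in the abstract (several ML methods combined), not the
# method of any surveyed paper. Sensor data, labels, and hyperparameters
# below are synthetic assumptions.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic sensor windows: 1000 samples x 64 features, where label 1 marks
# windows preceding a (simulated) component failure.
X = rng.normal(size=(1000, 64))
y = (X[:, :8].mean(axis=1) + 0.5 * rng.normal(size=1000) > 0.4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# Hybrid pipeline: unsupervised dimensionality reduction (PCA) feeding a
# supervised classifier that flags "maintenance needed" windows.
model = Pipeline([
    ("scale", StandardScaler()),
    ("features", PCA(n_components=16)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))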

Suggested Citation

  • Thuy Linh Jenny Phan & Ingolf Gehrhardt & David Heik & Fouad Bahrpeyma & Dirk Reichelt, 2022. "A Systematic Mapping Study on Machine Learning Techniques Applied for Condition Monitoring and Predictive Maintenance in the Manufacturing Sector," Logistics, MDPI, vol. 6(2), pages 1-22, May.
  • Handle: RePEc:gam:jlogis:v:6:y:2022:i:2:p:35-:d:826410

    Download full text from publisher

    File URL: https://www.mdpi.com/2305-6290/6/2/35/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2305-6290/6/2/35/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sun, Alexander Y., 2020. "Optimal carbon storage reservoir management through deep reinforcement learning," Applied Energy, Elsevier, vol. 278(C).
    2. Matteo Acquarone & Claudio Maino & Daniela Misul & Ezio Spessa & Antonio Mastropietro & Luca Sorrentino & Enrico Busto, 2023. "Influence of the Reward Function on the Selection of Reinforcement Learning Agents for Hybrid Electric Vehicles Real-Time Control," Energies, MDPI, vol. 16(6), pages 1-22, March.
    3. Li, Shuangqi & He, Hongwen & Li, Jianwei, 2019. "Big data driven lithium-ion battery modeling method based on SDAE-ELM algorithm and data pre-processing technology," Applied Energy, Elsevier, vol. 242(C), pages 1259-1273.
    4. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    5. Chen, Zheng & Hu, Hengjie & Wu, Yitao & Zhang, Yuanjian & Li, Guang & Liu, Yonggang, 2020. "Stochastic model predictive control for energy management of power-split plug-in hybrid electric vehicles based on reinforcement learning," Energy, Elsevier, vol. 211(C).
    6. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    7. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    8. Christian Montaleza & Paul Arévalo & Jimmy Gallegos & Francisco Jurado, 2024. "Enhancing Energy Management Strategies for Extended-Range Electric Vehicles through Deep Q-Learning and Continuous State Representation," Energies, MDPI, vol. 17(2), pages 1-21, January.
    9. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    10. Zhong, Shengyuan & Wang, Xiaoyuan & Zhao, Jun & Li, Wenjia & Li, Hao & Wang, Yongzhen & Deng, Shuai & Zhu, Jiebei, 2021. "Deep reinforcement learning framework for dynamic pricing demand response of regenerative electric heating," Applied Energy, Elsevier, vol. 288(C).
    11. Pang, Kexin & Zhou, Jian & Tsianikas, Stamatis & Coit, David W. & Ma, Yizhong, 2024. "Long-term microgrid expansion planning with resilience and environmental benefits using deep reinforcement learning," Renewable and Sustainable Energy Reviews, Elsevier, vol. 191(C).
    12. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Da wang, & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    13. Jichao Liu & Yanyan Liang & Zheng Chen & Wenpeng Chen, 2023. "Energy Management Strategies for Hybrid Loaders: Classification, Comparison and Prospect," Energies, MDPI, vol. 16(7), pages 1-23, March.
    14. Wang, Yue & Zeng, Xiaohua & Song, Dafeng & Yang, Nannan, 2019. "Optimal rule design methodology for energy management strategy of a power-split hybrid electric bus," Energy, Elsevier, vol. 185(C), pages 1086-1099.
    15. Sun, Wenjing & Zou, Yuan & Zhang, Xudong & Guo, Ningyuan & Zhang, Bin & Du, Guodong, 2022. "High robustness energy management strategy of hybrid electric vehicle based on improved soft actor-critic deep reinforcement learning," Energy, Elsevier, vol. 258(C).
    16. Zhuang, Weichao & Li (Eben), Shengbo & Zhang, Xiaowu & Kum, Dongsuk & Song, Ziyou & Yin, Guodong & Ju, Fei, 2020. "A survey of powertrain configuration studies on hybrid electric vehicles," Applied Energy, Elsevier, vol. 262(C).
    17. Maki, Seiya & Fujii, Minoru & Fujita, Tsuyoshi & Shiraishi, Yasushi & Ashina, Shuichi & Gomi, Kei & Sun, Lu & Budi Nugroho, Sudarmanto & Nakano, Ryoko & Osawa, Takahiro & Immanuel, Gito & Boer, Rizald, 2022. "A deep reinforced learning spatiotemporal energy demand estimation system using deep learning and electricity demand monitoring data," Applied Energy, Elsevier, vol. 324(C).
    18. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    19. Chen, Zheng & Gu, Hongji & Shen, Shiquan & Shen, Jiangwei, 2022. "Energy management strategy for power-split plug-in hybrid electric vehicle based on MPC and double Q-learning," Energy, Elsevier, vol. 245(C).
    20. Zhengyu Yao & Hwan-Sik Yoon & Yang-Ki Hong, 2023. "Control of Hybrid Electric Vehicle Powertrain Using Offline-Online Hybrid Reinforcement Learning," Energies, MDPI, vol. 16(2), pages 1-18, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jlogis:v:6:y:2022:i:2:p:35-:d:826410. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.