Author
Listed:
- Yifan Li
- Zhi-hai Zhang
- Xiaowei Yue
- Li Zheng
Abstract
Advanced production systems, such as multi-step assembly processes, predominantly consist of repetitive operations. These repetitive manual or human–robot integrated production operations call for new real-time process management technologies, such as the increasing use of sensors and the development of system intelligence. Conventional process monitoring and management methods, which are often labor-intensive, fall short of providing immediate and actionable insights. To address this limitation, we develop an unsupervised embedding method that automatically delineates the process into different stages and predicts real-time progress information. We propose a Contrastive Variational Autoencoder as a feature extractor to embed repetitive processes into a Gaussian Mixture Model. Based on the extracted features, we propose an adaptive change-point detection procedure and an Iterative Dynamic Time Warping algorithm to automatically identify and segment multiple standardized process stages. Theoretically, we establish the asymptotic optimality of the detected change-points under the given precision of the image and feature extractors, ensuring high-quality process stage separation and labeling. The proposed method autonomously extracts essential features encapsulating progress information from a limited set of unlabelled process videos. Across four diverse case studies, including the production of an actual aircraft spoiler, the method exhibits very promising performance: it achieves an average accuracy of 98.14% in predicting production progress and an area under the curve of 0.9202 in predicting progress deviation across three distinct production environments. The proposed process monitoring method for repetitive production systems has the potential to significantly improve productivity, promote standardization of repetitive operations, and predict production deviations.
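The abstract describes a pipeline of feature embedding, Gaussian mixture modeling, and change-point detection over video frames. The paper's own implementation is not reproduced here; the snippet below is only a minimal illustrative sketch on synthetic features, with PCA standing in for the Contrastive Variational Autoencoder encoder and a simple label-switch rule standing in for the adaptive change-point detection and Iterative Dynamic Time Warping steps. All names, parameters, and the synthetic data are assumptions for illustration, not the authors' method.

    # Illustrative sketch only: PCA stands in for the paper's Contrastive VAE encoder,
    # and a naive label-switch rule stands in for the adaptive change-point detection
    # and Iterative DTW; synthetic features replace real video frames.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Synthetic "frame features": three consecutive process stages, 200 frames each.
    stage_means = np.array([[0.0] * 16, [3.0] * 16, [-3.0] * 16])
    frames = np.vstack([m + rng.normal(scale=1.0, size=(200, 16)) for m in stage_means])

    # 1) Embed frames into a low-dimensional space (stand-in for the CVAE encoder).
    embeddings = PCA(n_components=4, random_state=0).fit_transform(frames)

    # 2) Fit a Gaussian Mixture Model over the embedded frames.
    gmm = GaussianMixture(n_components=3, random_state=0).fit(embeddings)
    posteriors = gmm.predict_proba(embeddings)   # per-frame stage responsibilities
    labels = posteriors.argmax(axis=1)           # hard stage assignment per frame

    # 3) Naive change-point rule: mark a stage boundary wherever the label switches.
    change_points = np.flatnonzero(np.diff(labels)) + 1
    print("Detected stage boundaries (frame indices):", change_points)

On this toy data the detected boundaries fall near frames 200 and 400, which is the kind of stage segmentation the paper's change-point and DTW machinery performs on real streaming video.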
Suggested Citation
Yifan Li & Zhi-hai Zhang & Xiaowei Yue & Li Zheng, 2025.
"An unsupervised embedding method based on streaming videos for process monitoring in repetitive production systems,"
IISE Transactions, Taylor & Francis Journals, vol. 57(6), pages 724-739, June.
Handle:
RePEc:taf:uiiexx:v:57:y:2025:i:6:p:724-739
DOI: 10.1080/24725854.2024.2386415