Author
Listed:
- Lichun Guo
(College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China)
- Hao Zeng
(College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China)
- Xun Shi
(College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China)
- Qing Xu
(College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China)
- Jinhui Shi
(College of Art and Design, Nanjing Audit University Jinshen College, Nanjing 210023, China)
- Kui Bai
(College of Computer and Information Engineering, Nanjing Tech University, Nanjing 211816, China
College of Artificial Intelligence, Nanjing Tech University, Nanjing 211816, China)
- Shuang Liang
(School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210023, China)
- Wenlong Hang
(College of Computer and Information Engineering, Nanjing Tech University, Nanjing 211816, China
College of Artificial Intelligence, Nanjing Tech University, Nanjing 211816, China)
Abstract
Precisely identifying interior decoration styles is of substantial significance in guiding interior decoration practice. Nevertheless, building accurate models for the automatic classification of interior decoration styles remains challenging due to the scarcity of expert annotations. To address this problem, we propose a novel pseudo-label-guided contrastive mutual learning framework (PCML) for semi-supervised interior decoration style classification that harnesses large amounts of unlabeled data. Specifically, PCML introduces two distinct subnetworks and selectively uses the diversified pseudo-labels generated by each to supervise the other, thereby mitigating confirmation bias. For labeled images, the inconsistent pseudo-labels produced by the two subnetworks are used to identify images that are prone to misclassification, and we devise an inconsistency-aware relearning (ICR) regularization that performs a review training process on these images. For unlabeled images, we introduce a class-aware contrastive learning (CCL) regularization that learns discriminative feature representations using the corresponding pseudo-labels. Because the use of two distinct subnetworks reduces the risk of both models producing identical erroneous pseudo-labels, CCL lowers the likelihood of sampling noisy data and thus enhances the effectiveness of contrastive learning. The performance of PCML is evaluated on five interior decoration style image datasets. In average AUC, accuracy, sensitivity, specificity, precision, and F1 score, PCML achieves improvements of 1.67%, 1.72%, 3.65%, 1.0%, 4.61%, and 4.66%, respectively, over the state-of-the-art method, demonstrating the superiority of our approach.
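The mutual pseudo-label supervision and class-aware contrastive regularization described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names, the confidence threshold (0.9), and the temperature (0.1) are illustrative assumptions, and the contrastive term follows the generic supervised-contrastive pattern in which samples sharing a pseudo-label act as positives.

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.9):
    """Keep only confident predictions as pseudo-labels.
    probs: (N, C) softmax outputs of one subnetwork.
    Returns indices of confident samples and their argmax labels."""
    conf = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = np.where(conf >= threshold)[0]
    return keep, labels[keep]

def cross_entropy(probs, labels, eps=1e-12):
    """Mean negative log-likelihood of the given hard labels."""
    return -np.log(probs[np.arange(len(labels)), labels] + eps).mean()

def mutual_supervision_loss(probs_a, probs_b, threshold=0.9):
    """Mutual learning: each subnetwork is trained on the confident
    pseudo-labels produced by the other, so neither reinforces
    its own mistakes (mitigating confirmation bias)."""
    idx_a, lab_a = select_pseudo_labels(probs_a, threshold)  # supervises B
    idx_b, lab_b = select_pseudo_labels(probs_b, threshold)  # supervises A
    loss_b = cross_entropy(probs_b[idx_a], lab_a) if len(idx_a) else 0.0
    loss_a = cross_entropy(probs_a[idx_b], lab_b) if len(idx_b) else 0.0
    return loss_a + loss_b

def class_aware_contrastive_loss(features, pseudo_labels, temperature=0.1):
    """Class-aware contrastive term: embeddings with the same
    pseudo-label are pulled together, all others pushed apart."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = np.exp(f @ f.T / temperature)  # pairwise similarity kernel
    n = len(pseudo_labels)
    loss, count = 0.0, 0
    for i in range(n):
        pos = [j for j in range(n) if j != i and pseudo_labels[j] == pseudo_labels[i]]
        if not pos:
            continue  # no positive pair for this anchor
        denom = sim[i].sum() - sim[i, i]  # exclude self-similarity
        loss += -np.mean([np.log(sim[i, j] / denom) for j in pos])
        count += 1
    return loss / max(count, 1)
```

In this sketch, disagreement between `probs_a` and `probs_b` on labeled data would flag the error-prone images that the paper's ICR regularization revisits, while `class_aware_contrastive_loss` only ever uses pseudo-labels that survived the confidence filter.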
Suggested Citation
Lichun Guo & Hao Zeng & Xun Shi & Qing Xu & Jinhui Shi & Kui Bai & Shuang Liang & Wenlong Hang, 2024.
"Semi-Supervised Interior Decoration Style Classification with Contrastive Mutual Learning,"
Mathematics, MDPI, vol. 12(19), pages 1-19, September.
Handle:
RePEc:gam:jmathe:v:12:y:2024:i:19:p:2980-:d:1485508