
Deep Learning-Driven Multi-Stage Product Visual Emotional Design Utilizing Web Semantics

Authors
  • Xiaoqing Chen

    (Department of Art and Architecture, Southeast University Chengxian College, Nanjing, China)

  • Juanfen Wang

    (Department of Art and Architecture, Southeast University Chengxian College, Nanjing, China)

  • Lin Zhu

    (School of Electronic and Computer Engineering, Southeast University Chengxian College, Nanjing, China)

Abstract

With the rapid development of artificial intelligence, the generation of emotionally expressive images has become a key research area. This article introduces a novel multi-stage cascade for emotional image generation that combines CGAN, Pix2Pix, and CycleGAN to produce images with greater emotional depth and visual quality. The approach proceeds through a sequential pipeline, from emotional initialization through texture refinement to style transition. Experiments on facial and automotive datasets show a significant improvement in image quality over traditional models, with an average gain of 40 percentage points in structural similarity (SSIM) and 11.1 percentage points in peak signal-to-noise ratio (PSNR). These findings highlight the potential of the model in advertising, entertainment, and human-computer interaction, where emotionally resonant visuals are crucial.
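The abstract reports its gains in terms of SSIM and PSNR. The paper's evaluation code is not reproduced here, so the following is only a minimal NumPy sketch of how these two metrics are defined; `ssim_global` is a simplified single-window variant rather than the standard sliding-window SSIM, and both function names are illustrative.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two same-shape images."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def ssim_global(ref, test, max_val=255.0):
    """Simplified SSIM computed over the whole image as a single window."""
    x = ref.astype(np.float64)
    y = test.astype(np.float64)
    c1 = (0.01 * max_val) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

A quick check: an image compared against itself yields SSIM = 1 and infinite PSNR, while added noise lowers both scores.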

Suggested Citation

  • Xiaoqing Chen & Juanfen Wang & Lin Zhu, 2024. "Deep Learning-Driven Multi-Stage Product Visual Emotional Design Utilizing Web Semantics," International Journal on Semantic Web and Information Systems (IJSWIS), IGI Global, vol. 20(1), pages 1-29, January.
  • Handle: RePEc:igg:jswis0:v:20:y:2024:i:1:p:1-29
Download full text from publisher

File URL: http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/IJSWIS.354735
Download Restriction: no


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.