
Anticipatory guilt in chatbot service recovery

Author

Listed:
  • Khaloud Nasser Alsaid

    (KSU - King Saud University [Riyadh])

  • Houssam Jedidi

    (HMKW University of Applied Sciences for Media, Communication and Management in Frankfurt)

  • Reza Vaezi

    (KSU - Kennesaw State University)

  • Samiha Mjahed

    (KSU - King Saud University [Riyadh])

  • Mohammed Hakimi

    (University of Prince Mugrin)

Abstract

To better understand the efficacy of chatbot-based service recovery efforts, this study responds to calls to move beyond comparisons of recovery channels (e.g., human vs. AI agent) and to examine how chatbots should act, particularly with respect to economic recovery and emotional recovery, to enhance customer subjective well-being. The present study addresses this gap by exploring the psychological mechanism underlying the effectiveness of different chatbot service recovery (SR) strategies. We review the literature on intelligent machines in service, SR, and prominent psychological theories of guilt, and then explore anticipated guilt as a mechanism for moral intention. Next, we present our theory of guilt along with a series of propositions on AI recovery and how it affects customers' subjective well-being. Previous research on emotion regulation suggests that individuals faced with negative emotions often engage in one of two regulation strategies: (1) reappraisal, which involves re-evaluating a given situation to reduce or shift the negative emotion, and (2) suppression, which simply inhibits emotion-expressive behaviors. Each strategy has distinct affective consequences, with reappraisal being more effective than suppression at decreasing negative emotional experiences and promoting individual well-being (Haga et al., 2009). In negatively valenced situations, such as addressing a service failure, an emotional recovery strategy can alleviate negative emotions. Instead of suppressing negative emotions (focusing solely on utilitarian SR), an AI agent providing emotional recovery shows empathic concern and encourages consumers to express their emotions through active listening and acknowledgement of those emotions. These actions can shift consumers' perspective and facilitate reappraisal of the situation (Groth and Grandey, 2012). The ability of emotional chatbot recovery to regulate negative emotion should lead to the subjective well-being conferred by forgiveness. Furthermore, extending the theory of mind (Gray et al., 2007), we posit that an AI agent exhibiting both human qualities, namely affective and cognitive capabilities, in a negative encounter context could be perceived as human-like. AI agents that appear more humanlike often become more appealing and generate positive feelings, and individuals will apply and transpose their moral intentions when they perceive AI to have more humanlike characteristics. In this research, we focus on guilt, a potential reaction of the complaining customer in private text-based service chats, as a moral emotion and a motivator of moral intention (Giroux et al., 2022; Kim et al., 2022). We develop a theory that proposes guilt as a negative emotion that increases subjective well-being in negatively valenced situations such as addressing a service failure, and we provide insight into how it relates to AI service recovery through a series of propositions. Interacting with economic and emotional chatbot SR should increase the customer's likelihood of engaging in forgiveness and enhance the customer's subjective well-being by eliciting anticipatory guilt.

Suggested Citation

  • Khaloud Nasser Alsaid & Houssam Jedidi & Reza Vaezi & Samiha Mjahed & Mohammed Hakimi, 2024. "Anticipatory guilt in chatbot service recovery," Post-Print hal-04745006, HAL.
  • Handle: RePEc:hal:journl:hal-04745006

Download full text from publisher

To our knowledge, this item is not available for download. To find whether it is available, there are three options:
1. Check below whether another version of this item is available online.
2. Check on the provider's web page whether it is in fact available.
3. Perform a search for a similarly titled item that would be available.

Corrections

All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:journl:hal-04745006. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD (email available below). General contact details of provider: https://hal.archives-ouvertes.fr/.

Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.