Authors
Listed:
- Nicolas Pfeuffer (Goethe University Frankfurt)
- Lorenz Baum (Goethe University Frankfurt)
- Wolfgang Stammer (Technical University of Darmstadt)
- Benjamin M. Abdel-Karim (Goethe University Frankfurt)
- Patrick Schramowski (Technical University of Darmstadt)
- Andreas M. Bucher (Hospital of the Goethe University Frankfurt)
- Christian Hügel (Hospital of the Goethe University Frankfurt)
- Gernot Rohde (Hospital of the Goethe University Frankfurt)
- Kristian Kersting (Technical University of Darmstadt)
- Oliver Hinz (Goethe University Frankfurt)
Abstract
The most promising machine learning methods can deliver highly accurate classification results, often outperforming standard white-box methods. However, humans can hardly fully understand the rationale behind these black-box results, so these powerful methods hamper both the creation of new knowledge by humans and the broader acceptance of the technology. Explainable Artificial Intelligence attempts to overcome this problem by making results more interpretable, while Interactive Machine Learning integrates humans into the process of insight discovery. The paper builds on recent successes in combining these two cutting-edge technologies and proposes how Explanatory Interactive Machine Learning (XIL) can be embedded in a generalizable Action Design Research (ADR) process, called XIL-ADR. This approach can be used to analyze data, inspect models, and iteratively improve them. The paper demonstrates the application of this process using the diagnosis of viral pneumonia, e.g., Covid-19, as an illustrative example. In doing so, the paper also illustrates how XIL-ADR can help identify shortcomings of standard machine learning projects, enable human users to gain new insights, and thereby help unlock the full potential of AI-based systems for organizations and research.
Suggested Citation
Nicolas Pfeuffer & Lorenz Baum & Wolfgang Stammer & Benjamin M. Abdel-Karim & Patrick Schramowski & Andreas M. Bucher & Christian Hügel & Gernot Rohde & Kristian Kersting & Oliver Hinz, 2023.
"Explanatory Interactive Machine Learning,"
Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 65(6), pages 677-701, December.
Handle:
RePEc:spr:binfse:v:65:y:2023:i:6:d:10.1007_s12599-023-00806-x
DOI: 10.1007/s12599-023-00806-x
Download full text from publisher
As access to this document is restricted, you may want to search for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:binfse:v:65:y:2023:i:6:d:10.1007_s12599-023-00806-x. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.