TY - JOUR
T1 - Developing misinformation immunity
T2 - How to reason-check fallacious news in a human–computer interaction environment
AU - Musi, Elena
AU - Carmi, Elinor
AU - Reed, Chris
AU - Yates, Simeon
AU - O'Halloran, Kay
N1 - Special Issue: Multidisciplinary Approaches to Mis- and Disinformation Studies
This work was supported by the UK Research and Innovation Economic and Social Research Council (Grant No. ES/V003909/1).
Copyright: © The Author(s) 2023.
PY - 2023/1
Y1 - 2023/1
N2 - To counter the fake news phenomenon, the scholarly community has attempted to debunk and prebunk disinformation. However, misinformation still constitutes a major challenge due to the variety of misleading techniques and their continuous updates, which call for the exercise of critical thinking to build resilience. In this study we present two open access chatbots, the Fake News Immunity Chatbot and the Vaccinating News Chatbot, which combine Fallacy Theory and Human–Computer Interaction to inoculate citizens and communication gatekeepers against misinformation. These chatbots differ from existing tools in both function and form. First, they target misinformation and enhance the identification of fallacious arguments; and second, they are multiagent and leverage discourse theories of persuasion in their conversational design. After describing both their backend and frontend design, we report on the evaluation of the user interface and of the impact on users’ critical thinking skills through a questionnaire, a crowdsourced survey, and a pilot qualitative experiment. The results shed light on best practices for designing user-friendly active inoculation tools and reveal that the two chatbots are perceived as increasing critical thinking skills in the current misinformation ecosystem.
KW - misinformation
KW - fallacies
KW - chatbots
KW - critical thinking
KW - reason-checking
KW - human–computer interaction
UR - http://www.scopus.com/inward/record.url?scp=85147299680&partnerID=8YFLogxK
U2 - 10.1177/20563051221150407
DO - 10.1177/20563051221150407
M3 - Article
SN - 2056-3051
VL - 9
SP - 1
EP - 18
JO - Social Media + Society
JF - Social Media + Society
IS - 1
ER -