Exploring Factors Affecting User Intention to Accept Explainable Artificial Intelligence

Yu-Min Wang1 and Chei-Chang Chiou2

  1. Department of Information Management, National Chi Nan University
    Puli 545301, Taiwan
    ymwang@ncnu.edu.tw
  2. Department of Accounting, National Changhua University of Education
    Changhua 500, Taiwan
    ccchiou@cc.ncue.edu.tw

Abstract

Explainable Artificial Intelligence (XAI) represents a pivotal innovation aimed at addressing the “black box” problem in AI, thereby enhancing users’ understanding of AI reasoning processes and outcomes. The implementation of XAI is not merely a technological endeavor but also involves various individual factors. As XAI remains in its early developmental stages and exhibits unique characteristics, identifying and understanding the factors influencing users’ intention to adopt XAI is essential for its long-term success. This study develops a research model grounded in the characteristics of XAI and prior technology acceptance studies that consider individual factors. The model was evaluated using data collected from 252 potential XAI users. The validated model exhibits strong explanatory power, accounting for 45% of the variance in users’ intention to use XAI. Findings indicate that perceived value and perceived need are key determinants of users’ intention to adopt XAI. These results provide empirical evidence and deepen the understanding of user perceptions and intentions regarding XAI adoption.

Keywords

explainable artificial intelligence, artificial intelligence, user acceptance, individual differences, intention to use

How to cite

Wang, Y.-M., Chiou, C.-C.: Exploring Factors Affecting User Intention to Accept Explainable Artificial Intelligence. Computer Science and Information Systems