
Korean Journal of Psychology: General

The Effect of Belief in a Just World on the Acceptance of AI Technology

Korean Journal of Psychology: General, (P)1229-067X; (E)2734-1127
2020, v.39 no.4, pp.517-542
https://doi.org/10.22257/kjp.2020.12.39.4.517
Sinae Kim (Yonsei University)
Young Woo Sohn (Yonsei University)

Abstract (translated from Korean)

With advances in artificial intelligence (AI) technology, AI is being introduced across occupations to replace people. This study viewed AI as an alternative to the current human-led system and, drawing on system justification theory, examined the effect of Belief in a Just World (BJW) on AI acceptance. We expected that stronger BJW would predict support for the existing human-led system, whereas weaker BJW would predict support for an AI agent as an alternative system. Study 1 tested the effect of BJW using items that asked participants to choose directly between a human and an AI. Study 2 used a between-subjects experiment in a personnel selection context to test whether BJW moderates the effect of agent type (human vs. AI) on acceptance of the selection outcome. In Study 1, weaker BJW predicted the perceived job competence of AI, mediated by the perceived fairness of AI (relative to humans), supporting the mediation hypothesis. In Study 2, the moderating effect of BJW on outcome acceptance was significant in the accepted condition: participants with weaker BJW showed higher outcome acceptance in the AI condition. In the rejected condition, however, the moderation was not supported. Across both studies, the weaker participants' BJW, the fairer and more competent they perceived AI to be compared with humans. Based on these results, we discuss the factors that shape differences in AI acceptance, along with the study's implications, limitations, and directions for future research.
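Study 1's mediation model (weaker BJW → higher perceived fairness of AI → higher perceived job competence of AI) corresponds to a product-of-coefficients indirect effect. The sketch below illustrates that estimator on simulated data; it is not the authors' analysis code, and all variable names and effect sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Hypothetical standardized scores; the negative a-path encodes the
# reported pattern that weaker BJW predicts seeing AI as fairer.
bjw = rng.normal(0.0, 1.0, n)
fairness = -0.5 * bjw + rng.normal(0.0, 1.0, n)        # a-path
competence = 0.6 * fairness + rng.normal(0.0, 1.0, n)  # b-path

# a-path: simple OLS slope of fairness on BJW
x = bjw - bjw.mean()
a = float(x @ (fairness - fairness.mean()) / (x @ x))

# b-path: slope of competence on fairness, controlling for BJW
X = np.column_stack([np.ones(n), fairness, bjw])
b = np.linalg.lstsq(X, competence, rcond=None)[0][1]

indirect = a * b  # product-of-coefficients indirect effect (negative here)
print(round(indirect, 3))
```

In practice the indirect effect would be tested with a bootstrap confidence interval rather than read off the point estimate alone.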

Keywords
Human-AI interaction, Belief in a Just World, System Justification, AI technology acceptance, AI interview, HCI

Abstract

With the development of artificial intelligence (AI) technology, AI has been introduced into various jobs to replace people. This study viewed AI as an alternative to the current human-led system and examined the effect of Belief in a Just World (BJW) on AI acceptance based on system justification theory. We expected that participants with stronger BJW would prefer the current human-led system, whereas those with weaker BJW would be more likely to accept the AI-based system. Study 1 examined the effect of BJW on perceived job competence by having participants choose between a human and an AI. Study 2 examined the effect of BJW on acceptance of selection outcomes in a 2 (human vs. AI) × 2 (accepted vs. rejected) between-subjects design. Results showed that weaker BJW predicted higher perceived competence of AI, mediated by higher perceived fairness of AI (Study 1), and that participants with weaker BJW showed higher acceptance of AI-based selection results in the accepted condition but not in the rejected condition (Study 2). Based on these findings, we discuss the factors affecting AI acceptance, the limitations of the present study, and suggestions for future research.
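Study 2's moderation hypothesis corresponds to an interaction term in a regression of outcome acceptance on agent type, the (mean-centered, per the standard convention) BJW score, and their product. A minimal sketch on simulated data — the variable names and simulated effect are assumptions for illustration, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical simulated data: agent type (0 = human, 1 = AI) and a
# BJW score; weaker BJW is simulated to raise acceptance of the AI
# agent's decision, as reported for the accepted condition.
agent_ai = rng.integers(0, 2, n).astype(float)
bjw = rng.normal(4.0, 1.0, n)
bjw_c = bjw - bjw.mean()  # mean-center the moderator

acceptance = (4.0 + 0.2 * agent_ai
              - 0.4 * agent_ai * bjw_c       # hypothesized interaction
              + rng.normal(0.0, 1.0, n))

# OLS with intercept, main effects, and the interaction term
X = np.column_stack([np.ones(n), agent_ai, bjw_c, agent_ai * bjw_c])
beta = np.linalg.lstsq(X, acceptance, rcond=None)[0]
print(beta)  # [intercept, agent, BJW (centered), agent x BJW]
```

A significant negative interaction coefficient here would mean the AI-over-human advantage in acceptance grows as BJW weakens, which is the simple-slopes pattern the study describes.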


