
이미지 감정색인을 위한 시각적 요인 분석에 관한 탐색적 연구

An Exploratory Investigation on Visual Cues for Emotional Indexing of Image

한국문헌정보학회지 / Journal of the Korean Society for Library and Information Science, (P)1225-598X; (E)2982-6292
2014, v.48 no.1, pp.53-73
https://doi.org/10.4275/KSLIS.2014.48.1.053
정선영 (이화여자대학교)
정은경 (이화여자대학교)

초록 (Abstract)

With the development of emotion-based computing environments, emotional access to and use of multimedia information resources, including images, have become important research problems. This study is an exploratory attempt to identify the visual cues used for emotional indexing of images. To this end, a total of 620 emotional visual cues were extracted through interviews with 20 participants, using 15 images indexed with the five basic emotions of love, happiness, sadness, fear, and anger. The distributions of the five categories and 18 sub-categories of emotion-triggering visual cues, as well as the distribution of visual cues for each of the five emotions, were analyzed and are reported. Facial expressions, the actions or behaviors of depicted persons, and syntactic features such as line, shape, and size accounted for a large share of the visual cues through which the emotion of an image is perceived. Looking at individual emotions, love was closely associated with the actions or behaviors of persons, while facial expressions were most important for happiness. Sadness was likewise closely tied to actions or behaviors, and fear was strongly related to facial expressions. Anger was characterized by the syntactic features of line, shape, and size. These results suggest that a combined approach to the content-based and concept-based features of an image is important for effective emotional indexing.
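As a purely hypothetical illustration of the combined content-based/concept-based approach suggested in the abstract above, an image index record might carry both concept-based descriptors (emotion, facial expression, action/behavior) and content-based syntactic features (line, shape, size, color). The sketch below is an assumption for illustration only; the field names are not a schema from the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EmotionalImageRecord:
    """Hypothetical index record combining concept-based and content-based cues."""
    image_id: str
    emotions: List[str]            # e.g. one of the five basic emotions
    facial_expressions: List[str]  # concept-based cue
    actions_behaviors: List[str]   # concept-based cue
    syntactic_features: List[str]  # content-based cues such as line, shape, size

# Example record for an image indexed with the emotion "happiness".
record = EmotionalImageRecord(
    image_id="img-001",
    emotions=["happiness"],
    facial_expressions=["smiling faces"],
    actions_behaviors=["children playing"],
    syntactic_features=["curved lines", "warm colors"],
)
print(record)
```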

Keywords
Image Retrieval, Emotion, Emotional Indexing, Image Indexing, Visual Cue

Abstract

As emotion-based computing environments have grown in recent years, emotional access to and use of multimedia resources, including images, have become necessary areas of study. This study aims to identify the visual cues for emotion in images. To this end, five basic emotions (love, happiness, sadness, fear, and anger) were selected, and twenty participants were interviewed about the visual cues from which they perceived these emotions in fifteen images. A total of 620 visual cues mentioned by the participants were collected from the interviews and coded into five categories and 18 sub-categories of visual cues. The findings show that facial expressions, actions/behaviors, and syntactic features are significant for perceiving a specific emotion in an image. Each emotion also exhibited a distinctive pattern of visual cues: love was most strongly associated with actions and behaviors, happiness with facial expressions, sadness primarily with actions and behaviors, fear with facial expressions, and anger with syntactic features such as line, shape, and size. These findings suggest that emotional indexing can be effective when content-based features are considered in combination with concept-based features.
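The coding and distribution analysis described in the abstract can be illustrated with a minimal sketch (not the authors' code): each coded interview mention is treated as an (emotion, visual-cue category) pair and tallied into an overall category distribution and a per-emotion cross-tabulation. The category labels and data below are illustrative assumptions; the paper's full scheme of five categories and 18 sub-categories is not reproduced.

```python
from collections import Counter

# Illustrative coded mentions: (emotion, visual-cue category).
# The study collected 620 such mentions from 20 interviews; only a few
# made-up examples are shown here.
coded_mentions = [
    ("love", "action/behavior"),
    ("love", "action/behavior"),
    ("happiness", "facial expression"),
    ("sadness", "action/behavior"),
    ("fear", "facial expression"),
    ("anger", "syntactic (line/shape/size)"),
]

# Overall distribution of cue categories across all mentions.
overall = Counter(category for _, category in coded_mentions)

# Distribution of cue categories within each emotion.
per_emotion = {}
for emotion, category in coded_mentions:
    per_emotion.setdefault(emotion, Counter())[category] += 1

print(overall)
for emotion, counts in per_emotion.items():
    total = sum(counts.values())
    shares = {cat: round(n / total, 2) for cat, n in counts.items()}
    print(emotion, shares)
```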


참고문헌 (References)

1. 유소영, 문성빈. 2004. 심미적 인상을 이용한 이미지 검색에 관한 실험적 연구 [An experimental study on image retrieval using aesthetic impressions]. 정보관리학회지, 21(4): 187-208.
2. 이지연. 2002. 이용자 관점에서 본 이미지 색인의 객관성에 대한 연구 [A study on the objectivity of image indexing from the user's perspective]. 정보관리학회지, 19(3): 123-143.
3. 정선영. 1997. 전자미술관을 위한 회화작품의 주제색인 방안에 관한 연구 [A study on subject indexing of paintings for an electronic art museum]. 석사학위논문, 이화여자대학교 대학원 문헌정보학과.
4. Arnheim, R. 2004. Art and Visual Perception: A Psychology of the Creative Eye. University of California Press.
5. Burford, B., Briggs, P. and Eakins, J. P. 2003. "A taxonomy of the image: On the classification of content for image retrieval." Visual Communication, 2(2): 123-161.
6. Fidel, R. 1997. "The image retrieval task: Implications for the design and evaluation of image databases." The New Review of Hypermedia and Multimedia, 3: 181-200.
7. Greisdorf, H. and O'Connor, B. 2002. "Modeling what users see when they look at images: A cognitive viewpoint." Journal of Documentation, 58(1): 6-29.
8. Holsti, O. 1969. Content Analysis for the Social Sciences and Humanities. Reading, MA: Addison-Wesley.
9. Itten, J. 1973. The Art of Color: The Subjective Experience and Objective Rationale of Color. Wiley.
10. Jörgensen, C. 2003. Image Retrieval: Theory and Research. Lanham, MD: Scarecrow Press.
11. Jörgensen, C., Jaimes, A., Benitez, A. B. and Chang, S. F. 2001. "A conceptual framework and empirical research for classifying visual descriptors." Journal of the American Society for Information Science and Technology, 52(11): 938-947.
12. Keister, L. 1994. User types and queries: Impact on image access systems. In R. Fidel (Ed.), Challenges in Indexing Electronic Text and Images (pp. 7-19). Medford, NJ: Learned Information, Inc.
13. Knautz, K. and Stock, W. G. 2011. "Collective indexing of emotions in videos." Journal of Documentation, 67(6): 975-994.
14. Kobayashi, Y. and Kato, P. 2000. "Multi-contrast based texture model for understanding human subjectivity." 15th International Conference on Pattern Recognition, Barcelona, Spain, 3: 917-922.
15. Lee, H.-J. and Neal, D. 2010. "A new model for semantic photograph description combining basic levels and user-assigned descriptors." Journal of Information Science, 36(5): 547-565.
16. Mao, X., Chen, B. and Muta, I. 2003. "Affective property of image and fractional dimension." Chaos, Solitons and Fractals, 13: 905-910.
17. Neal, D. 2010. "Emotion-based tags in photographic documents: The interplay of text, image, and social influence." Canadian Journal of Information and Library Science, 34(3): 329-353.
18. Picard, R. W. and Klein, J. 2002. "Computers that recognize and respond to user emotion: Theoretical and practical implications." Interacting with Computers, 14(2): 141-169.
19. Picard, R. W., Vyzas, E. and Healey, J. 2001. "Toward machine emotional intelligence: Analysis of affective physiological state." IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10): 1175-1191.
20. Rorissa, A. 2008. "User-generated descriptions of individual images versus labels of groups of images: A comparison using basic level theory." Information Processing and Management, 44: 1741-1753.
21. Rosch, E. 1977. Human categorization. In Neil Warren (Ed.), Advances in Cross-Cultural Psychology, 1, 1-72. London: Academic Press.
22. Scherer, K. R. 2005. "What are emotions? And how can they be measured?" Social Science Information, 44(4): 695-729.
23. Schmidt, S. and Stock, W. G. 2009. "Collective indexing of emotions in images. A study in emotional information retrieval." Journal of the American Society for Information Science and Technology, 60(5): 863-876.
24. Tanaka, S., Iwadate, Y. and Inokuchi, S. 2000. "An attractiveness evaluation model based on the physical features of image regions." Pattern Recognition, 2: 793-796.
25. Wang, S. and Wang, X. 2005. "Emotion semantics image retrieval: an brief overview." Lecture Notes in Computer Science, 3784: 490-497.
26. Wild, B., Erb, M. and Bartels, M. 2001. "Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: Quality, quantity, time course and gender differences." Psychiatry Research, 102: 109-124.
27. Yoon, J. 2010. "Utilizing quantitative users' reactions to represent affective meanings of an image." Journal of the American Society for Information Science and Technology, 61(7): 1345-1359.
28. Yoon, J. 2011. "A comparative study of methods to explore searchers' affective perceptions of images." Information Research, 16(2): paper 475. [online] <http://informationr.net/ir/16-2/paper475.html>
