Given the recent growth of emotion-based computing environments, emotional access to and use of multimedia resources, including images, deserve attention. This study aims to identify the visual cues that convey emotion in images. To this end, five basic emotions (love, happiness, sadness, fear, and anger) were selected, and twenty participants were interviewed to elicit the visual cues associated with each emotion. A total of 620 visual cues mentioned by the participants were collected from the interviews and coded into five categories and 18 sub-categories. The findings show that facial expressions, actions/behaviors, and syntactic features are significant in perceiving a specific emotion in an image, and that each emotion exhibits distinctive cue characteristics: love is most strongly associated with actions and behaviors, happiness with facial expressions, sadness primarily with actions and behaviors, fear with facial expressions, and anger with syntactic features such as lines, shapes, and sizes. These findings imply that emotional indexing can be effective when content-based features are considered in combination with concept-based features.
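The coding procedure described above — tallying participant-mentioned cues by emotion and cue category, then identifying the dominant category per emotion — can be sketched in a few lines. This is a minimal illustration, not the study's actual analysis; the records below are hypothetical examples, and the category labels merely follow the paper's top-level scheme.

```python
from collections import Counter, defaultdict

# Hypothetical coded interview records: (emotion, cue category).
# Counts are illustrative only, chosen to mirror the reported tendencies.
coded_cues = [
    ("love", "actions/behaviors"), ("love", "actions/behaviors"),
    ("happiness", "facial expressions"), ("happiness", "facial expressions"),
    ("happiness", "actions/behaviors"),
    ("sadness", "actions/behaviors"), ("sadness", "actions/behaviors"),
    ("sadness", "facial expressions"),
    ("fear", "facial expressions"), ("fear", "facial expressions"),
    ("anger", "syntactic features"), ("anger", "syntactic features"),
]

def dominant_categories(records):
    """Return the most frequently mentioned cue category for each emotion."""
    by_emotion = defaultdict(Counter)
    for emotion, category in records:
        by_emotion[emotion][category] += 1
    return {emotion: counts.most_common(1)[0][0]
            for emotion, counts in by_emotion.items()}

print(dominant_categories(coded_cues))
```

With the sample records above, the dominant categories match the pattern reported in the abstract (e.g., anger mapping to syntactic features, happiness to facial expressions).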