
The Elaboration of Emoticon Features and the Underlying Mechanism of Facial Expression Perception

The Korean Journal of Cognitive and Biological Psychology, (P)1226-9654; (E)2733-466X
2019, v.31 no.3, pp.223-229
https://doi.org/10.22172/cogbio.2019.31.3.003

Abstract

Facial expressions are visual information that reflects our inner states and plays an important role in how people communicate with others. Recently, facial expressions have been actively used through emoticons in online communication such as chatting on social media. The current study investigated the underlying mechanism of the perception of facial expressions in emoticons. There are two hypotheses regarding the processing of facial expressions. First, the feature-based processing hypothesis suggests that facial expressions are perceived based on information gathered from individual facial features such as the eyes, nose, and mouth, each of which is processed independently. Second, according to the holistic processing hypothesis, facial expressions are perceived through configural information such as the distance between the eyes. For the perception of facial expressions in real human faces, most studies have supported the holistic processing hypothesis. However, the results of studies on facial expressions in emoticons are inconsistent. The purpose of the current study is to explore whether the underlying mechanism of facial expression perception in emoticons is influenced by the elaboration level, that is, how much detail is used in depicting the facial features. In the experiment, we employed two types of emoticons to manipulate the elaboration level: simple and elaborated. In a simple emoticon, all facial features are represented with a few line segments, whereas in an elaborated emoticon, the features are depicted in detail at the level of a drawing. Participants performed an emotion recognition task in which a human face or an emoticon bearing one of five basic facial expressions (anger, fear, happiness, sadness, and surprise) was presented in an aligned or misaligned state.
As a result, for the elaborated emoticons the accuracy of facial expression recognition was higher in the aligned condition than in the misaligned condition, whereas for the simple emoticons there was no significant difference between the two conditions. This suggests that the more elaborately the features of an emoticon are depicted, the stronger the effect of holistic processing.
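The pattern reported above, an alignment effect for elaborated but not for simple emoticons, amounts to comparing per-condition recognition accuracy. A minimal sketch of how such an alignment effect could be computed (this is not the authors' analysis code; the function, trial records, and all numbers are hypothetical, chosen only to mirror the reported pattern):

```python
# Hypothetical sketch: quantify the alignment effect (aligned minus
# misaligned accuracy) per emoticon type from raw trial records.
from collections import defaultdict

def alignment_effect(trials):
    """trials: iterable of (emoticon_type, alignment, correct) tuples.
    Returns {emoticon_type: aligned accuracy - misaligned accuracy}."""
    hits = defaultdict(int)    # correct responses per (type, alignment) cell
    counts = defaultdict(int)  # total trials per (type, alignment) cell
    for etype, alignment, correct in trials:
        counts[(etype, alignment)] += 1
        hits[(etype, alignment)] += int(correct)
    acc = {cell: hits[cell] / counts[cell] for cell in counts}
    return {
        etype: acc[(etype, "aligned")] - acc[(etype, "misaligned")]
        for etype in {cell[0] for cell in counts}
    }

# Illustrative made-up data: an alignment advantage for elaborated
# emoticons (0.9 vs. 0.6), none for simple ones (0.7 vs. 0.7).
trials = (
    [("elaborated", "aligned", True)] * 9 + [("elaborated", "aligned", False)] * 1
    + [("elaborated", "misaligned", True)] * 6 + [("elaborated", "misaligned", False)] * 4
    + [("simple", "aligned", True)] * 7 + [("simple", "aligned", False)] * 3
    + [("simple", "misaligned", True)] * 7 + [("simple", "misaligned", False)] * 3
)
effects = alignment_effect(trials)
# effects["elaborated"] is ~0.3; effects["simple"] is ~0.0
```

A positive effect for one emoticon type alongside a near-zero effect for the other is the signature of holistic processing under this design.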

keywords
emoticon, perception of facial expression, holistic processing, feature-based processing

Reference

1.

김선진 (2014). A comparative study on the characteristics of emoticons in mobile messengers. Journal of Digital Design, 14(1), 87-96.

2.

정찬섭, 오경자, 이일병, 변혜란 (1998). Emotional interface: Models for the recognition and synthesis of facial expressions. Proceedings of the 1998 Korean Psychological Association Winter Research Seminar, 121-160.
4.

Bombari, D., Schmid, P. C., Schmid Mast, M., Birri, S., Mast, F. W., & Lobmaier, J. S. (2013). Emotion recognition: The role of featural and configural face information. The Quarterly Journal of Experimental Psychology, 66(12), 2426-2442.

5.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10(4), 433-436.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10(4), 437-442.

6.

Piepers, D., & Robbins, R. (2012). A review and clarification of the terms “holistic,” “configural,” and “relational” in the face perception literature. Frontiers in Psychology, 3, 559.

7.

Prazak, E. R., & Burgund, E. D. (2014). Keeping it real: Recognizing expressions in real compared to schematic faces. Visual Cognition, 22(5), 737-750.

8.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6(6), 624-631.

9.

Yin, R. K. (1969). Looking at upside-down faces. Journal of Experimental Psychology, 81(1), 141-145.

10.

Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 527-551.

11.

Ekman, P. (1982). Methods for measuring facial action. In K. R. Scherer & P. Ekman (Eds.), Handbook of methods in nonverbal behavior research. Cambridge: Cambridge University Press.

12.

Fallshore, M., & Bartholow, J. (2003). Recognition of emotion from inverted schematic drawings of faces. Perceptual and Motor Skills, 96, 236-244.

13.

Koffka, K. (1935). Principles of Gestalt psychology. London: Kegan Paul, Trench & Trubner.

14.

Leppänen, J. M., & Hietanen, J. K. (2004). Positive facial expressions are recognized faster than negative facial expressions, but why? Psychological Research, 69, 22-29.

15.

Lipp, O. V., Price, S. M., & Tellegen, C. L. (2009). No effect of inversion on attentional and affective processing of facial expressions. Emotion, 9(2), 248.

16.

Marković, S., & Gvozdenović, V. (2001). Symmetry, complexity and perceptual economy: Effects of minimum and maximum simplicity conditions. Visual Cognition, 8(3-5), 305-327.

17.

McKelvie, S. J. (1995). Emotional expression in upside‐down faces: Evidence for configurational and componential processing. British Journal of Social Psychology, 34(3), 325-334.

18.

Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: a threat advantage with schematic stimuli. Journal of Personality and Social Psychology, 80(3), 381.
