
Configural Information Processing of Face Emoticon: Category-Level Repetition Suppression Effects

한국심리학회지: 인지 및 생물 / The Korean Journal of Cognitive and Biological Psychology, (P)1226-9654; (E)2733-466X
2020, v.32 no.1, pp.111-124
https://doi.org/10.22172/cogbio.2020.32.1.008
Taejin Park (Chonnam National University)
Donghyeok Choi (Chonnam National University)
Jeonghee Kim (Chonnam National University)


Abstract

Although face emoticons have become a commonplace part of everyday communication, studies on their perception are still rare. This study compared the configural information processing of face emoticons and real faces (face photos) using event-related potentials (ERPs). Presenting facial stimuli in succession reduces the N170 amplitude relative to presenting them intermixed with stimuli from other categories; this category-level repetition suppression effect is thought to arise because the configural processing common to facial stimuli, which underlies the N170, is suppressed by the repeated presentation of faces (Maurer et al., 2008; Mercure et al., 2011). To examine the configural processing of face emoticons, we investigated the category-level repetition suppression effect by manipulating the stimulus context: face emoticons, face photos, and house icons (non-face objects) were presented either in separate blocks (homogeneous context) or intermixed within the same block (mixed context), and the effects of stimulus context on the N170 and P1 were analyzed. No context effect on P1 amplitude was found, and P1 amplitudes were ordered face photo > face emoticon = house icon, supporting the view that P1 is sensitive to low-level physical properties and is not a face-sensitive ERP component. For face emoticons and face photos, but not house icons, N170 amplitudes were smaller in the homogeneous context than in the mixed context (a category-level repetition suppression effect), suggesting that face emoticon processing, like real-face processing, relies on configural information. In the mixed context, face emoticons elicited N170 amplitudes as large as those to face photos and larger than those to house icons (a face-sensitive N170 effect), whereas in the homogeneous context only face photos, not face emoticons, showed the face-sensitive N170 effect. These findings suggest that real-face processing draws on both configural information and facial component information, whereas face emoticon processing draws on configural information alone. Taken together, configural information plays a major role in the perception of even very simple face emoticons, much as it does for real faces.
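The repetition suppression effect described above reduces to a comparison of mean N170 amplitudes between the homogeneous and mixed contexts within a fixed post-stimulus window. The sketch below is purely illustrative and is not the authors' analysis pipeline: the sampling rate, latency window, and Gaussian toy waveforms are all hypothetical, chosen only to show how such a context difference could be quantified.

```python
import math

# Hypothetical acquisition parameters (not from the study).
SAMPLING_HZ = 500           # samples per second
N170_WINDOW = (0.13, 0.20)  # a typical N170 latency window, in seconds

def mean_amplitude(epoch, window, fs=SAMPLING_HZ):
    """Average voltage (microvolts) of `epoch` within `window` (seconds).
    `epoch` is a list of samples starting at stimulus onset (t = 0)."""
    lo = int(window[0] * fs)
    hi = int(window[1] * fs)
    segment = epoch[lo:hi]
    return sum(segment) / len(segment)

def toy_erp(peak_uv):
    """Toy averaged waveform (one electrode, 0-300 ms): a negative
    deflection peaking at 170 ms with a 20 ms standard deviation."""
    return [peak_uv * math.exp(-((i / SAMPLING_HZ - 0.17) ** 2)
                               / (2 * 0.02 ** 2))
            for i in range(int(0.3 * SAMPLING_HZ))]

mixed = toy_erp(-6.0)        # larger (more negative) N170 in mixed context
homogeneous = toy_erp(-4.0)  # suppressed N170 in homogeneous context

# Suppression index: mixed minus homogeneous mean amplitude.
suppression = (mean_amplitude(mixed, N170_WINDOW)
               - mean_amplitude(homogeneous, N170_WINDOW))
print(round(suppression, 2))  # negative: the mixed-context N170 is deeper
```

In a real pipeline the per-condition epochs would come from artifact-rejected, baseline-corrected EEG averages at occipito-temporal electrodes, but the window-averaging step is the same.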

keywords
face emoticon, configural processing, category-level repetition suppression effect, N170, P1

References

1.

Albonico, A., Furubacke, A., Barton, J. J., & Oruc, I. (2018). Perceptual efficiency and the inversion effect for faces, words and houses. Vision Research, 153, 91-97.

2.

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

3.

Bentin, S., Taylor, M. J., Rousselet, G. A., Itier, R. J., Caldara, R., Schyns, P. G., . . . Rossion, B. (2007). Controlling interstimulus perceptual variance does not abolish N170 face sensitivity. Nature Neuroscience, 10, 801-802.

4.

Boehm, S. G., Dering, B., & Thierry, G. (2011). Category sensitivity in the N170 range: A question of topography and inversion, not one of amplitude. Neuropsychologia, 49, 2082-2089.

5.

Bötzel, K., Schulze, S., & Stodieck, S. R. (1995). Scalp topography and analysis of intracranial sources of face-evoked potentials. Experimental Brain Research, 104, 135-143.

6.

Carmel, D., & Bentin, S. (2002). Domain specificity versus expertise: factors influencing distinct processing of faces. Cognition, 83, 1-29.

7.

Eimer, M. (2000a). Effects of face inversion on the structural encoding and recognition of faces: Evidence from event-related brain potentials. Cognitive Brain Research, 10, 145-158.

8.

Eimer, M. (2000b). The face-specific N170 component reflects late stages in the structural encoding of faces. Neuroreport, 11, 2319-2324.

9.

Eimer, M. (2011). The face-sensitive N170 component of the event-related brain potential. In A. J. Calder, G. Rhodes, M. Johnson, & J. Haxby, (Eds.) The Oxford Handbook of Face Perception (pp. 329-344). Oxford: Oxford University Press.

10.

Fang, F., & He, S. (2005). Viewer-centered object representation in the human visual system revealed by viewpoint aftereffects. Neuron, 45, 793-800.

11.

Gantiva, C., Sotaquira, M., Araujo, A., & Cuervo, P. (2019). Cortical processing of human and Emoji faces: An ERP analysis. Behaviour & Information Technology, 9, 1362-1370.

12.

Gauthier, I., & Tarr, M. J. (1997). Becoming a “Greeble” expert: Exploring mechanisms for face recognition. Vision Research, 37, 1673-1682.

13.

Goffaux, V., Gauthier, I., & Rossion, B. (2003). Spatial scale contribution to early visual differences between face and object processing. Cognitive Brain Research, 16, 416-424.

14.

Grill-Spector, K., Henson, R., & Martin, A. (2006). Repetition and the brain: Neural models of stimulus-specific effects. Trends in Cognitive Sciences, 10, 14-23.

15.

Halit, H., de Haan, M., Schyns, P. G., & Johnson, M. H. (2006). Is high-spatial frequency information used in the early stages of face detection? Brain Research, 1117, 154-161.

16.

Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223-233.

17.

Henson, R. N. (2003). Neuroimaging studies of priming. Progress in Neurobiology, 70, 53-81.

18.

Henson, R. N. A., & Rugg, M. D. (2003). Neural response suppression, haemodynamic repetition effects, and behavioural priming. Neuropsychologia, 41, 263-270.

19.

Henson, R. N., Goshen-Gottstein, Y., Ganel, T., Otten, L. J., Quayle, A., & Rugg, M. D. (2003). Electrophysiological and haemodynamic correlates of face perception, recognition and priming. Cerebral Cortex, 13, 793-805.

20.

Henson, R. N., Mattout, J., Singh, K. D., Barnes, G. R., Hillebrand, A., & Friston, K. (2007). Population-level inferences for distributed MEG source localisation under multiple constraints: Application to face-evoked fields. Neuroimage, 38, 422-438.

21.

Itier, R. J., & Taylor, M. J. (2002). Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: A repetition study using ERPs. Neuroimage, 15, 353-372.

22.

Itier, R. J., & Taylor, M. J. (2004). N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cerebral Cortex, 14, 132-142.

23.

Itier, R. J., Alain, C., Sedore, K., & McIntosh, A. R. (2007). Early face processing specificity: It's in the eyes!. Journal of Cognitive Neuroscience, 19, 1815-1826.

24.

Kanwisher, N., & Yovel, G. (2006). The fusiform face area: A cortical region specialized for the perception of faces. Philosophical Transactions of the Royal Society B: Biological Sciences, 361, 2109-2128.

25.

Kanwisher, N., McDermott, J., & Chun, M. M. (1997). The fusiform face area: A module in human extrastriate cortex specialized for face perception. Journal of Neuroscience, 17, 4302-4311.

26.

Kendall, L. N., Raffaelli, Q., Kingstone, A., & Todd, R. M. (2016). Iconic faces are not real faces: Enhanced emotion detection and altered neural processing as faces become more iconic. Cognitive Research: Principles and Implications, 1, 19.

27.

Kovács, G., Zimmer, M., Volberg, G., Lavric, I., & Rossion, B. (2013). Electrophysiological correlates of visual adaptation and sensory competition. Neuropsychologia, 51, 1488-1496.

28.

Kovács, G., Zimmer, M., Harza, I., & Vidnyánszky, Z. (2007). Adaptation duration affects the spatial selectivity of facial aftereffects. Vision Research, 47, 3141-3149.

29.

Kuefner, D., De Heering, A., Jacques, C., Palmero-Soler, E., & Rossion, B. (2010). Early visually evoked electrophysiological responses over the human brain (P1, N170) show stable patterns of face-sensitivity from 4 years to adulthood. Frontiers in Human Neuroscience, 3, 67.

30.

Leopold, D. A., O'Toole, A. J., Vetter, T., & Blanz, V. (2001). Prototype-referenced shape encoding revealed by high-level aftereffects. Nature Neuroscience, 4, 89-94.

31.

Leopold, D. A., Rhodes, G., Müller, K. M., & Jeffery, L. (2005). The dynamics of visual adaptation to faces. Proceedings of the Royal Society B: Biological Sciences, 272, 897-904.

32.

Maurer, U., Rossion, B., & McCandliss, B. D. (2008). Category specificity in early perception: Face and word N170 responses differ in both lateralization and habituation properties. Frontiers in Human Neuroscience, 2, 18.

33.

Mehrabian, A., & Wiener, M. (1967). Decoding of inconsistent communications. Journal of Personality and Social Psychology, 6, 109-114.

34.

Mercure, E., Cohen Kadosh, K., & Johnson, M. (2011). The N170 shows differential repetition effects for faces, objects and orthographic stimuli. Frontiers in Human Neuroscience, 5, 6.

35.

Nakashima, T., Kaneko, K., Goto, Y., Abe, T., Mitsudo, T., Ogata, K., . . . Tobimatsu, S. (2008). Early ERP components differentially extract facial features: Evidence for spatial frequency-and-contrast detectors. Neuroscience Research, 62, 225-235.

36.

Palumbo, R., Laeng, B., & Tommasi, L. (2013). Gender-specific aftereffects following adaptation to silhouettes of human bodies. Visual Cognition, 21, 1-12.

37.

Park, T., Yang, Y., & Kim, J. (2018). Inversion Effects on Face Emoticon Processing: An ERP Study. The Korean Journal of Cognitive and Biological Psychology, 30, 113-139.

38.

Rhodes, G. (1993). Configural coding, expertise, and the right hemisphere advantage for face recognition. Brain and Cognition, 22, 19-41.

39.

Rhodes, G., Robbins, R., Jaquet, E., McKone, E., Jeffery, L., & Clifford, C. W. G. (2005). Adaptation and face perception: How aftereffects implicate norm-based coding of faces. In C. W. G. Clifford & G. Rhodes (Eds.), Fitting the Mind to the World: Adaptation and After-Effects in High-Level Vision (pp. 213-240). Oxford: Oxford University Press.

40.

Rossion, B., Gauthier, I., Goffaux, V., Tarr, M. J., & Crommelinck, M. (2002). Expertise training with novel objects leads to left-lateralized facelike electrophysiological responses. Psychological Science, 13, 250-257.

41.

Rossion, B., Gauthier, I., Tarr, M. J., Despland, P., Bruyer, R., Linotte, S., & Crommelinck, M. (2000). The N170 occipitotemporal component is delayed and enhanced to inverted faces but not to inverted objects: An electrophysiological account of face-specific processes in the human brain. Neuroreport, 11, 69-72.

42.

Rossion, B., Joyce, C. A., Cottrell, G. W., & Tarr, M. J. (2003). Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. Neuroimage, 20, 1609-1624.

43.

Sagiv, N., & Bentin, S. (2001). Structural encoding of human and schematic faces: Holistic and part-based processes. Journal of Cognitive Neuroscience, 13, 937-951.

44.

Schendan, H. E., Ganis, G., & Kutas, M. (1998). Neurophysiological evidence for visual perceptual categorization of words and faces within 150 ms. Psychophysiology, 35, 240-251.

45.

Tanaka, J. W., & Curran, T. (2001). A neural basis for expert object recognition. Psychological Science, 12, 43-47.

46.

Tovée, M. J. (1998). Face processing: Getting by with a little help from its friends. Current Biology, 8, R317-R320.

47.

Watanabe, S., Kakigi, R., & Puce, A. (2003). The spatiotemporal dynamics of the face inversion effect: A magneto-and electro-encephalographic study. Neuroscience, 116, 879-895.

48.

Webster, M. A. (2011). Adaptation and visual coding. Journal of Vision, 11, 3.

49.

Webster, M. A., & Maclin, O. H. (1999). Figural aftereffects in the perception of faces. Psychonomic Bulletin & Review, 6, 647-653.

50.

Webster, M. A., Kaping, D., Mizokami, Y., & Duhamel, P. (2004). Adaptation to natural facial categories. Nature, 428, 557-561.

51.

Wiggs, C. L., & Martin, A. (1998). Properties and mechanisms of perceptual priming. Current Opinion in Neurobiology, 8, 227-233.
