
Vol. 34, No. 2

Abstract

Emotion words are often categorized into emotion-label words, which refer to specific emotional states (e.g., happy, sad), and emotion-laden words, which do not directly name emotional states but embed them (e.g., success, fail). This study aimed to investigate whether emotion words belonging to these different categories are processed differently. Participants performed a lexical decision task on emotion-label and emotion-laden words in Experiment 1 and a valence decision task on the same words in Experiment 2. A series of linear mixed-effects models was fit to response latencies (RTs) while lexical frequency, concreteness, and emotionality of the words were controlled for. The models showed that emotion-laden words were processed more slowly than emotion-label words, and that positive emotion words were processed faster than negative emotion words regardless of word category. The slower RTs to emotion-laden words imply that the valence of these words is constructed through emotional experience, requiring additional information processing. We argue that the information processing of emotion-laden words differs from that of emotion-label words, independent of the concreteness and emotionality of the words.
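
A minimal sketch of how such a mixed-effects RT analysis could look, using statsmodels in Python. The column names (rt, word_type, valence, frequency, concreteness, emotionality, subject) and the synthetic data are illustrative assumptions, not the study's actual variables or pipeline.

# Sketch of a linear mixed-effects analysis of lexical decision RTs,
# assuming a long-format trial table (column names are illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
trials = pd.DataFrame({
    "rt": rng.normal(600, 80, n),                      # response latency in ms (synthetic)
    "word_type": rng.choice(["label", "laden"], n),    # emotion-label vs. emotion-laden
    "valence": rng.choice(["positive", "negative"], n),
    "frequency": rng.normal(0, 1, n),                  # standardized covariates
    "concreteness": rng.normal(0, 1, n),
    "emotionality": rng.normal(0, 1, n),
    "subject": rng.integers(1, 21, n),
})

# Fixed effects: word category and valence, with frequency, concreteness, and
# emotionality as covariates; random intercepts for subjects.
model = smf.mixedlm(
    "rt ~ word_type * valence + frequency + concreteness + emotionality",
    data=trials,
    groups="subject",
)
print(model.fit().summary())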

Abstract

Trait impulsivity is a psychological construct that explains behavioral problems in children and adolescents, including lack of attention, personality disorders, and substance use disorders. While the neurobiological mechanisms of adolescents’ impulsivity measured with behavioral laboratory tasks have been revealed using functional brain imaging techniques, little is known about whether trait impulsivity measured with self-report scales reflects a relevant neurobiological entity. In this study, the multi-dimensional trait impulsivity scale (UPPS-P) and resting-state fMRI were acquired from 63 typically developing adolescents. We used canonical correlation analysis (CCA) to examine whether trait impulsivity reflects the connectivity of the impulsivity network, targeting brain regions identified with the Neurosynth meta-analysis system. We found that a linear combination of UPPS-P item-level responses was associated with a linear combination of impulsivity-relevant brain connectivities for the factor items of negative urgency, sensation seeking, and positive urgency. In contrast, brain connectivities that were not based on the meta-analysis of impulsivity did not show a meaningful canonical correlation. These results suggest that the specific questionnaire items that compose the trait impulsivity scale reflect individual differences in adolescents’ neural circuitry of impulsiveness.
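
A minimal sketch of a canonical correlation analysis between questionnaire items and connectivity features, using scikit-learn. The array shapes, feature counts, and variable names are illustrative assumptions, not the study's actual data.

# Sketch: canonical correlation between UPPS-P item scores and
# impulsivity-network connectivity strengths (one row per adolescent).
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
uppsp_items = rng.normal(size=(63, 20))      # 63 participants x 20 items (illustrative)
connectivity = rng.normal(size=(63, 45))     # 63 participants x 45 edge strengths (illustrative)

cca = CCA(n_components=3)
x_scores, y_scores = cca.fit_transform(uppsp_items, connectivity)

# First canonical correlation: correlation between the first pair of canonical variates.
r1 = np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1]
print(f"first canonical correlation: {r1:.3f}")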

Abstract

This study aimed to explore the characteristics of cognitive control deficits in adult ADHD from the perspective of the Dual Mechanisms of Control (DMC) framework, which posits that cognitive control operates via two distinct modes, i.e., proactive and reactive control. An AX-CPT including nogo trials was used to tap cognitive control processes in a sample of 27 adults with ADHD, 17 adults with depression, and 29 healthy adults. The performance of adults with ADHD was compared with that of adults with depression as well as healthy adults to investigate the effects of depression, which is a common comorbid condition in adult ADHD. The AX-CPT is a continuous performance test consisting of four types of cue-probe pairs (AX, AY, BX, BY) and requires participants to respond to a probe based on the preceding cue. AX pairs are target pairs requiring a target response, and the other pairs are non-target pairs requiring a non-target response. Proactive control demands were manipulated with the proportion of AX trials (AX-40 vs. AX-70), and reactive control demands were manipulated with nogo trials (base vs. nogo). The order of the two AX-proportion conditions was counterbalanced, and the base and nogo conditions were administered in a fixed order within each AX-proportion condition. The adult ADHD group showed lower accuracy than the depression group and the healthy control group on BX trials, which require proactive inhibitory control. Both the ADHD group and the depression group performed worse than the healthy control group on nogo trials, which require reactive inhibitory control. These results suggest that adults with ADHD have a deficit in overall cognitive control, including proactive and reactive control, and that the reactive control deficit in adult ADHD may be associated with comorbid depression as well as with ADHD itself.
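
A minimal sketch of how the cue-probe trial structure could be generated for the two AX-proportion conditions. The trial counts and the even split of non-AX trials across AY, BX, and BY are assumptions for illustration, not the study's protocol.

# Sketch: build an AX-CPT trial list for a given AX proportion (e.g., 0.4 or 0.7).
# Non-AX trials are split evenly across AY, BX, and BY pairs (an assumption).
import random

def make_axcpt_trials(n_trials=100, ax_proportion=0.7, seed=1):
    n_ax = round(n_trials * ax_proportion)
    n_other = n_trials - n_ax
    trials = ["AX"] * n_ax
    for i, pair in enumerate(["AY", "BX", "BY"]):
        trials += [pair] * (n_other // 3 + (1 if i < n_other % 3 else 0))
    random.Random(seed).shuffle(trials)
    return trials

trials = make_axcpt_trials(ax_proportion=0.4)
print({pair: trials.count(pair) for pair in ("AX", "AY", "BX", "BY")})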

Taehoon Kim (Department of Psychology, Yonsei University); Do-Joon Yi (Department of Psychology, Yonsei University) pp.83-98 https://doi.org/10.22172/cogbio.2022.34.2.004
Abstract

The reactivation of memories can transiently render them vulnerable to interference and open to updating with newly learned information. Recent evidence implicates prediction error as necessary to trigger such reconsolidation processes. However, it is unknown how the level of prediction error relates to memory updating. Using a three-day object learning paradigm, we tested the updating of memories as a function of prediction error during reactivation. On Day 1, participants learned a first list of objects divided into a few subsets. On Day 2, the experimental group was reminded of the first list and then learned a second list, while the control group went through the reverse order. Notably, when the memories of the first list were reactivated, different levels of prediction error occurred for the subsets. On Day 3, the experimental group was more likely to misattribute objects from the second list as being from the first list, and this effect was most prominent at a moderate level of prediction error. No such pattern was observed for the control group. These results indicate that a moderate prediction error is required for memory updating and that a new memory may be formed when the prediction error is too large. The current study suggests that prediction error is a necessary but not sufficient condition for memory updating.

Abstract

Valid cues reduce the time required to detect targets, an effect known as the cueing effect. This study manipulated the validity of central and peripheral cues across six levels (25%, 40%, 55%, 70%, 85%, and 100%) and measured target detection time. In both cue conditions, overall response time (RT) decreased non-linearly as cue validity increased and tended to be longer than a linear prediction at the corresponding validity level. Valid trials also showed a similar nonlinear decrease in RT as cue validity increased, but the cueing effect increased linearly. Further analysis indicated that the six validity levels could be grouped into three sections: a 25-40% section, a 55-85% section, and the 100% level. In summary, RTs on valid trials did not track cue validity levels sensitively, and more strategic factors may be involved in utilizing cue validity information.
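
A minimal sketch of comparing mean valid-trial RTs at each validity level against a straight-line fit over validity, which is one way to see where observed RTs exceed a linear prediction. The RT values are placeholders, not the reported data.

# Sketch: contrast observed valid-trial RTs with a linear fit over cue validity.
import numpy as np

validity = np.array([0.25, 0.40, 0.55, 0.70, 0.85, 1.00])
mean_valid_rt = np.array([410, 405, 380, 376, 372, 340])  # placeholder means (ms)

slope, intercept = np.polyfit(validity, mean_valid_rt, deg=1)
linear_prediction = slope * validity + intercept

for v, obs, pred in zip(validity, mean_valid_rt, linear_prediction):
    print(f"validity {v:.0%}: observed {obs} ms, linear fit {pred:.0f} ms, diff {obs - pred:+.0f} ms")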

Abstract

Contextual effects on emotion recognition have been well documented with various contextual stimuli. However, previous studies have mostly focused on facial expressions. Although facial expressions are an important cue in emotional communication, bodily expressions are also a fundamental source of emotional information, especially when faces are hard to recognize. The present study examined contextual effects on the emotional processing of bodily expressions. Given that bodily expressions are inseparable from facial expressions, we aimed to examine the contextual effects of faces on bodily expressions. As bodily expressions, we used point-light biological motion (BM) stimuli, which depict human movement and convey emotional information. In an affective priming paradigm, emotionally neutral BM stimuli were presented as targets following a prime that was either a happy, angry, or neutral face. To compare with the emotional processing of faces, the same experiment was additionally conducted with neutral face targets. Participants were asked to rate the emotional valence of the targets. The results showed that both the BM and face targets were affected by priming. Notably, the effects were much greater on BM than on face targets, indicating that BM was more influenced by contextual emotion. This suggests that BM perception may require greater integration of emotional contextual information, possibly because BM carries less emotional information of its own than faces do. In summary, the present study newly revealed an affective contextual effect of facial expressions on BM and identified a distinct property of the emotional valence conveyed by BM compared with facial expressions.

Abstract

In recent years, intersubject correlation (ISC) has come to be regarded in neuroscience as a suitable method for taking individual differences into account. In this study, we measured individuals' affective responses to distinct ASMR stimuli and investigated whether intersubject correlation values differed between experimental conditions. We used Kim and Kim (2020)'s data on participants' affective responses to ASMR stimuli. By computing the ISC matrix, we found consistency of affective responses across subjects: participants tended to respond similarly when they watched emotionally similar stimuli. For follow-up analyses, the elements of the matrix were partitioned into three data sets, consisting of same-stimulus pairs, different-stimulus pairs within the same emotion, and different-stimulus pairs across emotions. These data were analyzed with a two-way repeated measures ANOVA to test the effects of affective valence and sensory modality on ISCs. Results showed that when ASMR was positive or audiovisual, participants tended to respond more similarly to the stimuli. Subjects did not respond similarly in the audiovisual condition when they watched negative ASMR stimuli: although the negative scenes semantically corresponded to the negative auditory stimuli, the subjects' responses were not consistent.
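
A minimal sketch of computing a subject-by-subject ISC matrix from continuous rating time courses for one stimulus. The array shapes and variable names are assumptions; the original study's preprocessing and pair partitioning are not reproduced here.

# Sketch: intersubject correlation (ISC) matrix for one ASMR clip.
# ratings has shape (n_subjects, n_timepoints): each row is one subject's
# continuous affective rating over time (illustrative random data here).
import numpy as np

rng = np.random.default_rng(0)
ratings = rng.normal(size=(20, 300))        # 20 subjects x 300 time points

isc_matrix = np.corrcoef(ratings)           # pairwise Pearson correlations, shape (20, 20)

# Mean ISC: average of the off-diagonal elements (each subject vs. every other subject).
off_diagonal = isc_matrix[~np.eye(len(isc_matrix), dtype=bool)]
print(f"mean ISC: {off_diagonal.mean():.3f}")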

Abstract

Emotion words or emotional faces presented before a person's facial expression can serve as contextual information for identifying that person's emotion. Using a psychophysical method, we compared the context modulation effects of emotion words and emotional faces on emotion judgments of target faces. We presented emotion words (“happiness”, “anger”) or emotional faces (a typical happy or angry face) as context, and participants then performed a two-alternative forced choice (2-AFC) task to judge the emotion of a target face whose expression was morphed gradually from happiness to anger. Across two experiments with different context presentation times (Experiment 1: 200 ms, Experiment 2: 1500 ms), the emotion word context induced an assimilative context modulation effect, shifting the emotion judgment threshold in the direction consistent with the context emotion regardless of presentation time. In contrast, the facial expression context induced a contrastive context modulation effect, shifting the judgment threshold in the direction opposite to the contextual emotion. The context modulation effects of the happy and angry expressions, however, depended on presentation time: when the context duration was short (200 ms), only the contrastive effect of the happy face was observed, whereas when the context duration was sufficiently long (1500 ms), contrastive effects of both the happy and angry faces were observed. These results imply that emotion words as context activate emotional concepts, whereas facial expressions activate structural information, which can lead to different contextual effects on the emotional perception of subsequent facial expressions.
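
A minimal sketch of estimating the emotion judgment threshold (point of subjective equality, PSE) from 2-AFC responses on a happy-to-angry morph continuum by fitting a logistic psychometric function; a shift of the PSE between context conditions would index the context modulation effect. The data and parameter names are illustrative, not the study's results.

# Sketch: fit a logistic psychometric function to the proportion of "angry"
# responses at each morph level and read off the 50% threshold (PSE).
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

morph_level = np.linspace(0, 1, 7)                               # 0 = happy, 1 = angry
p_angry = np.array([0.02, 0.05, 0.20, 0.55, 0.80, 0.95, 0.99])   # placeholder proportions

(pse, slope), _ = curve_fit(logistic, morph_level, p_angry, p0=[0.5, 10.0])
print(f"estimated PSE: {pse:.2f}, slope: {slope:.1f}")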

The Korean Journal of Cognitive and Biological Psychology