A Neuroelectrophysiological Study of the Interactive Cognition of "Emotional Faces" and "Emotional Body Language"
發(fā)布時間:2018-07-05 11:26
Topic: emotion cognition research + face recognition; Source: Shanghai Jiao Tong University, Master's thesis, 2012
【Abstract】: Human non-verbal emotion cognition relies mainly on the perception of information from multiple channels, including facial expressions, body postures, body movements, voice, and even physiological signals. Emotional body language and facial expressions, as integral parts of the whole body, jointly express an individual's emotional state. With the growth of research on "emotional body language" and a deepening understanding of the neural mechanisms of "face recognition" and "body recognition", research on the interactive cognition of emotional body language and emotional faces has gradually emerged. At present, the findings on the interaction between emotional body language and facial emotion concentrate on behavioral results and the characteristics of the early ERP component P1, and the underlying neuroelectrophysiological mechanism remains unclear.
Therefore, this study designed an emotion cognition experiment and used event-related potential (ERP) and brain topographic map analysis to explore the neuroelectrophysiological mechanism of the interactive cognition of facial and bodily emotion. The research content includes:
(1) Design of an experiment on the interactive cognition of facial emotion and body emotion.
Four facial expressions (sad, mildly sad, happy, mildly happy) were combined with two body-emotion stimuli (sad, happy) to design an electrophysiological cognition experiment based on a facial emotion recognition task.
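As an illustration of this 4 × 2 factorial pairing, the following minimal Python sketch enumerates the eight face-body compound conditions; the condition labels and the congruence coding are illustrative assumptions, not materials taken from the thesis.

from itertools import product

FACE_EMOTIONS = ["sad", "mildly_sad", "happy", "mildly_happy"]
BODY_EMOTIONS = ["sad", "happy"]

def build_conditions():
    """Enumerate the eight face-body compound stimulus conditions."""
    conditions = []
    for face, body in product(FACE_EMOTIONS, BODY_EMOTIONS):
        conditions.append({
            "face": face,
            "body": body,
            # a compound counts as congruent when face and body share the same valence
            "congruent": face.endswith(body),
        })
    return conditions

for cond in build_conditions():
    print(cond)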
(2) Behavioral verification of the interaction between facial expressions and body emotions.
The influence of emotional body language on facial expression recognition is verified from behavioral results such as rating scores and reaction times.
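A minimal sketch of how such a behavioral comparison could be run in Python, assuming one mean rating and one mean reaction time per subject and body-emotion condition; the data values and the paired t-test are illustrative assumptions, since the thesis does not report its statistical software.

import numpy as np
from scipy import stats

# hypothetical per-subject means for the same faces paired with happy vs. sad bodies
ratings_happy_body = np.array([6.1, 5.8, 6.4, 5.9, 6.2])   # rating scale values
ratings_sad_body = np.array([4.9, 5.1, 4.7, 5.3, 5.0])
rt_happy_body = np.array([612.0, 645.0, 598.0, 630.0, 621.0])  # reaction times in ms
rt_sad_body = np.array([655.0, 670.0, 640.0, 662.0, 649.0])

t_rating, p_rating = stats.ttest_rel(ratings_happy_body, ratings_sad_body)
t_rt, p_rt = stats.ttest_rel(rt_happy_body, rt_sad_body)
print(f"ratings: t = {t_rating:.2f}, p = {p_rating:.3f}")
print(f"reaction times: t = {t_rt:.2f}, p = {p_rt:.3f}")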
(3) Exploration of the neurophysiological mechanism of the interaction between facial expressions and body emotions.
By analyzing early (N1, N170) and late (P2, LPC) ERP components, together with topographic maps of ERP difference waves, the influence of different body language on the cognition of the same facial emotion is compared, and the neural mechanism of this interaction is explained.
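The thesis does not state which analysis toolbox was used; purely as a sketch of the epoching-and-averaging step that such an ERP analysis requires, the MNE-Python code below builds one averaged waveform per face-body compound condition. The file name, event codes, and filter settings are illustrative assumptions.

import mne

raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # hypothetical recording
raw.filter(0.1, 30.0)  # typical ERP band-pass; not specified in the thesis

events = mne.find_events(raw)
event_id = {  # hypothetical trigger codes for a subset of the face-body compounds
    "happy_face/happy_body": 1, "happy_face/sad_body": 2,
    "sad_face/happy_body": 3, "sad_face/sad_body": 4,
}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

# one evoked (averaged ERP) per compound condition
evokeds = {name: epochs[name].average() for name in event_id}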
The conclusions of this study include:
(1) The behavioral results show that when an emotional face was paired with a happy body posture, subjects' ratings were higher and their judgments were biased toward "happy", whereas when it was paired with a sad body posture the judgments were biased toward "sad", demonstrating that the presence of body language influences the judgment of facial emotion. Moreover, the influence of body emotion on facial-emotion judgment depends on the intensity of the facial emotion: the stronger the emotion, the smaller the interaction effect; the more ambiguous the emotion, the more pronounced the interaction effect.
(2) The N170 results show that subjects were more sensitive to happy body language: when paired with a happy body posture, emotional faces evoked an N170 component with larger amplitude and shorter latency. This study raises the possibility that the influence of body language on emotional face recognition is related to the N170, a component specific to structural encoding.
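As an illustration of how an N170 amplitude and latency comparison of this kind could be quantified, the sketch below measures the most negative peak at one occipito-temporal electrode; the electrode name, time window, and use of MNE-Python are assumptions rather than details reported in the thesis.

def n170_peak(evoked, channel="P8", tmin=0.13, tmax=0.20):
    """Latency (s) and amplitude (V) of the most negative deflection in the N170 window."""
    _, latency, amplitude = evoked.copy().pick([channel]).get_peak(
        tmin=tmin, tmax=tmax, mode="neg", return_amplitude=True)
    return latency, amplitude

# usage with the hypothetical `evokeds` dictionary from the epoching sketch:
# lat_happy, amp_happy = n170_peak(evokeds["sad_face/happy_body"])
# lat_sad, amp_sad = n170_peak(evokeds["sad_face/sad_body"])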
(3) The P2 results show that, compared with a sad body posture, facial expressions paired with a happy body posture evoked a P2 component with significantly larger amplitude and shorter latency, indicating that P2 is a marker component of the interaction between body and facial emotion.
(4) The difference-wave topographic maps show that the interaction between body emotion and facial emotion begins within a 100-300 ms time window over the fronto-central region. This conclusion is consistent with our expectations and with other published findings.
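A hedged sketch of how such a difference-wave topography could be produced, again assuming MNE-Python and the hypothetical condition names used in the earlier sketches:

import mne

def body_emotion_difference(evoked_happy_body, evoked_sad_body):
    """Difference wave: happy-body pairing minus sad-body pairing for the same faces."""
    return mne.combine_evoked([evoked_happy_body, evoked_sad_body], weights=[1, -1])

# usage with the hypothetical `evokeds` dictionary from the epoching sketch:
# diff = body_emotion_difference(evokeds["sad_face/happy_body"],
#                                evokeds["sad_face/sad_body"])
# diff.plot_topomap(times=[0.2], average=0.2)  # one map averaged over roughly 100-300 ms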
(5) In the late part of the ERP waveform (300-800 ms), we observed an emotion congruence/incongruence effect: the two "incongruent" compound stimuli (sad face with happy body, happy face with sad body) elicited stronger whole-brain activity than the "congruent" compounds (sad with sad, happy with happy). This result partially confirms previous findings.
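A minimal sketch of how this late-window effect could be quantified as a mean amplitude in the 300-800 ms range; the window, the all-channel average, and the condition names are illustrative assumptions:

import numpy as np

def late_window_mean(evoked, tmin=0.3, tmax=0.8):
    """Mean amplitude (V) over all channels in the late time window."""
    cropped = evoked.copy().crop(tmin=tmin, tmax=tmax)
    return float(np.mean(cropped.data))

# usage with the hypothetical `evokeds` dictionary from the epoching sketch:
# congruent = [evokeds["happy_face/happy_body"], evokeds["sad_face/sad_body"]]
# incongruent = [evokeds["happy_face/sad_body"], evokeds["sad_face/happy_body"]]
# print(np.mean([late_window_mean(e) for e in congruent]),
#       np.mean([late_window_mean(e) for e in incongruent]))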
This study is an exploration of bimodal face-body emotion cognition and, building on previous findings, provides new neuroelectrophysiological evidence.
【學(xué)位授予單位】:上海交通大學(xué)
【學(xué)位級別】:碩士
【學(xué)位授予年份】:2012
【分類號】:R338
Article ID: 2100055
Link: http://sikaile.net/xiyixuelunwen/2100055.html