Research on Affective Brain-Computer Interaction
Published: 2021-06-24 11:18
Emotions play an important role in everyday human communication. Beyond logical intelligence, emotional intelligence is also regarded as an essential component of human intelligence. Emotional intelligence refers to a machine's ability to perceive, understand, and regulate human emotions. However, existing human-machine interaction systems still lack it. Research on affective brain-computer interaction aims to establish an emotional communication channel between humans and machines by building affective computing models. In this thesis, we investigate the theoretical foundations, models, algorithms, implementation techniques, experimental validation, and prototype applications of affective brain-computer interaction. The main work covers three aspects. 1) Using EEG, EOG, and eye-movement signals together with deep neural networks, we built multimodal systems for emotion recognition and vigilance estimation. Compared with conventional shallow models, deep neural networks effectively improve recognition performance and reveal the critical frequency bands and brain regions for emotion recognition, yielding electrode configurations with fewer electrodes for practical applications. Through repeated experiments across subjects and over time, we identified stable neural patterns for three emotions (happy, sad, and neutral): happiness shows stronger beta- and gamma-band EEG responses over the temporal lobes; the neural patterns of the neutral and sad emotions are relatively similar, with the neutral emotion showing stronger alpha-band responses over the parietal and occipital regions, while sadness shows stronger delta-band responses over the parietal and occipital regions and stronger gamma-band responses over the prefrontal region. 2) We proposed using EEG and eye-movement sig...
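The band-wise findings above rest on extracting per-channel, per-band features from EEG segments. As a minimal illustration (not the thesis's exact pipeline), the sketch below computes differential entropy features over the standard frequency bands; differential entropy is a feature commonly used with the SEED dataset, and for a Gaussian band-filtered signal it reduces to a function of the band variance. Band edges and filter order here are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative band definitions (Hz); the thesis analyzes delta through gamma bands.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy(signal, fs, band):
    """Differential entropy of one band-filtered EEG channel.

    For a zero-mean Gaussian signal with variance sigma^2,
    DE = 0.5 * log(2 * pi * e * sigma^2).
    """
    low, high = band
    # 4th-order Butterworth bandpass, zero-phase filtering.
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))

def extract_features(eeg, fs):
    """eeg: (n_channels, n_samples) array -> flat (n_channels * n_bands,) vector."""
    feats = [differential_entropy(ch, fs, band)
             for ch in eeg for band in BANDS.values()]
    return np.array(feats)

# Example: a 62-channel, 1-second segment at 200 Hz (a SEED-like shape).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((62, 200))
features = extract_features(eeg, fs=200)
print(features.shape)  # (310,)
```

The resulting 310-dimensional vectors (62 channels x 5 bands) are what a classifier such as a deep belief network would consume; inspecting which band/channel features carry the most discriminative weight is one route to the critical-band and electrode-reduction analyses described above.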
【Source】: Shanghai Jiao Tong University, Shanghai (Project 211 and Project 985 institution, directly administered by the Ministry of Education)
【Pages】: 226
【Degree】: Doctoral
【Table of Contents】:
Abstract (in Chinese)
ABSTRACT
Chapter 1 Introduction
1.1 Motivation
1.2 Contributions
1.3 Thesis Overview
Chapter 2 Research Background
2.1 Emotion Definition and Emotion Models
2.1.1 Discrete Model
2.1.2 Continuous Model
2.2 Brain Mechanism of Emotion
2.3 Electroencephalography
2.4 Emotion Elicitation and Emotion Experiment
2.5 Emotion Recognition
2.5.1 EEG-based Emotion Recognition
2.5.2 Multimodal Emotion Recognition
2.5.3 Public Emotion EEG Datasets
2.6 Driving Fatigue and Vigilance Estimation
2.7 Summary
Chapter 3 Experimental Setups
3.1 SJTU Emotion EEG Dataset (SEED) for Three Emotions
3.1.1 Emotion Stimuli
3.1.2 Subjects
3.1.3 Experiment Protocol
3.2 SJTU Emotion EEG Dataset (SEED-IV) for Four Emotions
3.2.1 Emotion Stimuli
3.2.2 Subjects
3.2.3 Experiment Protocol
3.3 Data Processing for Multimodal Emotion Recognition
3.3.1 Feature Extraction for EEG
3.3.2 Feature Smoothing for EEG
3.3.3 Dimensionality Reduction for EEG
3.3.4 Feature Extraction for Eye Movements
3.4 Multimodal Vigilance Estimation Dataset (SEED-VIG)
3.4.1 Experimental Setup
3.4.2 Vigilance Annotation
3.5 Wearable Device for Vigilance Estimation
3.5.1 Flexible Dry Electrodes
3.5.2 EOG Acquisition Board
3.5.3 Laboratory Driving Simulations
3.5.4 Real-World Driving Experiments
3.6 Summary
Chapter 4 EEG-based Emotion Recognition
4.1 EEG-based Emotion Classification Using Deep Neural Networks
4.1.1 Introduction
4.1.2 Deep Belief Networks
4.1.3 Classifier Training
4.1.4 Classification Performance
4.1.5 Critical Frequency Bands and Channels
4.1.6 Electrode Reduction
4.2 Stable EEG Patterns over Time for Emotion Recognition
4.2.1 Introduction
4.2.2 Discriminative Graph Regularized Extreme Learning Machine
4.2.3 Experiment Results on DEAP Data
4.2.4 Experiment Results on SEED Data
4.2.5 Neural Signatures and Stable Patterns
4.2.6 Stability of the Emotion Recognition Model over Time
4.3 Summary
Chapter 5 Multimodal Emotion Recognition with EEG and Eye Movements
5.1 Introduction
5.2 Multimodal Deep Learning
5.3 Modality Fusion Methods
5.4 Experimental Results on SEED for Three Emotions
5.4.1 Eye Movement-Based Emotion Recognition
5.4.2 Performance of Modality Fusion
5.4.3 Analysis of Complementary Characteristics
5.5 Experimental Results on SEED-IV for Four Emotions
5.5.1 EEG-Based Emotion Recognition
5.5.2 Analysis of Modality Fusion and Complementary Characteristics
5.5.3 Analysis of Stability Across Sessions
5.6 Summary
Chapter 6 Personalizing Affective Models with Transfer Learning
6.1 Introduction
6.2 Transfer Learning
6.2.1 Transfer Component Analysis
6.2.2 Kernel Principal Component Analysis
6.2.3 Transductive Parameter Transfer
6.3 Experiment Setup
6.4 Experiment Results
6.5 Heterogeneous Knowledge Transfer From Eye Tracking To EEG
6.5.1 Introduction
6.5.2 Spatiotemporal Scanpath Analysis
6.5.3 Heterogeneous Transfer Learning
6.5.4 Evaluation Details
6.5.5 Experimental Results
6.6 Summary
Chapter 7 Multimodal Vigilance Estimation: From Simulated To Real Scenarios
7.1 Introduction
7.2 Feature Extraction
7.2.1 Preprocessing for Forehead EOG
7.2.2 Feature Extraction for Forehead EOG
7.2.3 Forehead EEG Signal Extraction
7.2.4 Feature Extraction from EEG
7.3 Incorporating Temporal Dependency into Vigilance Estimation
7.4 Evaluation Metrics
7.5 Experimental Results on SEED-VIG
7.5.1 Forehead EOG-based Vigilance Estimation
7.5.2 EEG-based Vigilance Estimation
7.5.3 Modality Fusion with Temporal Dependency
7.5.4 Complementary Characteristics
7.6 Experimental Setups with Wearable Device
7.7 Experimental Results on Wearable Device
7.7.1 Laboratory Driving Simulations
7.7.2 Real-World Driving Experiments
7.8 Discussion
7.9 Summary
Chapter 8 Conclusions and Future Work
8.1 Summary of Contributions
8.2 Future Work
References
Acknowledgements
Publications
Project Participation
List of Patents
Resume
Article ID: 3246997
Article link: http://sikaile.net/kejilunwen/zidonghuakongzhilunwen/3246997.html