Affective Intelligent Car Interfaces with Emotion Recognition

Christine L. Lisetti, Department of Multimedia Communications, Institut Eurecom, Sophia-Antipolis, France, christine.lisetti@eurecom.fr
Fatma Nasoz, School of Computer Science, University of Central Florida, Orlando, FL, fatma@cs.ucf.edu

Abstract

In this paper, we uncover a new potential application for multimedia technologies: affective intelligent car interfaces for enhanced driving safety. We also describe the experiment we conducted in order to map certain physiological signals (galvanic skin response, heart rate, and temperature) to certain driving-related emotions and states (frustration/anger, panic/fear, and boredom/sleepiness). We present the results we obtained and describe how we use them to facilitate more natural human-computer interaction in our multimodal affective car interface for drivers of future cars.

1 Introduction and Motivation

Humans are social beings who emote, and their cognition is affected by their emotions. Emotions influence various cognitive processes in humans, including perception and organization of memory (Bower, 1981), categorization and preference (Zajonc, 1984), goal generation, evaluation, and decision-making (Damasio, 1994), strategic planning (Ledoux, 1992), focus and attention (Derryberry, Ekman, & Chovil, 1991), and learning (Goleman, 1995). Previous studies also suggest that people emote while they are interacting with computers (Reeves, Gross, & Levenson, 1997).

However, interpreting the data with statistical methods and algorithms is what makes it possible to actually map the measurements to specific emotions. Studies have demonstrated that algorithms can be implemented very successfully for recognizing emotions from physiological signals.

Collet et al. (Collet, Vernet-Maury, Delhomme, & Dittmar, 1997) showed neutral and emotionally loaded pictures to participants in order to elicit happiness, surprise, anger, fear, sadness, and disgust. The physiological signals measured were skin conductance (SC), skin potential (SP), skin resistance (SR), skin blood flow (SBF), skin temperature (ST), and instantaneous respiratory frequency (IRF). Statistical comparison of the signals was performed pair-wise, with the 6 emotions forming 15 pairs. Out of these 15 emotion pairs, the electrodermal responses (SR, SC, and SP) distinguished 13 pairs, and the combination of the thermo-circulatory variables (SBF and ST) with respiration distinguished 14 pairs.

Picard et al. (Picard, Healey, & Vyzas, 2001) used personalized imagery and emotionally loaded pictures to elicit happiness, sadness, anger, fear, disgust, surprise, neutrality, platonic love, and romantic love. The physiological signals measured were GSR, heartbeat, respiration, and electrocardiogram. The algorithms used to analyze the data were Sequential Forward Floating Selection (SFFS), Fisher projection, and a hybrid of the two; the best classification performance, 81% overall accuracy, was achieved by the hybrid method.

Healey's research (Healey, 2000) focused on recognizing drivers' stress levels by measuring and analyzing their physiological signals (skin conductance, heart activity, respiration, and muscle activity). During the experiment, participants drove in a parking garage, in a city, and on a highway. Results showed that the drivers' stress could be recognized as rest (i.e., resting in the parking garage), city (i.e., driving in the Boston streets), or highway (i.e., a two-lane merge on the highway) with 96% accuracy.

2.2 Our Preliminary Emotion Elicitation and Recognition Experiments

In our emotion elicitation experiment we used movie clips and difficult mathematical questions to elicit six emotions (sadness, anger, surprise, fear, frustration, and amusement), and a non-invasive wireless wearable computer, the BodyMedia SenseWear Armband (Figure 2), to collect our participants' physiological signals: galvanic skin response, heart rate, and temperature.

Figure 2: BodyMedia SenseWear Armband

Mathematical question
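The pair-wise statistical comparison used by Collet et al. can be illustrated with a small sketch: for every pair of elicited emotions, compare the samples of one physiological feature with a two-sample test statistic and check whether the pair is distinguishable. This is not the authors' code; the emotion names and skin-conductance values below are invented for illustration.

```python
# Illustrative sketch of pair-wise emotion discrimination from one
# physiological feature, in the spirit of Collet et al. (1997).
# All sample values are hypothetical.
from itertools import combinations
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / ((variance(a) / na + variance(b) / nb) ** 0.5)

# Hypothetical skin-conductance samples (microsiemens) per elicited emotion.
samples = {
    "anger":   [4.1, 4.5, 3.9, 4.8],
    "fear":    [5.0, 5.4, 5.1, 5.6],
    "sadness": [2.2, 2.0, 2.5, 2.1],
}

# 3 emotions form 3 pairs; with 6 emotions there would be 15, as in the paper.
for e1, e2 in combinations(samples, 2):
    t = welch_t(samples[e1], samples[e2])
    print(f"{e1} vs {e2}: |t| = {abs(t):.2f}")
```

A large |t| for a pair suggests the feature separates those two emotions; counting how many of the 15 pairs exceed a significance threshold gives figures like the "13 of 15 pairs" reported for the electrodermal responses.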
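The core idea of the experiment, mapping a (galvanic skin response, heart rate, temperature) reading to an emotion label, can be sketched with a minimal nearest-neighbour rule. This is only an illustration under invented numbers, not the classification algorithm used in the paper.

```python
# Minimal illustrative sketch: classify a (GSR, heart rate, temperature)
# feature vector by its nearest labelled prototype. All numbers are
# hypothetical, not measured data from the experiment.
def nearest_emotion(sample, labelled):
    """Return the label of the training vector closest to `sample`."""
    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(labelled, key=lambda lv: dist(sample, lv[0]))[1]

# Hypothetical per-emotion feature prototypes: (GSR uS, heart rate bpm, temp C).
training = [
    ((5.2, 95.0, 33.1), "fear"),
    ((4.4, 88.0, 33.8), "anger"),
    ((1.8, 62.0, 34.5), "boredom"),
]

print(nearest_emotion((5.0, 92.0, 33.3), training))  # prints "fear"
```

In practice the raw armband signals would first be reduced to per-stimulus features (e.g., means over the clip), and a learned classifier would replace the hand-picked prototypes.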