Multi-modal Affect Induction for Affective Brain-Computer Interfaces

Christian Mühl1, Egon L. van den Broek1,3,4, Anne-Marie Brouwer2,

Femke Nijboer1, Nelleke van Wouwe2, and Dirk Heylen1

1 Human Media Interaction, University of Twente, Enschede, The Netherlands 2 TNO Behavioural and Societal Sciences, Soesterberg, The Netherlands

3 Human-Centered Computing Consultancy, Vienna, Austria 4 Karakter University Center, Radboud University Medical Center Nijmegen,

The Netherlands

{cmuehl,nijboerf,d.k.j.Heylen}@ewi.utwente.nl, {nelleke.vanwouwe,anne-marie.brouwer}@tno.nl,

vandenbroek@acm.org http://hmi.ewi.utwente.nl/

Abstract. Reliable applications of affective brain-computer interfaces (aBCI) in realistic, multi-modal environments require a detailed understanding of the processes involved in emotions. To explore the modality-specific nature of affective responses, we studied the neurophysiological responses (i.e., EEG) of 24 participants during visual, auditory, and audio-visual affect stimulation. The affect induction protocols were validated by participants' subjective ratings and physiological responses (i.e., ECG). Consistent with the literature, we found modality-specific responses in the EEG: posterior alpha power decreases during visual stimulation and increases during auditory stimulation, while anterior alpha power tends to decrease during auditory stimulation and to increase during visual stimulation. We discuss the implications of these results for multi-modal aBCI.

Keywords: affective brain-computer interfaces, emotion, ECG, EEG, visual, auditory, multi-modal.

1 Introduction

Affective computing aims to enrich the interaction with adaptive applications and devices by taking into account information about the user's affective state [21]. To date, several sources of information about the affective state have been successfully exploited, most prominently verbal and non-verbal behavior [27] and physiological signals from the peripheral nervous system [14]. With the advent of BCI, emotion assessment from neurophysiological activity gained prominence. Affective brain-computer interfaces (aBCI) aim at emotion detection through real-time electroencephalographic (EEG) signal processing and classification. Research in aBCI has explored various paradigms and modalities to elicit emotions (e.g., visual, auditory, and tactile, as well as self-induction). Multi-modal emotion elicitation, however, has hardly been explored, although it is most prominent in real life (cf. [5]). To fill this knowledge gap in aBCI, the current study explores the modality-specificity of EEG signals, using visual, auditory, and audio-visual affect induction protocols within a single experiment. We hypothesize that correlates of affect are (partially) modality-specific: they depend on the stimulation modality by which emotions are induced.

S. D'Mello et al. (Eds.): ACII 2011, Part I, LNCS 6974, pp. 235-245, 2011.

Functional neuroimaging studies provide evidence for stimulus-specific cognitive responses to affective stimulation. They show that correlates of affective responses can be found in core affective areas in the limbic system and associated frontal brain regions, but also in modality-specific areas associated with stimulus processing in general [15]. Emotional postures [10] and facial expressions [22] lead to a stronger activation of areas known to be involved in (visual) posture and face processing than their neutral counterparts. Similarly, affect-inducing sounds [8] activate auditory cortical regions more strongly than neutral sounds, and auditorily induced affective states can be classified on the basis of the activations measured within these regions [7]. It should be noted that these responses are similar in nature to purely cognitive responses, as observed during attentional orienting to a specific modality [25].

Most research on EEG responses toward affect uses visual affect induction by pictures. The late positive potential is a hallmark of visual affective correlates that is strongest over posterior cortical sites [11] and can be traced back to the activation of a network of visual cortices [23]. Unfortunately, in the realm of aBCI applications such time-domain ERP correlates of affect are problematic, as they require averaging over many trials defined by clear stimulus onsets, which are not available if affect is to be estimated during real-time interaction. Therefore, we and most other aBCI studies focus on features in the frequency domain, for which studies indeed found correlations during affective manipulations. Among other frequency bands, the power of the alpha band (8-13 Hz), especially over posterior regions, was shown to respond to visual affective manipulation [1,2,9], which is of special interest as it also responds strongly to sensory manipulation in a modality-specific way [13,20].

Less is known about EEG correlates of auditory stimulation, as the activation of auditory areas is harder to assess with EEG than that of visual areas [20]. However, magnetoencephalographic measurements have shown that alpha activity in the superior-temporal cortex correlates with auditory stimulus processing [13] and is supposed to be reflected in fronto-central EEG activity [12]. Consistent with this expectation, auditory-related cognitive processes were associated with differences in the amplitude of anterior and temporal alpha oscillations in the EEG [16]. A link between affective auditory manipulation and alpha band power was suggested by [2,19].

Summarizing, posterior alpha power has been associated with visual processing and with visual affective stimulation, whereas anterior alpha power has been linked to auditory processing and might be associated with affect. In general, alpha decreases during active processing and increases otherwise [20]. To study the differentiation of posterior and anterior alpha power according to visual and auditory affective stimulation, respectively, we induce affective states differing in valence and arousal via visual, auditory, and audio-visual stimuli.

In accordance with the literature, we expect affect-related responses to pictures mainly for the alpha power over posterior cerebral regions. We define a region of interest (ROI) pa, for parietal, comprising electrodes P3, Pz, and P4, and formulate our hypotheses for the expected responses to the modality-specific affective stimulation: In the case of visual affect induction, alpha band power at pa decreases, as expected for increased visual processing. During auditory affect induction, alpha power at pa will increase, as expected for the inhibition of visual processing. During the audio-visual induction, however, a strong decrease of alpha at pa is expected.

Main effects of auditory affective stimulation might be anticipated in the activity over anterior cerebral regions. We define a ROI fc comprising the fronto-central electrodes FC1, Cz, and FC2, and formulate the following hypotheses: In the case of visual affect induction, alpha band power at fc increases, as expected for decreased auditory processing. During auditory affect induction, alpha power at fc will decrease, as expected during increased auditory processing. During the audio-visual induction, a strong decrease of alpha at fc is expected.

To verify whether we manipulated emotional states as expected, we also recorded subjective ratings of the participants' emotional states and electrocardiographic (ECG) signals. We expect a decrease of heart rate during negative and arousing emotions [3,6].

2 Methods

2.1 Participants

Twelve female and twelve male participants (mean age: 27 years, standard deviation: 3.8 years) were recruited from a subject pool of the research group and via advertisements on the university campus. All participants but one were right-handed. Participants received a reimbursement of 6 Euro per hour or the equivalent in course credits.

2.2 Apparatus

The stimuli were presented with "Presentation" software (Neurobehavioral Systems) using a dedicated stimulus PC, which sent markers for stimulus onset and offset to the EEG system (Biosemi ActiveTwo Mk II). The visual stimuli were presented on a 19" monitor (Samsung SyncMaster 940T). The auditory stimuli were presented via a pair of custom computer speakers (Philips) located at the left and right sides of the monitor. The distance between participants and monitor/speakers was about 90 cm.

To assess the neurophysiological responses, 32 active silver-chloride electrodes were placed according to the 10-20 system. Additionally, 4 electrodes were applied to the outer canthi of the eyes and above and below the left eye to derive horizontal and vertical EOG, respectively. To record the electrocardiogram (ECG), active electrodes were attached with adhesive disks at the left fifth intercostal space and 5 to 8 cm below it. Additionally, participants' skin conductance, blood volume pulse, temperature, and respiration were recorded for later analysis. All signals were sampled at 512 Hz.

2.3 Stimuli

We used 50 pictures and 50 sounds validated for arousal and valence from the affective stimulus databases IAPS [17] and IADS [4]. For both stimulus modalities we selected 10 stimuli for each of 4 categories varying in valence (pleasant and unpleasant) and arousal (high and low). Additionally, a neutral class (i.e., low arousal, neutral valence) with 10 stimuli per stimulus modality was constructed. Table 1 presents the mean arousal and valence values as reported in [17,4] for each condition. They were matched as well as possible between the emotion conditions and between modalities. Footnote 1 lists the labels of the specific stimuli used. For the audio-visual conditions, visual and auditory stimuli of the same emotion conditions were paired, with special attention to matching the content of picture and sound (e.g., pairing of an "aimed gun" picture and a "gun shot" sound). The pictures and sounds of the unimodal conditions were paired in the order they are listed in Footnote 1.

Table 1. Mean (std) valence and arousal ratings for the stimuli of each emotion condition (stimuli listed in Footnote 1), computed from the norm ratings [17,4] (first row per condition) and from our participants' ratings (second row per condition)

Condition          Visual (IAPS)            Auditory (IADS)          Audio-visual
                   Valence     Arousal      Valence     Arousal      Valence     Arousal
(1) Unpleasant,    2.58(0.60)  5.24(0.54)   3.05(0.51)  5.81(0.43)   -           -
    low arousal    1.99(0.79)  4.98(1.60)   2.84(0.81)  4.55(1.73)   2.28(0.78)  5.08(1.56)
(2) Unpleasant,    2.26(0.34)  6.50(0.22)   2.70(0.51)  6.79(0.31)   -           -
    high arousal   1.97(0.83)  5.73(1.84)   2.55(0.77)  5.32(1.64)   2.02(0.90)  5.82(1.61)
(3) Pleasant,      7.53(0.44)  5.26(0.52)   7.09(0.43)  5.59(0.39)   -           -
    low arousal    6.88(0.70)  5.24(1.45)   6.17(0.71)  4.97(1.50)   6.69(0.81)  5.37(1.50)
(4) Pleasant,      7.37(0.31)  6.67(0.38)   7.19(0.44)  6.85(0.39)   -           -
    high arousal   6.29(0.93)  5.50(1.60)   6.28(0.71)  5.67(1.62)   6.40(0.69)  5.92(1.65)
(5) Neutral,       4.92(0.54)  5.00(0.51)   4.82(0.44)  5.42(0.42)   -           -
    low arousal    4.52(0.64)  4.41(1.24)   4.86(0.60)  4.10(1.29)   4.54(0.60)  4.38(1.38)

2.4 Design and Procedure

Before the start of the experiment, participants signed an informed consent form. Next, the sensors were placed. Before the start of the recording, the participants were shown the online view of their EEG to make them conscious of the influence of movement artifacts. They were instructed to restrict the movements to the

1 (1) IAPS:2141,2205,2278,3216,3230,3261,3300,9120,9253,8230; IADS:280,250,296,703,241,242,730,699,295,283 (2) IAPS:2352.2,2730,3030,6360,3068,6250,8485,9050,9910,9921; IADS:600,255,719,284,106,289,501,625,713,244 (3) IAPS:1811,2070,2208,2340,2550,4623,4676,5910,8120,8496; IADS:226,110,813,221,721,820,816,601,220,351 (4) IAPS:4660,5629,8030,8470,8180,8185,8186,8200,8400,8501; IADS:202,817,353,355,311,815,415,352,360,367 (5) IAPS:2220,2635,7560,2780,2810,3210,7620,7640,8211,9913; IADS:724,114,320,364,410,729,358,361,500,425.


breaks, and to watch and listen to the stimuli while fixating the fixation cross at the centre of the screen.

Stimuli were presented in three separate modality blocks in a balanced order over all participants (Latin square design). Each modality block started with a baseline recording in which the participants looked at a black screen with a white fixation cross for 60 seconds. Between the modality blocks were breaks of approximately 2 minutes in which the signal quality was checked.

Each modality block consisted of the five emotion conditions (see Table 1), presented in a pseudo-randomized order ensuring an approximate balancing of the order of emotion conditions within each modality condition over all subjects. Between the emotion blocks there were breaks of 20 seconds. Each emotion condition consisted of the presentation of the respective 10 stimuli in a randomized order. Each stimulus was presented for about 6 seconds. Stimuli were separated by 2-second blank screens. Finally, all stimuli were presented again to the participants, to rate their affective experience on 9-point SAM scales of arousal, valence, and dominance, as used in [4,17].
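The balanced ordering of the three modality blocks (Latin square design) can be sketched as follows; this is an illustrative assumption, since the paper does not specify the assignment scheme, and the function names and 0-based participant index are our own:

```python
# Hypothetical sketch of the Latin-square balancing of modality blocks
# over participants (Sect. 2.4). Condition names are from the paper.

MODALITIES = ["visual", "auditory", "audio-visual"]


def latin_square_orders(conditions):
    """One cyclically rotated order per row: each condition appears
    exactly once per row and once per column (a Latin square)."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)]
            for row in range(n)]


def order_for_participant(pid):
    """Assign participant pid (0-based) a row of the 3x3 Latin square."""
    rows = latin_square_orders(MODALITIES)
    return rows[pid % len(rows)]
```

With 24 participants, each of the three row orders is used by exactly eight participants, which yields the balancing described above.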

2.5 Data Processing and Analysis

After referencing to the common average, the EEG data was high-pass FIR filtered with a cut-off of 1 Hz and underwent a three-step artifact removal procedure: (1) it was visually screened to identify and remove segments of excessive EMG and bad channels, (2) eye artifacts were removed via the AAR toolbox in EEGlab, and (3) the data was checked for residuals of the EOG. For the estimation of the power within the alpha frequency band, an FFT (Welch's method with 1 s overlapping Hamming windows) was applied to each of the 15 blocks and their preceding baseline intervals for every participant separately. The resulting power values, with a resolution of 1 Hz, were averaged over the bins from 8 to 13 Hz for the parietal and fronto-central regions. To approach a normal distribution, the natural logarithm was computed, and baselining was performed by subtracting the power of the preceding resting period. To compute heart rate (HR), the ECG signal was filtered with a 2-200 Hz two-sided Butterworth bandpass filter and peak latencies were extracted with the BIOSIG toolbox in Matlab. Next, the inter-beat intervals were converted to HR and averaged for each participant and each of the 15 stimulus blocks (3 modality × 5 emotion blocks). For the analysis of the effects of affect induction on ratings, HR, and alpha activity, repeated measures ANOVAs (rmANOVA) were conducted. Where appropriate, Greenhouse-Geisser corrected results are reported. As effect size measure we report the partial eta-squared η2p.
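The alpha-power estimation described above can be sketched in Python with SciPy (the paper used Matlab/EEGlab tooling); the function names are our own, and since the exact window overlap is not stated beyond "overlapping", 50% is an assumption:

```python
# Sketch of the alpha-band power pipeline of Sect. 2.5:
# Welch PSD with 1-s Hamming windows (1 Hz resolution), mean power
# over the 8-13 Hz bins, natural log, and baseline subtraction.
import numpy as np
from scipy.signal import welch

FS = 512  # sampling rate in Hz, as in the recordings


def alpha_log_power(segment, fs=FS):
    """Natural log of the mean 8-13 Hz power of a 1-D EEG segment."""
    freqs, psd = welch(segment, fs=fs, window="hamming",
                       nperseg=fs, noverlap=fs // 2)  # 1-s windows, 50% overlap
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.log(psd[band].mean()))


def baselined_alpha(block, baseline, fs=FS):
    """Block alpha power minus that of the preceding resting baseline."""
    return alpha_log_power(block, fs) - alpha_log_power(baseline, fs)
```

In the paper this quantity is computed per participant for each of the 15 blocks and then averaged over the electrodes of each ROI.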

3 Results

3.1 Subjective Ratings

The affective manipulations yielded the expected differences in subjective ratings (see Figure 1). A 3(modality)×5(emotion) rmANOVA on the valence ratings


Fig. 1. The mean valence and arousal ratings for all 5 emotion conditions for visual, auditory, and audio-visual modality of induction (error bars: standard error of mean)

showed a main effect of both emotion (F(4,92) = 246.100, p < 0.001, η2p = 0.915) and modality (F(2,46) = 6.057, p = 0.005, η2p = 0.208). A 3(modality)×3(valence) rmANOVA on mean ratings of negative, neutral, and positive conditions showed a main effect of valence (F(2,46) = 264.100, p < 0.001, η2p = 0.920). Pairwise t-tests showed that the differences between negative, neutral, and positive conditions were highly significant. Furthermore, a main effect for modality (F(2,46) = 7.078, p = 0.002, η2p = 0.235) reflected more positive ratings for states induced via the auditory modality compared to visual and audio-visual. An interaction effect indicates less extreme ratings for auditory stimuli (F(4,92) = 14.813, p < 0.001, η2p = 0.392) and, hence, a weaker efficacy.

A similar pattern was found in a 3(modality)×5(emotion) rmANOVA on the arousal ratings, showing a main effect of emotion (F(4,92) = 12.588, p < 0.001, η2p = 0.354) and of modality (F(2,46) = 9.177, p < 0.001, η2p = 0.285). A 3(modality)×2(arousal) rmANOVA on mean ratings of low and high arousing conditions showed a main effect of arousal (F(1,23) = 27.180, p < 0.001, η2p = 0.542). A main effect for modality (F(2,46) = 9.344, p < 0.001, η2p = 0.289) indicated that auditorily induced affect was rated less arousing than visual, which in turn was rated less arousing than audio-visual.

3.2 Heart Rate

The data of 3 participants was excluded from analysis, as ECG artifacts resulted in difficulties for adequate QRS-peak detection. The analysis of HR change relative to the resting baseline indicated differences resulting from the valence and arousal manipulations. A 3(modality)×5(emotion) rmANOVA showed main effects of modality (F(2,40) = 6.063, p = 0.005, η2p = 0.233) and emotion (F(4,80) = 4.878, p = 0.001, η2p = 0.196). A 3(modality)×3(valence) rmANOVA on low arousing negative, neutral, and positive conditions showed a main effect of modality (F(2,40) = 5.829, p = 0.006, η2p = 0.228), with significantly higher HR for the auditory conditions, as well as a main effect of valence (F(2,40) = 5.641, p = 0.007, η2p = 0.220), with a lower HR for negative compared to neutral and positive valence. For arousal, a 3(modality)×2(arousal) rmANOVA on averaged


Fig. 2. The parietal and fronto-central alpha power (error bars: standard error of mean) during visual, auditory, and audio-visual affect (emotion - neutral), averaged over all subjects that entered the respective analyses (A); and the 2nd level contrasts of visual - auditory (B), visual - audio-visual (C), and audio-visual - auditory affect (D), showing modality-specific responses averaged over all subjects

low versus high arousing emotion conditions showed a main effect of modality (F(2,40) = 7.029, p = 0.002, η2p = 0.260), with higher HR for the auditory conditions, and a main effect of arousal (F(1,20) = 5.360, p = 0.031, η2p = 0.211), with a HR decrease for higher arousing stimuli.
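The HR measure analyzed here, the conversion from R-peak latencies to mean heart rate described in Sect. 2.5, can be sketched as follows; R-peak detection itself (done with the BIOSIG toolbox in the paper) is assumed to have happened upstream, and the function name is our own:

```python
# Sketch of the HR computation of Sect. 2.5: R-peak sample indices ->
# inter-beat intervals -> instantaneous HR (beats/min) -> block mean.
import numpy as np

FS = 512  # ECG sampling rate in Hz


def mean_heart_rate(r_peaks, fs=FS):
    """Mean heart rate (beats/min) from R-peak sample indices."""
    ibis = np.diff(np.asarray(r_peaks)) / fs  # inter-beat intervals in s
    return float(np.mean(60.0 / ibis))
```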

3.3 EEG

For each ROI, the data of 4 cases was excluded according to the outlier criterion (1.5 × interquartile range) to guarantee normality, which was potentially violated by the effects of residual artifacts on the alpha band power. To test the effect of modality-specific affective stimulation, we contrasted, for all 3 modality conditions, neutral against the mean emotion response with a 3(modality)×2(emotion) rmANOVA. We did so separately for the alpha power over the "visual" parietal pa and the "auditory" fronto-central fc region (see Figure 2). In accordance with our prediction for pa, we found an interaction effect between modality and emotion (F(2,38) = 3.373, p = 0.045, η2p = 0.151). To specify the nature of the interaction, we computed the second-level emotion contrasts (emotion - neutral) for the visual, auditory, and audio-visual conditions. Pairwise t-tests confirmed the difference between audio-visual versus auditory affect (t = -2.651, p = 0.016, Figure 2D), and a trend for visual versus auditory affect (t = -2.093, p = 0.052, Figure 2B). Alpha power at pa decreases for visual and audio-visual affect, but increases for auditory affect. For fc, we found an interaction between modality and emotion (F(2,38) = 4.891, p = 0.013, η2p = 0.205). As for pa, we computed the second-level emotion contrasts for all conditions over fc. Pairwise t-tests showed a significant difference between audio-visual versus visual affect (t = -3.155, p = 0.005, Figure 2B), and a trend for the visual versus auditory affect (t = -1.817, p = 0.085, Figure 2C). Alpha power at fc decreases for auditory and audio-visual affect, but increases during visual affect induction. A main effect of modality was observed for both regions (pa: F(2,38) = 3.821, p = 0.043, η2p = 0.167; fc: F(2,38) = 6.110, p = 0.005, η2p = 0.243), resulting from lower alpha power for general visual and audio-visual stimulation, compared to auditory stimulation (independent of affective conditions).
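The 1.5 × interquartile-range exclusion criterion used for this analysis can be sketched as follows; the quartile estimator is not specified in the paper, so NumPy's default linear interpolation is an assumption, and the function name is our own:

```python
# Sketch of the 1.5 x IQR outlier criterion applied to the per-case
# alpha-power values before the rmANOVA (Sect. 3.3).
import numpy as np


def iqr_inliers(values, k=1.5):
    """Boolean mask: True where a value lies within [Q1-k*IQR, Q3+k*IQR]."""
    v = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(v, [25, 75])
    iqr = q3 - q1
    return (v >= q1 - k * iqr) & (v <= q3 + k * iqr)
```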

4 Discussion

In the current study, we explored neurophysiological responses to visual, auditory, and combined affective stimulation. We predicted that parietal and fronto-central alpha power respond differently during visually and auditorily induced emotion.

The effects of valence and arousal on ratings and HR suggest that, in general, the affect induction was successful. The affective stimulation had strong effects on the subjective experience of the participants. Despite our efforts to keep emotional valence and arousal comparable over modalities, the ratings showed a slightly weaker efficacy of auditory stimuli in inducing the affective states. This pattern was confirmed by the participants' HR. While the arousal and valence contrasts showed the expected significant decreases of HR for arousing and negative stimuli [3,6], respectively, the overall HR change to stimulation was smallest for auditory, followed by visual, and strongest for audio-visual stimulation.

The observed responses of posterior (paα) and anterior (fcα) alpha power (see Figure 2) match our expectations as formulated in the introduction. We found a lower paα for visual compared to auditory affective stimulation, especially when visual stimuli were paired with auditory stimuli. Complementary to paα, fcα was lower for auditory compared to visual affective stimulation, especially when auditory stimuli were paired with visual stimuli. With regard to the literature on sensory processing [13,20], the effects might result from a decrease of alpha band power during "appropriate" stimulation (activation) and an increase during "inappropriate" stimulation (inhibition). The "appropriateness" is defined by the modality-specificity of the underlying cortices: the posterior visual cortices for ROI pa [20], and the temporal auditory cortices for ROI fc [12]. It should be noted that the alpha increase to "inappropriate" stimulation seems more marked than the decrease during "appropriate" stimulation, stressing the inhibitory nature of the potential sensory gating process [13]. The relatively strong alpha decrease during audio-visual affective stimulation might result from extra activation due to more intense processing of bimodal stimuli. Alternatively, it might be an additive effect resulting from an overlap of paα and fcα due to volume conduction. Please note that these modality-specific effects were obtained after correcting emotion by neutral conditions for each modality, removing responses that are common to any visual or auditory stimulation and leaving the correlates of the affective response.

These results suggest that affective stimulation via different modalities has different, even opposing, effects which can be observed in the EEG. These adhere to theories of modality-specific activation and inhibition of the respective, functionally specialized brain regions also involved in non-affective cognitive processes. Therefore, it is conceivable that these correlates are stimulus-specific cognitive consequences of affective stimulation, comparable to enhanced processing during attention [25], rather than primary correlates of affect. This is also in line with cognitive theories of emotion (e.g., the Component Process Theory [24]). These theories postulate the involvement of core affective processes, such as affective self-monitoring (feeling), and of rather cognitive processes related to stimulus evaluation and behavior planning.

The complex response of the brain to emotional stimulation has consequences for those trying to detect affective user states from neurophysiological activity. Firstly, classifiers based on stimulus-specific neurophysiological responses will be limited in their capability to generalize to affect evoked by other stimuli. Secondly, if these responses are of a general cognitive nature, also occurring in non-affective contexts, such classifiers are prone to confuse purely cognitive and affective events. The nature of the neurophysiological correlates enabling the successful detection of affective states should be taken into account when considering the application of such classifiers in real-world scenarios, to avoid threats to the reliability and validity of aBCIs.

The observed modality-specific responses might be of interest in their own right: assuming that a part of the response to affective manipulation can be attributed to the way (i.e., the type of stimulation) emotions are evoked, it might be possible to distinguish the origin of the affective response via neurophysiological activity. Another interesting consequence of the presumed cognitive nature of the responses is the discrimination of the attended modality in general, not only restricted to affect.

The current study also has its limitations. One is the creation of artificial multi-modal combinations of IADS and IAPS stimuli. The restricted choice of stimuli led to partial mismatches of content, which could induce additional processes and correlates, such as mismatch negativity [18] or multi-sensory integration (cf. [5]). To prevent such effects, faces with accompanying speech are most often used. However, this limits the relevance of the results of such studies for multi-modal affective processing. Conveniently, our results show that these effects are not likely to have occurred in the current study, as similar unimodal and multi-modal effects on the EEG signals were found for both modalities. However, these effects are only two of many possible ones. Most likely, very early and short processes or effects occurred that are not reported in this study (e.g., [26]). Multi-sensory integration is a challenging endeavor in itself, its mapping onto neurophysiology is relatively little explored, and the combination of these two factors with emotion is hardly touched upon, in particular in applied and ambulatory research.

5 Conclusion

Taken together, we have shown that neurophysiological correlates of affective responses measured by EEG are partially modality-specific: alpha band power over posterior and anterior regions differs in a systematic way between visual, auditory, and audio-visual stimulation. The responses in the alpha band suggest that visual processes are mainly reflected over parietal regions, while auditory processes are reflected over fronto-central regions. These results pose potential problems for the generalization and specificity of affect classifiers relying on such modality-specific, rather cognitive, neurophysiological effects. They also imply the possibility of detecting the modality, visual, auditory, or combined, through which the affective response was elicited. Further studies on the exploration and exploitation of the context-specificity of neurophysiological responses are needed.

Acknowledgments. The authors gratefully acknowledge the support of the BrainGain Smart Mix Programme of the Netherlands Ministry of Economic Affairs and the Netherlands Ministry of Education, Culture, and Science.

References

1. Aftanas, L.I., Reva, N.V., Varlamov, A.A., Pavlov, S.V., Makhnev, V.P.: Analysis of evoked EEG synchronization and desynchronization in conditions of emotional activation in humans: temporal and topographic characteristics. Neuroscience and Behavioral Physiology 34(8), 859-867 (2004)

2. Baumgartner, T., Esslen, M., Jancke, L.: From emotion perception to emotion experience: Emotions evoked by pictures and classical music. International Journal of Psychophysiology 60(1), 34–43 (2006)

3. Bradley, M.M., Lang, P.J.: Affective reactions to acoustic stimuli. Psychophysiol-ogy 37(2), 204–215 (2000)

4. Bradley, M.M., Lang, P.J.: The international affective digitized sounds (IADS-2): Affective ratings of sounds and instruction manual. Technical report, University of Florida, Center for Research in Psychophysiology, Gainesville, FL, USA (2007)

5. Brefczynski-Lewis, J., Lowitszch, S., Parsons, M., Lemieux, S., Puce, A.: Audio-visual non-verbal dynamic faces elicit converging fMRI and ERP responses. Brain Topography 21(3-4), 193-206 (2009)

6. Codispoti, M., Ferrari, V., Bradley, M.M.: Repetitive picture processing: autonomic and cortical correlates. Brain Research 1068(1), 213–220 (2006)

7. Ethofer, T., Van De Ville, D., Scherer, K., Vuilleumier, P.: Decoding of emotional information in voice-sensitive cortices. Current Biology 19(12), 1028-1033 (2009)

8. Grandjean, D., Sander, D., Pourtois, G., Schwartz, S., Seghier, M.L., Scherer, K., Vuilleumier, P.: The voices of wrath: brain responses to angry prosody in meaningless speech. Nature Neuroscience 8(2), 145-146 (2005)

9. Guntekin, B., Basar, E.: Emotional face expressions are differentiated with brain oscillations. International Journal of Psychophysiology 64(1), 91-100 (2007)

10. Hadjikhani, N., de Gelder, B.: Seeing fearful body expressions activates the fusiform cortex and amygdala. Current Biology 13(24), 2201-2205 (2003)

11. Hajcak, G., MacNamara, A., Olvet, D.M.: Event-related potentials, emotion, and emotion regulation: An integrative review. Developmental Neuropsychology 35(2), 129–155 (2010)

12. Hari, R., Salmelin, R., Makela, J.P., Salenius, S., Helle, M.: MEG cortical rhythms. International Journal of Psychophysiology 26, 51–62 (1997)

13. Jensen, O., Mazaheri, A.: Shaping functional architecture by oscillatory alpha ac-tivity: gating by inhibition. Frontiers in Human Neuroscience 4 (2010)

14. Kim, J., André, E.: Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(12), 2067-2083 (2008)

15. Kober, H., Feldman Barrett, L., Joseph, J., Bliss-Moreau, E., Lindquist, K., Wager, T.D.: Functional grouping and cortical-subcortical interactions in emotion: a meta-analysis of neuroimaging studies. NeuroImage 42(2), 998–1031 (2008)


16. Krause, C.M.: Cognition- and memory-related ERD/ERS responses in the auditory stimulus modality. Progress in Brain Research 159, 197-207 (2006)

17. Lang, P.J., Bradley, M.M., Cuthbert, B.N.: International affective picture system (IAPS): Technical manual and affective ratings. Technical report, University of Florida, Center for Research in Psychophysiology, Gainesville, FL, USA (1999)

18. Näätänen, R., Paavilainen, P., Rinne, T., Alho, K.: The mismatch negativity (MMN) in basic research of central auditory processing: A review. Clinical Neurophysiology 118(12), 2544-2590 (2007)

19. Panksepp, J., Bernatzky, G.: Emotional sounds and the brain: the neuro-affective foundations of musical appreciation. Behavioral Processes 60, 133-155 (2002)

20. Pfurtscheller, G., Lopes da Silva, F.H.: Event-related EEG/MEG synchronization and desynchronization: basic principles. Clinical Neurophysiology 110(11), 1842-1857 (1999)

21. Picard, R.W.: Affective Computing. The MIT Press, Cambridge (1997)

22. Pourtois, G., Vuilleumier, P.: Dynamics of emotional effects on spatial attention in the human visual cortex. Progress in Brain Research, vol. 156, pp. 67–91. Elsevier, Amsterdam (2006)

23. Sabatinelli, D., Lang, P.J., Keil, A., Bradley, M.M.: Emotional perception: Correlation of functional MRI and event-related potentials. Cerebral Cortex 17(5), 1085-1091 (2007)

24. Sander, D., Grandjean, D., Scherer, K.R.: A systems approach to appraisal mechanisms in emotion. Neural Networks 18(4), 317-352 (2005)

25. Vuilleumier, P.: How brains beware: neural mechanisms of emotional attention. Trends in Cognitive Sciences 9(12), 585–594 (2005)

26. Wang, J., Nicol, T., Skoe, E., Sams, M., Kraus, N.: Emotion modulates early auditory response to speech. Journal of Cognitive Neuroscience 21(11), 2121–2128 (2009)

27. Zeng, Z., Pantic, M., Roisman, G.I., Huang, T.S.: A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(1), 39-58 (2009)
