In psychology and philosophy, emotion is defined as “a subjective, conscious experience characterized primarily by psychophysiological expressions, biological reactions, and mental states, which is also influenced by hormones and neurotransmitters such as dopamine, noradrenaline, serotonin, oxytocin, cortisol and GABA.”
This definition indeed lays the foundations for experimental psychology: emotions are triggered by a set of external stimuli, which can be characterized, quantified, and controlled for the sake of scientific investigation. Likewise, the biological and mental responses can be captured by means of traditional bio-signal recording techniques: respiratory belt, electrocardiogram (ECG), galvanic skin response (GSR), and electroencephalography (EEG), to cite a few.
On the other hand, this definition points at great challenges. Most notably, emotions are defined as subjective, which makes their characterization difficult and suggests a high inter-individual variance. Moreover, emotion is often associated, and considered reciprocally influential, with mood, temperament, personality, disposition, and motivation, which translates into a high intra-individual variation [Gaulin 2003].
To conclude, the definition of emotions explicitly sets the frame for their study, the borders of which are defined by fundamentally inherent limitations: high inter-individual and intra-individual variations.
2 The study of emotions
2.1 The arousal-valence hypothesis
The physiology of emotion is closely linked to arousal of the nervous system, with various states and strengths of arousal apparently relating to particular emotions, while emotion is often the driving force behind motivation, positive or negative [Schacter 2011]. An alternative definition of emotion is a “positive or negative experience that is associated with a particular pattern of physiological activity” [Cacioppo 1999].
In order to study emotions, researchers can stimulate subjects and record their physiological response.
Stimulation: A broadly used tool for stimulation is the International Affective Picture System (IAPS), a database of pictures used to elicit a range of emotions. There are currently over 1,000 color pictures in the database, representative of experiences ranging from daily scenes such as household furniture to extreme encounters such as a mutilated body. The familiarity of the human experience is what evokes such an array of emotions. IAPS is widely used in experiments studying emotion and attention [Bradley 2007]. Alternative affective picture databases such as the Geneva Affective PicturE System (GAPED) can also be used [Dan-Glauser 2011]. Such tools are a good way to study arousal and valence in a standardized fashion. While such databases constitute an excellent tool for calibration, any other stimulation (video, image, sound) will trigger an emotion that can subsequently be characterized.
Recording: The tools used to explore the response to stimulation are twofold. On the one hand, subjective scales are questionnaires designed to estimate the subjective side of the participant’s emotional activity. There is a large variety of tools referenced in the literature: the Self-Assessment Manikin (SAM) rating scale [Kemp 2002], the Mood Adjective Checklist, the Profile of Mood States, the Expanded Form of the Positive and Negative Affect Schedule, and the Differential Emotions Scale; see [Gray 2007] for a complete list. While it is necessary to capture the subjective experience that is inherent to emotions, it is tempting to explore more objective metrics: this is done by recording biosignals before, during, and after affective stimulations. The signals most reported are respiration, galvanic skin response (GSR), heart rate (HR) and its variability (HRV), eye movements and pupil size, as well as various brain imaging modalities: electroencephalography (EEG), magnetoencephalography (MEG), facial and body electromyography (EMG), and functional magnetic resonance imaging (fMRI). Of all, EEG is the only modality that offers low cost, consumer availability, and high temporal resolution. The limitations of EEG are a low spatial resolution and a sensitivity that is limited to the activity of cortical areas (the activity of deeper brain structures cannot be seen with the EEG).
Despite their inherent subjectivity, emotions seem to involve anatomical regions that are common to most individuals, so that the neurophysiological patterns they elicit can be characterized by neuroimaging.
It has long been believed that deep structures of the brain such as the limbic system, the amygdala, and the hippocampus are an important substrate for the cerebral processing of emotions [Davidson 2000]. More recently, however, the role of cortical areas was underlined by functional brain imaging studies, first and foremost the prefrontal cortex [Torro 2008], making the EEG a particularly relevant tool for its analysis. More precisely, the following areas are involved: the dorsolateral prefrontal cortex (DLPFC), the ventromedial prefrontal cortex (vmPFC), the orbitofrontal cortex (OFC), the anterior cingulate cortex (ACC), and the insular cortex.
The identification of functional areas of the brain during the generation and processing of emotions soon led to the emergence of anatomical-functional theories that relate to the characterization of emotions given by [Russell 1980]. Amongst these, two theories seem of particular interest.
Initially, the “right hemisphere hypothesis” stated that emotions were located in the right hemisphere; it was subsequently refined into the “valence hypothesis”, where positive emotions are hosted in the left hemisphere and negative emotions in the right one [Brown 2011]. This last hypothesis is broadly accepted and used in clinical research, even though recent evidence suggests important subject-specific differences [Mühl 2014].
The quantification of arousal, on the other hand, seems more global than local and is characterized by an overall lower alpha activity associated with an increase in beta frequency bands [Aftanas 2004; Choppin 2000; Keil 2001; Gray 2001]. Indeed, the alpha rhythm (8-12 Hz) is usually associated with a stand-by mode (neurons loop at this frequency by default) while activation at higher frequencies (the beta rhythm over 15 Hz, for instance) is associated with specific processing. To illustrate, large alpha oscillations can be seen in the visual cortex when the eyes are closed, and limb movement is preceded by an increase in beta activity in the primary motor cortex. Arguably, similar patterns in the frontal cortex may relate to arousal.
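As an illustration of how such a spectral signature can be quantified, the sketch below estimates a beta/alpha power ratio on a single EEG channel as a crude arousal proxy. This is a minimal sketch, not a method reported in the cited studies; the function names, sampling rate, band limits, and the use of Welch's method are assumptions.

```python
import numpy as np
from scipy.signal import welch


def band_power(signal, fs, band):
    """Average power spectral density of `signal` within `band` (Hz)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()


def arousal_index(eeg, fs=256):
    """Beta/alpha power ratio: lower alpha and higher beta activity
    (i.e. a larger ratio) would be read as higher arousal."""
    alpha = band_power(eeg, fs, (8.0, 12.0))
    beta = band_power(eeg, fs, (15.0, 30.0))
    return beta / alpha
```

In practice this ratio would be computed over short sliding windows and compared against a per-subject baseline rather than used as an absolute value.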
4 Real-time biomarkers of emotions
Many non-invasively collected biological time series (facial and body EMG, GSR, ECG, …) have been used to monitor participants’ responses to an affective stimulation. However, apart from those derived from the EEG, the literature still reports important contradictions, so that such biomarkers cannot be considered reliable at this stage [Mühl 2014].
4.2 Real-time neuromarkers of emotions
The exploration of the electric activity of the brain in real time under various conditions, thanks to the EEG, has allowed the emergence of novel neuromarkers for emotions. We present here the main trending approaches: temporal and spectral markers.
4.2.1 Temporal neuromarkers: the Steady-State Topography (SST)
Event-Related Potentials (ERPs) are cerebral patterns obtained by averaging brain responses directly following the presentation of similar stimulations. For instance, in the well-known oddball paradigm [Farwell 1988], rare and random auditory or visual stimulations elicit a positive centro-parietal wave peaking at around 300 ms from the onset of the stimulation: the P300. Likewise, a flashing light will generate small peaks (P100, N200) in the visual cortex. It is commonly accepted that only the stimulation’s parameters (amplitude and frequency) modulate early ERPs (peak latency < 250 ms), hence called exogenous potentials. Recent research, however, suggests that frontal cortical activity may in reality modulate the phase and amplitude of these responses [Silberstein 2012].
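The epoch-averaging procedure behind ERPs can be sketched as follows. This is a generic illustration, not the protocol of [Farwell 1988]; the function name, window lengths, and baseline-correction choice are assumptions.

```python
import numpy as np


def extract_erp(eeg, fs, onsets, tmin=-0.1, tmax=0.6):
    """Average single-channel epochs time-locked to stimulus onsets.

    eeg    : 1-D array, single-channel EEG
    fs     : sampling rate in Hz
    onsets : stimulus onset indices, in samples
    Returns (times, erp); each epoch is baseline-corrected by removing
    its pre-stimulus mean before averaging.
    """
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for o in onsets:
        if o - pre < 0 or o + post > len(eeg):
            continue  # skip epochs that run off the recording
        ep = eeg[o - pre : o + post].astype(float)
        ep -= ep[:pre].mean()  # baseline correction on the pre-stimulus window
        epochs.append(ep)
    times = np.arange(-pre, post) / fs
    return times, np.mean(epochs, axis=0)
```

Averaging attenuates activity that is not phase-locked to the stimulus (noise decreases roughly as the square root of the number of epochs), which is why rare-target paradigms need many repetitions.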
Steady-state topography (SST) estimates the steady-state visually evoked potential (SSVEP) component generated by a peripheral visual (although it could be auditory) stimulation during a task. Metrics derived from these neuromarkers have long been used as a proxy for motivational valence, which denotes some form of interest arousal in predefined situations. The typical use case is as follows:
This imaging technique has proved reliable [Silberstein2012] and offers interesting properties:
This technique was implemented by [Silberstein 2012] and [Kemp 2002], who reported interesting results on n = 16 subjects using a 64-electrode headset. In the graph below, the topographic differences in valence and amplitude with respect to a baseline activity are shown in the presence of positively and negatively charged emotional stimulations. Statistical analysis (Hotelling’s T²) reveals that positive and negative valences are associated with higher frontal and left-central activations, respectively.
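The SSVEP amplitude and phase at the flicker frequency can be estimated by projecting the recorded signal onto a complex reference at that frequency (a single-bin Fourier, or lock-in, estimate). This is a generic sketch rather than Silberstein's actual SST pipeline; names and parameters are illustrative.

```python
import numpy as np


def ssvep_response(eeg, fs, f_stim):
    """Amplitude and phase of the SSVEP at the flicker frequency f_stim.

    Projects the signal onto a complex exponential at f_stim; with an
    integer number of stimulation cycles in the window, this isolates
    the component locked to the flicker.
    """
    t = np.arange(len(eeg)) / fs
    ref = np.exp(-2j * np.pi * f_stim * t)
    z = 2 * np.mean(eeg * ref)  # complex amplitude at f_stim
    return np.abs(z), np.angle(z)
```

Per-electrode amplitude and phase (latency) maps of this kind are what the topographic comparisons between pleasant and unpleasant conditions are built on.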
Such a technique is called synchronous because it requires the presence of an external stimulation synchronized with the data acquisition system. This severely impairs the practical implementation of such monitoring tools because it requires: 1) a stimulation interface and 2) a precise synchronization between the recording device and the stimulation. Spectral analysis, introduced in the following section, can on the other hand be handled in a completely asynchronous fashion, which enables a more practical implementation.
Figure 3: Difference in amplitude and latency of the SSVEP component observed when subjects are presented with stimuli of positive (left) and negative (right) valence. Statistical analysis (Hotelling’s T²) reveals statistically significant differences for both conditions, in the frontal and left-temporal areas for pleasant and unpleasant stimuli, respectively. In the middle section of the figure, ERP plots show the raw data at Fp1 for the pleasant, unpleasant, and neutral conditions.
4.2.2 Spectral neuromarkers
There is more research on spectral analysis in relation to emotionally loaded content, for both theoretical and practical reasons. In the first place, both arousal and valence are characterized by local levels of activation that translate into spatio-temporal variations in the subject’s spectral activity.
Other spectral approaches take advantage of the valence hypothesis by comparing spectral content between the two hemispheres. Extracting statistics of these real-time neuromarkers over a larger time window seemed to increase discriminatory performance [Brown 2011]; this, however, clearly discards their possible use in real-time applications. The maximum and kurtosis seem to particularly relate to valence. Interestingly, the frequency bands were split into alpha I, II, and III at the cut-off frequencies 6, 8, 10, and 12 Hz. Similar results were recently replicated by [Reuderink 2013] on n = 12 subjects using a 32-electrode headset. In this work, the most predictive feature for valence was found to be the difference in log power in the alpha bands between symmetric electrodes in frontal areas. Interestingly, this work also explored the importance of the third dimension of Russell’s model: dominance.
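The inter-hemispheric comparison just described can be sketched as a frontal alpha asymmetry index: the difference in log alpha power between homologous frontal channels (e.g. F4 minus F3). The channel choice, band limits, and Welch parameters below are assumptions for illustration, not the exact settings of [Brown 2011] or [Reuderink 2013].

```python
import numpy as np
from scipy.signal import welch


def alpha_asymmetry(left, right, fs, band=(8.0, 12.0)):
    """Valence proxy: log alpha power of the right frontal channel minus
    that of its left homologue. Because alpha power is inversely related
    to cortical activation, a positive index suggests relatively stronger
    left-hemisphere activation, i.e. positive valence under the valence
    hypothesis."""
    def power(x):
        freqs, psd = welch(x, fs=fs, nperseg=min(len(x), int(fs * 2)))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    return np.log(power(right)) - np.log(power(left))
```

The log transform makes the index a ratio of powers, which reduces sensitivity to global amplitude differences between subjects and sessions.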
Figure 4: The hypotheses presented relate a specific frequency band to a specific location of the brain; for instance, arousal corresponds to a global decrease in alpha and a frontal activation in the same frequency band.
Interestingly, the different affective dimensions do not seem to be orthogonal (independent), so that similar areas of the brain might be involved simultaneously. Specifically, valence and dominance ratings are highly correlated, which can result in effects found in the EEG that are related to one affective dimension being attributed to the other.
Alpha asymmetry is often used as a measure of valence. These results indicate that a more careful interpretation is perhaps needed, but the value of alpha asymmetry remains. In addition, right fronto-central theta power seems to be a good indicator of valence, while right frontal alpha power and the absence of right parietal delta power are indicators of arousal. These effects, specifically the stronger narrow-band effects, can be used to construct an automatic recognizer for affect.
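To make this concrete, the sketch below assembles the markers just listed into a feature vector that a downstream classifier could consume. Channel names follow the 10-20 convention; the channel choices, band limits, and helper names are illustrative assumptions, not a published feature set.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1.0, 4.0), "theta": (4.0, 8.0), "alpha": (8.0, 12.0)}


def band_power(x, fs, band):
    """Average power spectral density of `x` within `band` (Hz)."""
    freqs, psd = welch(x, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()


def affect_features(eeg, fs, ch):
    """Feature vector from the markers discussed above.

    eeg : array of shape (channels, samples)
    ch  : mapping from 10-20 channel names to row indices
    """
    f3_alpha = band_power(eeg[ch["F3"]], fs, BANDS["alpha"])
    f4_alpha = band_power(eeg[ch["F4"]], fs, BANDS["alpha"])
    return np.array([
        np.log(f4_alpha) - np.log(f3_alpha),             # alpha asymmetry (valence)
        band_power(eeg[ch["FC4"]], fs, BANDS["theta"]),  # right fronto-central theta (valence)
        f4_alpha,                                        # right frontal alpha (arousal)
        band_power(eeg[ch["P4"]], fs, BANDS["delta"]),   # right parietal delta (inverse arousal)
    ])
```

Such vectors, computed over sliding windows and labeled with subjective ratings, are the typical input to a standard supervised classifier for affect recognition.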
5 Mensia’s Neuro Emotions Web Tools (NEW-T)
The elements presented in the previous sections constitute the state of the art for the study of emotions. Emotions were defined and the tools required to study them were introduced: stimulation with standardized picture databases, self-assessment questionnaires, and biosignal monitoring. Among these, the EEG was found to be a tool of predilection because:
Mensia Technologies owns the most advanced software platform for the online processing of EEG signals. All deployed algorithms run in an unsupervised fashion and in real time. To illustrate, Mensia’s denoising technology automatically handles artefactual EEG segments during recording, so that the signal is neither degraded nor contaminated. Such techniques, which greatly benefit data quality, are usually not implemented in alternative stand-alone cognitive applications, which cannot afford the burden of such advanced algorithm development. Likewise, Mensia owns a big-data platform with online data collection and analysis capacities. In fact, distributing data collection over the web while preserving real-time operation and perfect synchronization of signals is a truly unique feature of Mensia’s technology. The combination of Mensia’s knowledge in EEG processing and modeling has led to an efficient and practical implementation of the neuromarkers related to emotions described in this document.
Figure 5: Mensia’s implementation of the SST protocol on its NEW-T platform, which can load a YouTube video and generate a blinking area around it (the white frame on this illustration). In this example, the “11:30 appointment” Coke advertisement, which was used by Silberstein.
Our Neuro Emotions Web portal offers synchronous and asynchronous neuromarkers, which can be collected simultaneously with the following stimulations: a YouTube video, IAPS or GAPED images, or any video/image database loaded into the system. For synchronous stimulations, a frame surrounds the stimulation while flashing at a pre-defined frequency in order to allow for the extraction of SST markers. Likewise, asynchronous systems return in real time a generic estimation of valence and arousal. These estimates can be calibrated for a subject or population using a preliminary session recorded on a standardized database. The data from these recordings can subsequently be downloaded together with the appropriate flags and triggers. In addition, the web interface will implement any aforementioned subjective questionnaire that your study may require, so that your data can be related to objective and subjective information during and directly following the stimulations, respectively. Finally, these real-time estimates of arousal and valence can be used directly in our neuroscience authoring tool, NeuroRT Studio, allowing you to develop and deploy your bespoke standalone or online applications.
While these existing web applications from Mensia provide a first hands-on set of tools for the exploration of emotions in relation to specific emotional contexts, we are collecting a dedicated dataset on which our novel mathematical models will be applied. These will soon provide you with cutting-edge estimates of specific emotions that work reliably and in a completely standardized fashion, so that slight changes in settings (electrode location, quality, and number) will not impact your results dramatically.
Last but not least, models for arousal and valence can be trained specifically for a particular subject so that the therapeutic effect is optimized. The right combination of both approaches will certainly offer the required trade-off between ease of use and reliability.
Figure 6: Example of a scenario in NeuroRT showing the combination of automated real-time pre-processing algorithms together with the extraction and display of valence and arousal from the data.
Gaulin, Steven J. C. and Donald H. McBurney. Evolutionary Psychology. Prentice Hall. 2003. ISBN 978-0-13-111529-3, Chapter 6, p 121-142.
Schacter, Daniel L. (2011). Psychology (2nd ed.). New York: Worth Publishers. p. 310. ISBN 978-1-4292-3719-2.
Cacioppo, J. T., & Gardner, W. L. (1999). Emotion. Annual Review of Psychology, 50, 191–214.
Russell, James A. “A circumplex model of affect.” Journal of personality and social psychology 39.6 (1980): 1161.
Silberstein, Richard B., and Geoffrey E. Nield. “Monitoring emotion in advertising research.” IEEE Pulse (2012). DOI: 10.1109/MPUL.2012.2189172. Published 31 May 2012.
Bekhtereva, Valeria, et al. “Effects of EEG-vigilance regulation patterns on early perceptual processes in human visual cortex.” Clinical Neurophysiology125.1 (2014): 98-107.
Christian Andreas Kothe and Scott Makeig and Julie Anne Onton, Emotion Recognition from EEG During Self-Paced Emotional Imagery, 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction
Lindsay Brown, Bernard Grundlehner, Julien Penders, Towards wireless emotional valence detection from EEG, 33rd Annual International Conference of the IEEE EMBS Boston, Massachusetts USA, August 30 – September 3, 2011
Kemp, A. H., Gray, M. A., Eide, P., Silberstein, R. B., & Nathan, P. J. (2002). Steady-state visually evoked potential topography during processing of emotional valence in healthy subjects. NeuroImage, 17(4), 1684–1692.
Reuderink, Boris, Christian Mühl, and Mannes Poel. “Valence, arousal and dominance in the EEG during game play.” International Journal of Autonomous and Adaptive Communications Systems 6.1 (2013): 45-62.
Mühl, Christian, et al. “A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges.” Brain-Computer Interfaces ahead-of-print (2014): 1-19.
Aftanas, L. I., Reva, N. V., Varlamov, A. A., Pavlov, S. V., and Makhnev, V. P. (2004). Analysis of evoked EEG synchronization and desynchronization in conditions of emotional activation in humans: temporal and topographic characteristics. Neuroscience and behavioral physiology, 34(8), pages 859-867
Choppin, A. (2000). EEG-based human interface for disabled individuals: Emotion expression with neural networks. Unpublished master’s thesis. Information processing, Tokyo institute of technology, Yokohama, Japan.
Gray, E. and Watson, D. (2007). Assessing positive and negative affect via self-report. In J. A. Coan, and J. J. B. Allen (Eds.), Handbook of Emotion Elicitation and Assessment. New York: Oxford University Press, USA
Keil, A., Müller, M. M., Gruber, T., Wienbruch, C., Stolarova, M., and Elbert, T. (2001). Effects of emotional arousal in the cerebral hemispheres: a study of oscillatory brain activity and event-related potentials. Clinical Neurophysiology, 112(11), pages 2057-2068
Olbrich S, Sander C, Matschinger H, Mergl R, Trenner M, Schönknecht P, et al. Brain and body: associations between EEG-vigilance and the autonomous nervous system activity during rest. J Psychophysiol 2011;25:190–200.
R. B. Silberstein, M. A. Schier, A. Pipingas, J. Ciorciari, S. R. Wood, and D. G. Simpson, “Steady state visually evoked potential topography associated with a visual vigilance task,” Brain Topogr., vol. 3, no. 2, pp. 337–347, 1990.
N. Torro Alves, S. S. Fukusima, J. Antonio Aznar-Casanova, “Models of brain asymmetry in emotional processing,” Psychology & Neuroscience, Vol. 1, No. 1, pp. 63-66, 2008.
Davidson, R. J. 2000. Affective style, psychopathology, and resilience: Brain mechanisms and plasticity. Am. Psychol. 55: 1196–1214.
Bradley, M. M. & Lang, P. J. (2007). The International Affective Picture System (IAPS) in the study of emotion and attention. In J. A. Coan and J. J. B. Allen (Eds.), Handbook of Emotion Elicitation and Assessment (pp. 29-46). Oxford University Press.
Farwell, Lawrence Ashley, and Emanuel Donchin. “Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials.” Electroencephalography and clinical Neurophysiology 70.6 (1988): 510-523.
Dan-Glauser, E. S., & Scherer, K. R. (2011). The Geneva affective picture database (GAPED): a new 730-picture database focusing on valence and normative significance. Behavior Research Methods, 43(2), 468-477. doi: 10.3758/s13428-011-0064-1