PLoS ONE
The relevant resting-state brain activity of ecological microexpression recognition test (EMERT)

Competing Interests: The authors have declared that no competing interests exist.

Article Type: research-article
Abstract

Zhang et al. (2017) established the ecological microexpression recognition test (EMERT), but it used only white models’ expressions as microexpressions and backgrounds, and no research had detected its relevant brain activity. The current study used white, black and yellow models’ expressions as microexpressions and backgrounds to improve the ecological validity of the EMERT materials, and it used eyes-closed and eyes-open resting-state fMRI to detect the relevant brain activity of the EMERT for the first time. The results showed: (1) Two new recapitulative indexes of the EMERT were adopted, namely microexpression M and microexpression SD. The participants could effectively identify almost all the microexpressions, and each microexpression type showed a significant background effect. The EMERT had good retest reliability and criterion validity. (2) ALFFs (amplitudes of low-frequency fluctuations) in both eyes-closed and eyes-open resting-states, as well as the ALFFs-difference, could predict microexpression M. The relevant brain areas of microexpression M included parts of the frontal lobe, insula, cingulate cortex, hippocampus, parietal lobe, caudate nucleus, thalamus, amygdala, occipital lobe, fusiform gyrus, temporal lobe, cerebellum and vermis. (3) ALFFs in both eyes-closed and eyes-open resting-states and the ALFFs-difference could predict microexpression SD, and the ALFFs-difference was more predictive. The relevant brain areas of microexpression SD included parts of the frontal lobe, insula, cingulate cortex, cuneus, amygdala, fusiform gyrus, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis. (4) There were many similarities and some differences in the relevant brain areas between microexpression M and SD. All these brain areas could be trained to enhance ecological microexpression recognition ability.

Yin, Zhang, Shu, Liu, and Wang: The relevant resting-state brain activity of ecological microexpression recognition test (EMERT)

1 Introduction

1.1 The ecological microexpression recognition test (EMERT)

Microexpressions are very transitory expressions lasting about 1/25–1/2 s, which can reveal the true emotions people try to hide or suppress [1, 2]. Matsumoto et al. [3] developed the Japanese and Caucasian Brief Affect Recognition Test (JACBART, classical microexpression recognition) to measure microexpression recognition. Participants saw a microexpression presented briefly between two neutral-expression backgrounds, each shown for 2000 ms before or after it, and needed to identify the microexpression type. The neutral-expression backgrounds eliminated the visual aftereffects of the microexpressions, but the test did not examine the influence of backgrounds with emotional expressions. Therefore, Zhang, Fu, Chen and Fu [4] explored the background effect on microexpressions and found that recognition accuracies of all microexpressions (anger, disgust, fear, surprise and happiness) under negative (sadness) backgrounds were significantly lower than those under positive (happiness) or neutral backgrounds; when the backgrounds and the microexpressions were consistent in valence (negative or positive), microexpression recognition accuracies were significantly lower than when they were inconsistent. This research broke through the JACBART paradigm, but it did not explore all backgrounds or all microexpressions and needed to be developed further.

Yin, Zhang, Shi, and Liu [5] first proposed that all basic expression kinds, for both backgrounds and microexpressions, needed to be tested to set up an ecological microexpression recognition test. Therefore, Zhang et al. [6] examined the recognition characteristics of six basic kinds of microexpressions (sadness, fear, anger, disgust, surprise, happiness) under seven basic kinds of expression backgrounds (the six basic expressions and neutral) to establish the ecological microexpression recognition test (EMERT), and found that the EMERT had good retest reliability, criterion validity and ecological validity: (1) The EMERT was generally significantly related to the JACBART. (2) The background main effects of sadness, fear, anger and disgust microexpressions were significant; the background main effects of surprise and happiness microexpressions were not significant, but both differed widely from the corresponding common expressions. (3) Ecological microexpression recognition showed stable fluctuation. Zhu et al. [7] used a simplified edition of the EMERT to find microexpression recognition differences between depressive patients and normal people. Yin, Tian, Hua, Zhang, and Liu [8] extended the EMERT to the WEMERT (weak ecological microexpression recognition test).

But the EMERT by Zhang et al. [6] used only white models’ expressions as microexpressions and backgrounds, so white, black and yellow models’ expressions need to be used as microexpressions and backgrounds to improve the ecological validity of the materials.

1.2 Brain activation of ecological microexpression recognition

Few published studies have examined the brain activation of ecological microexpression recognition. Shen [9] in Xiaolan Fu's team used fNIRS to find that the brain area responsible for JACBART microexpression recognition was in the left frontal lobe, whereas the brain area responsible for common expression recognition was in the right frontal lobe. Zhang [10] in Xiaolan Fu's team used fMRI to find that for anger and neutral microexpressions, the inferior parietal lobule was activated more in negative expression backgrounds than in neutral expression backgrounds, while the right precuneus was activated more in positive expression backgrounds than in neutral expression backgrounds; for happiness microexpressions, the parahippocampal gyrus was activated more in positive backgrounds. These studies revealed the brain mechanisms of classical microexpression recognition and of three kinds of ecological microexpression recognition, but the other ecological microexpressions need further research.

As there were 36 kinds of ecological microexpressions in the EMERT [6], it is neither feasible nor economical to adopt task-state fMRI; resting-state fMRI is a viable and economical option. Resting-state fMRI investigates spontaneous activity or functional connections within the brain at rest. If a certain cognitive task is associated with brain areas that are active in the resting state, then these brain areas are associated with the cognitive task; if the resting-state brain areas related to two cognitive tasks differ, then the brain mechanisms underlying these two tasks differ [11–13]. Spontaneous brain activity in the resting state is a stable index of individual cognitive characteristics [14]. One of the classic indexes is the ALFFs value (amplitude of low-frequency fluctuations, 0.01–0.1 Hz), a band thought to cover most psychological and cognitive processes; higher and lower frequencies mainly reflect background noise such as physiological activity. There are eyes-closed and eyes-open resting-states. Nakano et al. [15, 16] found that with eyes closed, subjects focused on internal feeling and self-consciousness, while with eyes open, subjects turned to external stimulus processing, so the transition from eyes-closed to eyes-open is a transition from internal feeling and self-consciousness to external stimulus processing. However, there has been no microexpression research using resting-state fMRI.
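As an illustrative sketch of the ALFFs index (not the toolbox implementation used in resting-state studies, which works on preprocessed images), the amplitude of a single time series in the 0.01–0.1 Hz band can be computed as:

```python
import numpy as np

def alff(timeseries, tr=2.0, band=(0.01, 0.1)):
    """Mean amplitude of the FFT spectrum within the low-frequency band.

    timeseries: one voxel's (or region's) BOLD signal; tr: sampling interval in s.
    """
    ts = np.asarray(timeseries, dtype=float)
    ts = ts - ts.mean()                      # remove the DC component
    freqs = np.fft.rfftfreq(len(ts), d=tr)   # frequency axis for sampling interval tr
    amp = np.abs(np.fft.rfft(ts)) / len(ts)  # single-sided amplitude spectrum
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(amp[mask].mean())

# A signal oscillating at 0.05 Hz (inside the band) yields a much larger ALFF
# than one at 0.2 Hz (outside the band); 240 volumes at TR = 2 s:
t = np.arange(240) * 2.0
in_band, out_band = np.sin(2 * np.pi * 0.05 * t), np.sin(2 * np.pi * 0.2 * t)
```

Fluctuations inside the band dominate the index, while faster physiological noise contributes little, which is the rationale for the 0.01–0.1 Hz filter.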

1.3 Improvements made in the current study

The current study used white, black and yellow models’ expressions as microexpressions and backgrounds to improve the ecological validity of the EMERT materials, and used eyes-closed and eyes-open resting-state fMRI to detect the relevant resting-state brain activity of the EMERT for the first time.

2 Methods

2.1 Participants

Sixty-five college students (32 males, 33 females; age M ± SD = 21.71 ± 2.58 years) participated in the study. All were right-handed with normal or corrected-to-normal eyesight and without colour blindness. They all volunteered, could quit at any time, completed an informed consent form before the experiments, and received corresponding rewards after completing them. The experiments complied with the ethical guidelines of the Declaration of Helsinki and were approved by the Scientific Review Committee of the Faculty of Psychology, Southwest University, China.

2.2 Experimental apparatus and materials

Open-mouthed pictures of seven kinds of basic expressions from eight models (four male and four female, including white, black and yellow people) in the NimStim face expression database [17] were used as the backgrounds, namely neutral, sadness, fear, anger, disgust, surprise, and happiness. Except for the neutral expression, the other six kinds of expressions were also used as microexpressions. All images were resized to 338 × 434 pixels on a grey background (RGB: 127, 127, 127) [6]. A custom experimental program ran under E-prime 2.0 on a PC (Lenovo LX-GJ556D) with a 17-inch colour display (resolution 1024 × 768, refresh rate 60 Hz).

2.3 Experimental design and procedures

The experiment was a 7 (expression backgrounds: neutral vs sadness vs fear vs anger vs disgust vs surprise vs happiness) × 6 (microexpressions: sadness vs fear vs anger vs disgust vs surprise vs happiness) × 2 (test times: first EMERT vs second EMERT) within-subject design. As there were seven kinds of expression backgrounds, a Latin square design was used to balance sequence effects, yielding seven groups with about nine participants (four or five of each sex) in each group. Each dependent variable was averaged across the seven groups in the result analysis [6].
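The particular 7 × 7 Latin square is not reported; a minimal sketch of one standard (cyclic) construction for counterbalancing the seven background orders might look like:

```python
def latin_square(n):
    """Cyclic Latin square: row i is the sequence 0..n-1 rotated by i."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

BACKGROUNDS = ["neutral", "sadness", "fear", "anger",
               "disgust", "surprise", "happiness"]

# One background order per group: each of the seven groups of ~9 participants
# is assigned one row, so every background occupies every serial position once.
GROUP_ORDERS = [[BACKGROUNDS[k] for k in row] for row in latin_square(7)]
```

A cyclic square balances simple position effects; it does not balance first-order carry-over, for which a balanced (Williams-type) design would be needed.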

Participants sat 70 cm from the screen. On the computer keyboard, the six keys S, D, F, J, K and L corresponded to ‘anger’, ‘disgust’, ‘fear’, ‘sadness’, ‘surprise’ and ‘happiness’. Before the experiment, the participants were asked to put the ring finger, middle finger and index finger of their left hand on the S, D and F keys, and the index finger, middle finger and ring finger of their right hand on the J, K and L keys. They then practised key pressing: first, one of the six kinds of expressions (all but neutral) was presented for 1000 ms; then the six labels “anger, disgust, fear, sadness, surprise, happiness” appeared on the screen, and the participants needed to recognise the expression and press the corresponding key as accurately as possible. There were 30 trials, with the six kinds of expressions pseudo-randomly presented five times each.

After the key-pressing practice was completed, the experimenter explained the procedure. First, the centre of the screen showed a “+” for 400 ms; second, an empty screen lasted 200 ms; then the front background expression image was presented for 800 ms, after which the microexpression image appeared for 133 ms, followed by the back background expression image for 800 ms [3, 6]. The front and back backgrounds and the microexpression were of the same model’s face, and the front and back backgrounds were identical. Participants needed to identify the briefly presented microexpression between the front and back backgrounds. Then the six labels “anger, disgust, fear, sadness, surprise, happiness” appeared on the screen, and the participants were asked to press a key according to the microexpression they saw as accurately as possible rather than as quickly as possible (no time limit). After the key press, an empty screen was shown for 1000 ms; then the fixation point “+” was presented for 400 ms, and the next trial started. The experiment procedure is shown in Fig 1.
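The fixed portion of this trial structure can be summarised as a simple event table (a sketch for clarity; the study ran under E-prime, and the names here are illustrative):

```python
# (event, duration in ms); None marks the untimed response screen.
TRIAL_TIMELINE = [
    ("fixation +", 400),
    ("blank screen", 200),
    ("front background expression", 800),
    ("microexpression", 133),
    ("back background expression", 800),
    ("response labels", None),          # no time limit on the response
    ("post-response blank", 1000),
]

# Total fixed (non-response) duration of one trial in ms:
fixed_ms = sum(d for _, d in TRIAL_TIMELINE if d is not None)
```

The stimulus stream from fixation onset to the end of the back background thus lasts 2333 ms per trial, with the 133 ms microexpression embedded between the two 800 ms backgrounds.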

Fig 1. The picture of the experiment procedure.

Note: These images are licensed by the copyright owner, Tottenham et al [17].

The participants practised the experimental procedure after understanding the instructions. There were 14 practice trials, in which each of the 7 kinds of backgrounds appeared 2 times and each of the 6 kinds of microexpressions appeared 2 to 3 times; the participants were asked to determine the type of the microexpression. After the procedure practice was completed, they started the formal experiment. To allow the participants enough rest, the experiment was divided into seven blocks, with a 1-minute rest between blocks. The experiment had 7 (backgrounds) × 6 (microexpressions) × 8 (models) = 336 trials.

The participants completed the EMERT twice, with an interval of at least one day between the two measurements.

2.4 Resting-state data collection and analysis

The fMRI data were collected using a Siemens 3.0 T magnetic resonance imaging scanner with an 8-channel phased-array head coil. Eyes-closed and eyes-open resting-state imaging used a gradient-echo (GRE) single-shot echo-planar imaging (EPI) sequence. Scan parameters were as follows: TR = 2000 ms, TE = 30 ms, FA = 90°, FOV = 220 × 220 mm2, matrix size = 64 × 64, slice thickness = 3 mm, in-plane resolution = 3.13 × 3.13 mm2, interleaved scanning, 33 slices, slice gap = 0.6 mm, 240 volumes in total. Structural imaging used a 3D T1WI (MP-RAGE) sequence with sagittal scans. Scan parameters were as follows: TR = 2600 ms, TE = 3.02 ms, FA = 8°, no gap, FOV = 256 × 256 mm2, matrix size = 256 × 256, 176 slices in total. All the participants first received the structural scan; then half received the eyes-closed scan before the eyes-open scan, and half received them in the reverse order.

Preprocessing and analysis of the resting-state data used DPARSF 3.0 Advanced Edition [18] (“Calculate in Original Space (Warp by DARTEL)”), following standard procedures: (1) Raw DICOM-format data were converted to NIFTI format; to allow the signal to stabilise, the first 10 TR images were removed, after which slice-timing correction and head-motion correction (realignment) were conducted. If head motion greater than 2 mm occurred during a resting-state scan, the data were deleted. (2) New Segment + DARTEL was used to segment the structural T1 data without standardisation, and the segmented T1 data were registered directly to the resting-state functional images. Before registration of structural and functional data, the AC-PC lines of each participant's T1 image and resting-state functional images were aligned, and then automatic registration was applied; therefore, the resting-state analysis took place in the original T1 space. (3) Head motion (using the Friston 24-parameter model), linear drift, white matter and cerebrospinal fluid signals were regressed out. (4) ALFFs in the low-frequency band (0.01 to 0.1 Hz) were calculated. (5) The resting-state functional images were registered to the standard MNI space (normalisation) with a 3 × 3 × 3 mm3 voxel size and smoothed with a 4 × 4 × 4 mm3 full-width-at-half-maximum (FWHM) kernel.
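Step (3) above, regressing nuisance signals out of each voxel's time series, can be sketched with ordinary least squares (a simplified stand-in for the DPARSF implementation; variable names are illustrative):

```python
import numpy as np

def regress_nuisance(ts, confounds):
    """Return the residual of ts after regressing out confound time series.

    ts: (n_volumes,) voxel signal; confounds: (n_volumes, k) matrix of
    nuisance regressors (e.g. the 24 Friston motion parameters, white-matter
    and CSF signals, and a linear trend).
    """
    X = np.column_stack([np.ones(len(ts)), confounds])  # add an intercept
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)       # OLS fit
    return ts - X @ beta                                # residual signal
```

The residual is orthogonal to every regressor, so subsequent ALFF computation reflects signal not attributable to motion, drift, or physiological noise.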

REST 1.8 [19] was first used to extract the amplitudes of low-frequency fluctuations (ALFFs) during both resting-states in the 116 Anatomical Automatic Labeling (AAL) brain areas. Second, SPSS 19.0 was used to run correlation analyses between the ALFFs in the 116 AAL brain areas and the scores of the EMERT. The ALFFs-difference of eyes-open minus eyes-closed was used as an index of the transition from internal feeling and self-consciousness to external stimulus processing, and its psychological significance was examined by correlating it with the scores of the EMERT.
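A minimal sketch of this screening step (the study used SPSS; the array shapes and the significance threshold are illustrative, with |r| ≈ 0.25 corresponding roughly to p < .05 for n ≈ 60):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient of two 1-D arrays."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

def screen_regions(alff_matrix, emert_score, r_crit=0.25):
    """Indices of AAL regions whose ALFFs correlate with an EMERT score.

    alff_matrix: (n_subjects, 116) region ALFFs; emert_score: (n_subjects,).
    """
    return [j for j in range(alff_matrix.shape[1])
            if abs(pearson_r(alff_matrix[:, j], emert_score)) >= r_crit]
```

The ALFFs-difference index is simply the eyes-open matrix minus the eyes-closed matrix, computed per subject and region before the same screening.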

3 Results

SPSS 19.0 was used for statistics. There were sixty-five valid participants in the EMERT, fifty-eight valid participants in the eyes-closed resting-state, and sixty-two valid participants in the eyes-open resting-state, because seven participants’ head movements were greater than 2 mm in the eyes-closed resting-state and three participants’ in the eyes-open resting-state.

3.1 Behavioral data

The scores of EMERT are shown in Table 1. Because the accuracy of microexpression recognition in the second EMERT might contain a training effect, the accuracy in the first EMERT was taken as the measure of microexpression recognition ability. Since the participants had six keys to choose from on each trial, the chance level was 1/6. A single-sample t test compared each microexpression recognition accuracy with the chance level of 1/6; almost all the accuracies in the first EMERT were significantly higher than chance (ps<0.001), except that fear under surprise was not (p>0.05).
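From the summary statistics, the reported t and Cohen's d values can be reproduced (a sketch; the study used SPSS on the raw accuracies, so values computed from the rounded M and SD differ slightly from the tabled ones):

```python
import math

def one_sample_test(mean, sd, n, mu=1/6):
    """t statistic and Cohen's d for a one-sample test against chance level mu."""
    t = (mean - mu) / (sd / math.sqrt(n))
    d = (mean - mu) / sd
    return t, d

# The sadness row of Table 1 (M = 0.60, SD = 0.31, n = 65):
t, d = one_sample_test(0.60, 0.31, 65)   # t ≈ 11.3, d ≈ 1.40
```

This reproduces the tabled d = 1.40 exactly and the tabled t = 11.04 up to the rounding of M and SD.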

Table 1
The scores of EMERT.
microexpressions | first EMERT M±SD (n = 65) | t | Cohen's d | second EMERT M±SD (n = 65)
sadness | 0.60±0.31 | 11.04*** | 1.40 | 0.74±0.27
fear | 0.50±0.32 | 8.35*** | 1.04 | 0.57±0.33
anger | 0.68±0.28 | 14.47*** | 1.83 | 0.78±0.25
disgust | 0.60±0.32 | 10.87*** | 1.35 | 0.74±0.31
surprise | 0.78±0.27 | 18.18*** | 2.27 | 0.78±0.31
happiness | 0.90±0.24 | 24.35*** | 3.06 | 0.92±0.23
sadness under fear | 0.29±0.24 | 4.20*** | 0.51 | 0.39±0.29
sadness under anger | 0.34±0.28 | 4.87*** | 0.62 | 0.44±0.30
sadness under disgust | 0.28±0.22 | 4.15*** | 0.52 | 0.37±0.25
sadness under neutral | 0.38±0.24 | 7.26*** | 0.89 | 0.43±0.25
sadness under surprise | 0.47±0.28 | 8.84*** | 1.08 | 0.56±0.30
sadness under happiness | 0.34±0.25 | 5.84*** | 0.69 | 0.43±0.27
fear under sadness | 0.28±0.18 | 4.99*** | 0.63 | 0.28±0.23
fear under anger | 0.27±0.20 | 4.25*** | 0.52 | 0.30±0.21
fear under disgust | 0.29±0.21 | 4.76*** | 0.59 | 0.38±0.26
fear under neutral | 0.34±0.21 | 6.93*** | 0.83 | 0.36±0.25
fear under surprise | 0.18±0.20 | 0.46 | - | 0.19±0.21
fear under happiness | 0.44±0.25 | 9.08*** | 1.09 | 0.39±0.26
anger under sadness | 0.72±0.25 | 17.96*** | 2.21 | 0.76±0.24
anger under fear | 0.70±0.27 | 15.97*** | 1.98 | 0.72±0.26
anger under disgust | 0.49±0.29 | 9.24*** | 1.11 | 0.55±0.29
anger under neutral | 0.76±0.23 | 20.65*** | 2.58 | 0.75±0.24
anger under surprise | 0.70±0.29 | 14.64*** | 1.84 | 0.73±0.30
anger under happiness | 0.62±0.26 | 13.88*** | 1.74 | 0.66±0.26
disgust under sadness | 0.47±0.23 | 10.93*** | 1.32 | 0.56±0.25
disgust under fear | 0.55±0.27 | 11.60*** | 1.42 | 0.59±0.27
disgust under anger | 0.40±0.26 | 7.46*** | 0.90 | 0.54±0.28
disgust under neutral | 0.54±0.27 | 10.97*** | 1.38 | 0.62±0.25
disgust under surprise | 0.44±0.23 | 9.54*** | 1.19 | 0.49±0.28
disgust under happiness | 0.49±0.25 | 10.40*** | 1.29 | 0.59±0.24
surprise under sadness | 0.67±0.24 | 16.93*** | 2.10 | 0.66±0.25
surprise under fear | 0.73±0.25 | 18.37*** | 2.25 | 0.72±0.29
surprise under anger | 0.67±0.29 | 14.21*** | 1.74 | 0.61±0.30
surprise under disgust | 0.65±0.28 | 13.82*** | 1.73 | 0.62±0.28
surprise under neutral | 0.78±0.24 | 20.38*** | 2.56 | 0.80±0.25
surprise under happiness | 0.72±0.30 | 14.80*** | 1.84 | 0.72±0.29
happiness under sadness | 0.88±0.25 | 22.89*** | 2.85 | 0.90±0.21
happiness under fear | 0.86±0.25 | 22.26*** | 2.77 | 0.89±0.23
happiness under anger | 0.85±0.29 | 19.16*** | 2.36 | 0.88±0.25
happiness under disgust | 0.83±0.29 | 18.72*** | 2.29 | 0.87±0.25
happiness under neutral | 0.91±0.24 | 24.98*** | 3.10 | 0.95±0.15
happiness under surprise | 0.85±0.28 | 19.30*** | 2.44 | 0.90±0.24

Note:

* p < 0.05

** p < 0.01

*** p < 0.001. The same below.

The EMERT yielded too many indexes to be recapitulative enough for either participants or researchers. Therefore, the mean accuracy of a microexpression type under the six backgrounds (excluding the background showing the same expression as the microexpression, because in that case it was a normal expression rather than a microexpression) was used as the index of recognition of this microexpression type, abbreviated as microexpression M. The standard deviation of the accuracies of this microexpression type under the six backgrounds (again excluding the same-expression background) was used as the background-effect index of this microexpression type, called the fluctuation of recognition of the microexpression type [6, 8] and abbreviated as microexpression SD.
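The two indexes can be stated precisely as follows (a sketch; the dict layout is illustrative, and the use of the sample SD, ddof=1, is an assumption):

```python
import numpy as np

MICROS = ["sadness", "fear", "anger", "disgust", "surprise", "happiness"]
BACKS = MICROS + ["neutral"]

def micro_indexes(acc):
    """acc: dict mapping (microexpression, background) -> recognition accuracy.

    Returns {microexpression: (M, SD)}, where M and SD are taken over the six
    backgrounds excluding the one matching the microexpression (that pairing
    shows an ordinary expression, not a microexpression).
    """
    out = {}
    for m in MICROS:
        vals = [acc[(m, b)] for b in BACKS if b != m]
        out[m] = (float(np.mean(vals)), float(np.std(vals, ddof=1)))
    return out
```

Each microexpression thus contributes exactly six accuracies (five emotional backgrounds plus neutral) to its M and SD.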

Thus, two new recapitulative indexes of the EMERT were obtained. A single-sample t test compared each microexpression M with the chance level of 1/6, and all were significantly higher than chance (ps<0.001). A single-sample t test compared each microexpression SD with 0, and all were significantly higher than 0 (ps<0.001).

Pearson correlations were computed between the two EMERTs. Each microexpression M in the first EMERT was significantly positively related to the corresponding one in the second EMERT, and the rs (plural of r, the same below) were high; each microexpression SD except surprise SD in the first EMERT was significantly positively related to the corresponding one in the second EMERT.

Pearson correlations were also computed between the first EMERT and the first JACBART (the microexpressions under neutral backgrounds within the first EMERT), and each microexpression M in the first EMERT was significantly positively related to the corresponding microexpression in the first JACBART. The new indexes of the two EMERTs and their statistical results are shown in Table 2.

Table 2
The new scores of EMERT.
microexpressions | first EMERT M±SD (n = 53) | t | Cohen's d | second EMERT M±SD (n = 53) | r1 | first JACBART M±SD (n = 53) | r2
sadness M | 0.35±0.20 | 7.59*** | 0.92 | 0.43±0.24 | 0.79** | 0.38±0.24 | 0.68**
fear M | 0.30±0.13 | 8.07*** | 1.03 | 0.32±0.18 | 0.66** | 0.34±0.21 | 0.52**
anger M | 0.67±0.22 | 18.19*** | 2.29 | 0.69±0.22 | 0.83** | 0.76±0.23 | 0.77**
disgust M | 0.48±0.21 | 12.13*** | 1.49 | 0.56±0.23 | 0.82** | 0.54±0.27 | 0.90**
surprise M | 0.70±0.22 | 19.98*** | 2.42 | 0.69±0.24 | 0.79** | 0.78±0.24 | 0.79**
happiness M | 0.86±0.24 | 23.32*** | 2.89 | 0.90±0.20 | 0.91** | 0.91±0.24 | 0.81**
sadness SD | 0.17±0.07 | 18.59*** | 2.43 | 0.16±0.07 | 0.43** | |
fear SD | 0.18±0.07 | 20.78*** | 2.57 | 0.17±0.06 | 0.41** | |
anger SD | 0.17±0.07 | 19.21*** | 2.43 | 0.15±0.09 | 0.55** | |
disgust SD | 0.15±0.06 | 19.20*** | 2.50 | 0.13±0.06 | 0.26* | |
surprise SD | 0.16±0.08 | 15.48*** | 3.20 | 0.15±0.07 | - | |
happiness SD | 0.08±0.1 | 6.95*** | 0.80 | 0.07±0.09 | 0.68** | |

Note: r1 was the r between first and second EMERT. r2 was the r between first EMERT and first JACBART.

3.2 Brain imaging data

Pearson correlation analyses were conducted between the ALFFs of the resting-states and microexpression M (see Table 3 and Fig 2). (1) In the eyes-closed resting-state, ALFFs in the frontal lobe, insula, cingulate cortex, hippocampus, caudate nucleus, thalamus and vermis were significantly correlated with some microexpression Ms. (2) In the eyes-open resting-state, ALFFs in the frontal lobe, insula, cingulate cortex, hippocampus, parietal lobe, caudate nucleus, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression Ms. (3) For the ALFFs-difference of eyes-open minus eyes-closed resting-states, the differences in the frontal lobe, insula, amygdala, occipital lobe, fusiform gyrus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression Ms.

Fig 2. AAL brain areas whose ALFFs in eyes-closed and eyes-open resting-states and ALFFs-difference were related to microexpression M.

Note: The brain areas were visualised with the BrainNet Viewer (http://www.nitrc.org/projects/bnv/) [20], the same below.

Table 3
The rs between ALFFs of resting-state and microexpression M.
resting-state | AAL brain area | ALFF (M±SD) | significant rs (across the sadness M, fear M, anger M, disgust M, surprise M and happiness M columns)
eyes-closed | Precentral_L | 0.84±0.05 | -0.39**
eyes-closed | Precentral_R | 0.88±0.07 | -0.294*
eyes-closed | Frontal_Inf_Tri_R | 0.78±0.04 | -0.31*
eyes-closed | Frontal_Inf_Orb_L | 0.88±0.06 | 0.31*
eyes-closed | Rolandic_Oper_R | 0.86±0.03 | 0.27*, 0.30*
eyes-closed | Insula_L | 0.91±0.04 | 0.28*
eyes-closed | Insula_R | 0.96±0.05 | 0.28*
eyes-closed | Cingulum_Ant_L | 0.98±0.06 | 0.27*
eyes-closed | Cingulum_Mid_L | 0.94±0.03 | 0.33*, 0.29*, 0.30*
eyes-closed | Cingulum_Mid_R | 0.92±0.03 | 0.33*, 0.30*
eyes-closed | Cingulum_Post_L | 0.98±0.05 | 0.26*, 0.262*
eyes-closed | Cingulum_Post_R | 0.93±0.04 | 0.30*
eyes-closed | ParaHippocampal_L | 1.11±0.09 | 0.27*
eyes-closed | Caudate_R | 0.86±0.05 | -0.28*, -0.27*
eyes-closed | Thalamus_L | 1.01±0.09 | -0.29*, -0.36**, -0.37**
eyes-closed | Thalamus_R | 1.01±0.08 | -0.26*, -0.39**, -0.45**, -0.39**
eyes-closed | Vermis_6 | 0.94±0.07 | -0.29*
eyes-closed | Vermis_7 | 0.81±0.07 | -0.32*, -0.26*, -0.41**
eyes-closed | Vermis_10 | 2.28±0.67 | -0.29*, -0.29*
eyes-open | Precentral_L | 0.82±0.04 | -0.38**
eyes-open | Precentral_R | 0.85±0.05 | -0.28*
eyes-open | Frontal_Sup_L | 0.87±0.05 | 0.27*
eyes-open | Frontal_Sup_Orb_R | 0.83±0.07 | -0.33**
eyes-open | Frontal_Inf_Oper_R | 0.89±0.04 | 0.26*
eyes-open | Frontal_Inf_Orb_L | 0.89±0.05 | 0.31*, 0.27*
eyes-open | Frontal_Inf_Orb_R | 0.81±0.04 | 0.25*, 0.27*, 0.28*
eyes-open | Rolandic_Oper_R | 0.86±0.03 | 0.37**, 0.39**, 0.27*, 0.26*
eyes-open | Frontal_Sup_Medial_L | 0.97±0.07 | 0.31*, 0.31*, 0.25*, 0.32*, 0.32*
eyes-open | Frontal_Sup_Medial_R | 0.94±0.06 | 0.27*
eyes-open | Insula_L | 0.92±0.03 | 0.34**
eyes-open | Insula_R | 0.97±0.04 | 0.30*, 0.34**, 0.31*, 0.25*
eyes-open | Cingulum_Ant_L | 0.99±0.06 | 0.27*, 0.26*, 0.29*, 0.31*
eyes-open | Cingulum_Mid_L | 0.93±0.03 | 0.35**, 0.30*
eyes-open | Cingulum_Post_L | 1±0.05 | 0.32*, 0.26*, 0.32*, 0.35**
eyes-open | Cingulum_Post_R | 0.94±0.03 | 0.33**
eyes-open | ParaHippocampal_L | 1.12±0.08 | 0.33**
eyes-open | Postcentral_R | 0.84±0.05 | -0.30*
eyes-open | Parietal_Inf_R | 1.06±0.06 | 0.28*
eyes-open | Caudate_R | 0.87±0.05 | -0.25*
eyes-open | Thalamus_L | 1±0.08 | -0.29*, -0.32*, -0.33**
eyes-open | Thalamus_R | 1±0.07 | -0.29*, -0.42**, -0.44**, -0.40**
eyes-open | Temporal_Mid_R | 0.98±0.03 | 0.25*
eyes-open | Cerebelum_Crus1_L | 0.95±0.09 | -0.31*, -0.37**
eyes-open | Cerebelum_Crus2_L | 0.63±0.22 | -0.29*
eyes-open | Cerebelum_9_R | 0.79±0.28 | -0.32*
eyes-open | Vermis_7 | 0.82±0.06 | -0.29*
eyes-open | Vermis_10 | 2.25±0.68 | -0.31*, -0.30*, -0.26*
difference | Supp_Motor_Area_L | -0.02±0.04 | 0.30*
difference | Olfactory_L | 0.01±0.03 | 0.32*, 0.40**
difference | Insula_R | 0.01±0.02 | 0.29*, 0.38**
difference | Amygdala_R | 0.01±0.04 | 0.27*
difference | Calcarine_L | -0.08±0.1 | -0.29*
difference | Calcarine_R | -0.07±0.08 | -0.30*
difference | Lingual_L | -0.06±0.08 | -0.32*
difference | Occipital_Sup_R | 0±0.05 | -0.32*
difference | Occipital_Mid_L | 0.01±0.04 | -0.30*
difference | Occipital_Inf_L | 0.01±0.05 | -0.34**
difference | Fusiform_L | 0.01±0.02 | -0.27*
difference | Heschl_R | -0.04±0.06 | 0.34**
difference | Temporal_Pole_Sup_L | 0.01±0.05 | 0.33*, 0.32*, 0.30*
difference | Temporal_Pole_Sup_R | 0±0.05 | 0.34*, 0.28*, 0.33*, 0.32*
difference | Temporal_Mid_L | 0±0.02 | 0.31*, 0.28*
difference | Temporal_Pole_Mid_R | 0±0.03 | 0.28*
difference | Cerebelum_Crus1_L | -0.01±0.05 | -0.27*, -0.31*
difference | Cerebelum_Crus2_L | -0.01±0.09 | -0.29*, -0.27*
difference | Cerebelum_6_L | -0.02±0.05 | -0.34*
difference | Cerebelum_7b_L | -0.01±0.09 | -0.28*, -0.30*
difference | Cerebelum_7b_R | 0.01±0.1 | -0.26*
difference | Cerebelum_8_L | 0±0.11 | -0.31*
difference | Vermis_6 | -0.01±0.04 | 0.26*
difference | Vermis_7 | 0.01±0.03 | 0.28*, 0.31*, 0.30*
difference | Vermis_9 | -0.01±0.09 | -0.29*

Pearson correlation analyses were conducted between the ALFFs of the resting-states and microexpression SD (see Table 4 and Fig 3). (1) In the eyes-closed resting-state, ALFFs in the frontal lobe, insula, cingulate cortex, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs. (2) In the eyes-open resting-state, ALFFs in the frontal lobe, insula, cingulate cortex, cuneus, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs. (3) For the ALFFs-difference of eyes-open minus eyes-closed resting-states, the differences in the frontal lobe, insula, cingulate cortex, amygdala, fusiform gyrus, occipital lobe, parietal lobe, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs.

Fig 3. AAL brain areas whose ALFFs in eyes-closed and eyes-open resting-states and ALFFs-difference were related to microexpression SD.

Table 4
The rs between ALFFs of resting-state and microexpression SD.
resting-state | AAL brain area | ALFF (M±SD) | significant rs (across the sadness SD, fear SD, anger SD, disgust SD, surprise SD and happiness SD columns)
eyes-closed | Frontal_Mid_R | 0.86±0.05 | -0.29*
eyes-closed | Frontal_Inf_Tri_L | 0.86±0.04 | 0.30*
eyes-closed | Frontal_Inf_Tri_R | 0.78±0.04 | 0.26*, 0.27*, 0.26*
eyes-closed | Supp_Motor_Area_L | 1.02±0.07 | -0.34**
eyes-closed | Supp_Motor_Area_R | 1.03±0.08 | -0.28*
eyes-closed | Insula_L | 0.91±0.04 | 0.37**
eyes-closed | Insula_R | 0.96±0.05 | 0.28*
eyes-closed | Cingulum_Ant_R | 0.88±0.05 | -0.30*
eyes-closed | Cingulum_Post_L | 0.98±0.05 | 0.28*
eyes-closed | Calcarine_R | 1.01±0.1 | 0.31*, -0.26*
eyes-closed | Cuneus_R | 1.11±0.13 | 0.30*
eyes-closed | Lingual_L | 1.07±0.1 | 0.37**
eyes-closed | Lingual_R | 1.06±0.09 | 0.30*
eyes-closed | Occipital_Sup_L | 0.92±0.08 | 0.33*, -0.28*
eyes-closed | Occipital_Sup_R | 0.89±0.06 | 0.33*
eyes-closed | Occipital_Mid_L | 0.95±0.06 | 0.30*
eyes-closed | Occipital_Inf_L | 0.92±0.07 | 0.27*
eyes-closed | Occipital_Inf_R | 0.93±0.07 | 0.35**
eyes-closed | Parietal_Inf_L | 0.99±0.05 | -0.34**
eyes-closed | SupraMarginal_L | 0.89±0.04 | 0.37**
eyes-closed | Precuneus_L | 1.08±0.05 | -0.28*
eyes-closed | Caudate_L | 0.95±0.1 | -0.34*
eyes-closed | Putamen_L | 0.79±0.04 | 0.33*
eyes-closed | Pallidum_L | 0.83±0.05 | 0.30*
eyes-closed | Thalamus_L | 1.01±0.09 | 0.31*
eyes-closed | Thalamus_R | 1.01±0.08 | 0.31*, 0.32*
eyes-closed | Temporal_Pole_Sup_R | 1.02±0.08 | 0.36**
eyes-closed | Cerebelum_3_R | 1.78±0.33 | 0.28*
eyes-closed | Cerebelum_10_R | 0.99±0.17 | -0.43**
eyes-closed | Vermis_4_5 | 1.21±0.14 | 0.33*, -0.41**
eyes-closed | Vermis_6 | 0.94±0.07 | -0.31*
eyes-closed | Vermis_7 | 0.81±0.07 | 0.29*
eyes-open | Frontal_Mid_Orb_L | 0.88±0.08 | 0.25*
eyes-open | Rolandic_Oper_L | 0.84±0.03 | 0.29*, -0.29*
eyes-open | Supp_Motor_Area_L | 1±0.07 | 0.32*, -0.25*
eyes-open | Olfactory_L | 0.97±0.08 | -0.28*
eyes-open | Frontal_Sup_Medial_L | 0.97±0.07 | -0.26*, -0.25*
eyes-open | Insula_L | 0.92±0.03 | 0.43**
eyes-open | Insula_R | 0.97±0.04 | 0.26*
eyes-open | Cingulum_Ant_L | 0.99±0.06 | -0.36**
eyes-open | Cingulum_Ant_R | 0.89±0.04 | 0.26*
eyes-open | Cingulum_Post_L | 1±0.05 | 0.27*, 0.29*, -0.41**
eyes-open | Cingulum_Post_R | 0.94±0.03 | 0.33**
eyes-open | Amygdala_L | 1.11±0.11 | -0.32*
eyes-open | Amygdala_R | 1.07±0.1 | -0.26*
eyes-open | Calcarine_R | 0.94±0.05 | 0.35**, -0.28*
eyes-open | Cuneus_L | 1.08±0.1 | -0.31*
eyes-open | Cuneus_R | 1.05±0.07 | 0.35**, -0.39**
eyes-open | Lingual_L | 1.01±0.06 | 0.41**
eyes-open | Parietal_Inf_L | 0.98±0.05 | -0.26*
eyes-open | Angular_R | 1.06±0.08 | -0.28*
eyes-open | Caudate_L | 0.96±0.09 | -0.30*
eyes-open | Putamen_L | 0.8±0.03 | -0.26*, 0.25*
eyes-open | Putamen_R | 0.8±0.03 | 0.26*
eyes-open | Pallidum_L | 0.84±0.04 | -0.30*
eyes-open | Thalamus_L | 1±0.08 | 0.27*
eyes-open | Thalamus_R | 1±0.07 | 0.28*, 0.26*
eyes-open | Cerebelum_Crus1_R | 0.97±0.1 | -0.30*
eyes-open | Cerebelum_10_R | 0.99±0.21 | 0.25*
eyes-open | Vermis_4_5 | 1.18±0.13 | -0.33**
difference | Frontal_Mid_Orb_L | 0.04±0.07 | 0.31*
difference | Frontal_Mid_Orb_R | 0.03±0.04 | 0.36**, -0.28*
difference | Frontal_Inf_Tri_R | 0.01±0.03 | 0.30*, -0.281*
difference | Frontal_Inf_Orb_L | 0.02±0.03 | -0.27*
difference | Rolandic_Oper_L | 0±0.02 | -0.26*
difference | Rolandic_Oper_R | -0.01±0.02 | -0.39**
difference | Supp_Motor_Area_L | -0.02±0.04 | 0.27*
difference | Supp_Motor_Area_R | -0.03±0.05 | 0.28*
difference | Olfactory_L | 0.01±0.03 | -0.30*
difference | Insula_R | 0.01±0.02 | -0.30*
difference | Cingulum_Ant_R | 0.01±0.03 | 0.27*
difference | Amygdala_R | 0.01±0.04 | -0.27*
difference | Occipital_Sup_R | 0±0.05 | -0.30*, 0.28*
difference | Occipital_Inf_R | 0.01±0.06 | -0.32*
difference | Fusiform_R | 0±0.02 | -0.31*
difference | Paracentral_Lobule_R | -0.07±0.09 | 0.32*
difference | Temporal_Pole_Sup_L | 0.01±0.05 | -0.27*
difference | Temporal_Pole_Sup_R | 0±0.05 | -0.28*
difference | Temporal_Pole_Mid_R | 0±0.03 | 0.30*
difference | Temporal_Inf_R | 0.01±0.02 | -0.28*
difference | Cerebelum_Crus1_R | -0.01±0.05 | 0.28*
difference | Cerebelum_Crus2_L | -0.01±0.09 | 0.30*
difference | Cerebelum_6_L | -0.02±0.05 | 0.28*
difference | Cerebelum_7b_R | 0.01±0.1 | -0.31*
difference | Cerebelum_8_R | 0.02±0.1 | -0.29*
difference | Vermis_9 | -0.01±0.09 | 0.36**

4 Discussion

4.1 The EMERT had good reliability and validity

In the current study, we used white, black and yellow models’ expressions as microexpressions and backgrounds to improve the ecological validity of the EMERT materials, and adopted two new recapitulative indexes, microexpression M and microexpression SD.

Almost all the microexpression recognition accuracies and all the microexpression Ms were significantly higher than chance, which means that the participants could effectively identify almost all the microexpressions. Only fear under surprise was not significantly higher than chance, which might be because the fear microexpression and the surprise backgrounds share similar facial muscle configurations. All the microexpression SDs were significantly higher than 0, which means that each microexpression type had a significant background effect [6, 8].

Each microexpression M in the first EMERT was significantly positively related to the corresponding one in the second EMERT, and the rs were high. Each microexpression SD except surprise SD in the first EMERT was significantly positively related to the corresponding one in the second EMERT. These results show that the EMERT had good retest reliability. In addition, each microexpression M in the first EMERT was significantly positively related to the corresponding microexpression in the first JACBART, which shows that the EMERT had good criterion validity [6, 8].

4.2 The relevant brain areas of microexpression M in EMERT

In the eyes-closed resting state, ALFFs in the frontal lobe, insula, cingulate cortex, hippocampus, caudate nucleus, thalamus and vermis were significantly correlated with some microexpression Ms. Of these, the insula, cingulate cortex, hippocampus and thalamus are common brain areas for expression recognition [21]; the frontal lobe, insula, cingulate cortex, hippocampus and thalamus might be responsible for microexpression consciousness and attention [22, 23]; and the caudate nucleus and vermis might be responsible for the change from expression backgrounds to microexpressions [10]. These functional attributions, of course, need further research to confirm, the same as below.

In the eyes-open resting state, ALFFs in the frontal lobe, insula, cingulate cortex, hippocampus, parietal lobe, caudate nucleus, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression Ms. Of these, the insula, cingulate cortex, hippocampus, thalamus and temporal lobe are common brain areas for expression recognition; the frontal lobe, parietal lobe, hippocampus, insula, cingulate cortex, thalamus and temporal lobe might be responsible for microexpression consciousness and attention; and the caudate nucleus, cerebellum and vermis might be responsible for the change from expression backgrounds to microexpressions [24]. It can be seen that microexpression M was significantly correlated with similar brain areas in both the eyes-closed and eyes-open resting states. Still, in the eyes-open resting state there were more relevant brain areas, such as the parietal lobe and temporal lobe.

In the difference of eyes-open minus eyes-closed resting states, ALFFs-differences in the frontal lobe, insula, amygdala, occipital lobe, fusiform, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression Ms. Of these, the insula, amygdala, occipital lobe, fusiform and temporal lobe are common expression recognition brain areas; the frontal lobe, insula and temporal lobe might be responsible for microexpression consciousness and attention; and the cerebellum and vermis might be responsible for the change from expression backgrounds to microexpressions. It can be seen that many of the relevant brain areas in the ALFFs-difference were the same as in the eyes-closed and eyes-open resting states. Still, in the ALFFs-difference there were new relevant brain areas, such as the amygdala, occipital lobe and fusiform.

It was found that ALFFs in both eyes-closed and eyes-open resting states and the ALFFs-difference could predict microexpression M of EMERT. Their predictability was similar, but there were also differences. According to the relevant brain areas and logic, there might be three cognitive processes in ecological microexpression recognition: expression recognition, microexpression consciousness and attention, and the change from expression background to microexpression. Whether and when each of them occurs, and whether other cognitive processes exist, needs to be explored in the future by developing new behavioural measurement methods to separate them and by task-state fMRI and ERP. Nakano et al. [15, 16] found that the transition from eyes-closed to eyes-open is a transition from internal feeling to external stimulus processing. However, no study has taken the ALFFs-difference as a quantitative index of sensitivity to this switch from internal feeling to external stimulus, or investigated its psychological significance. In the current study, we defined the ALFFs-difference as this quantitative sensitivity index and found that it could predict EMERT performance, indicating that it has psychological significance.
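The correlation analysis behind these findings can be sketched as follows. The data below are simulated and the variable names are illustrative; in the study, one ALFF value per AAL brain area per participant was extracted from the preprocessed resting-state scans:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 44  # hypothetical sample size

# Simulated per-participant ALFF values for one brain area
# in the eyes-closed and eyes-open resting states.
alff_closed = rng.normal(1.0, 0.1, n_subjects)
alff_open = alff_closed + rng.normal(0.02, 0.05, n_subjects)

# ALFFs-difference: eyes-open minus eyes-closed, taken as the
# quantitative sensitivity index of switching from internal
# feeling to external stimulus processing.
alff_diff = alff_open - alff_closed

# Simulated behavioral index (e.g., microexpression M for one type),
# constructed here to depend on the ALFFs-difference.
micro_M = 0.7 + 3.0 * alff_diff + rng.normal(0, 0.05, n_subjects)

# Pearson correlation between the ALFFs-difference and the behavioral
# index across participants; significance is then marked per area.
r, p = stats.pearsonr(alff_diff, micro_M)
print(f"r = {r:.2f}, p = {p:.4g}")
```

The same correlation is computed for the eyes-closed ALFFs, the eyes-open ALFFs, and the ALFFs-difference of every brain area, which is how the tables of relevant areas above were obtained.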

Shen [9] found that the brain area responsible for classical microexpression recognition was in the left frontal lobe, while the brain area responsible for expression recognition was in the right frontal lobe. In the current study, it was found that for EMERT, both the left and right frontal lobes and more brain areas were involved. Zhang [10] found that for anger and neutral microexpressions, activation of the inferior parietal lobule was induced more in negative expression backgrounds than in neutral expression backgrounds, while activation of the right precuneus was induced more in positive expression backgrounds than in neutral expression backgrounds; for happiness microexpressions, activation of the parahippocampal gyrus was induced more in positive backgrounds. The current study also found that these brain areas, except the right precuneus, were involved in EMERT, along with more brain areas. There might be three reasons for this difference: (1) The EMERT in the current study was more comprehensive and ecological, and there was more background effect. (2) The correlation analysis of resting-state data was adopted in the current study, whereas previous studies used comparative analyses of task states, either between microexpressions and expressions or among different microexpressions; brain areas common to microexpressions and expressions, or to different microexpressions, might therefore have been missed statistically. (3) We detected the relevant brain areas of a new recapitulative index of EMERT, microexpression M, rather than of finer-grained conditions. Zhang et al. [6] established the EMERT but did not investigate the relevant brain areas; in the current study, the relevant brain areas of the EMERT were comprehensively investigated. Of course, further research is needed to determine which functions these brain areas are responsible for.

4.3 The relevant brain areas of microexpression SD in EMERT

In the eyes-closed resting state, ALFFs in the frontal lobe, insula, cingulate cortex, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs. In the eyes-open resting state, ALFFs in the frontal lobe, insula, cingulate cortex, cuneus, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs. It can be seen that microexpression SD was significantly associated with similar brain areas in both the eyes-closed and eyes-open resting states.

In the ALFFs-difference of eyes-open minus eyes-closed resting states, ALFFs-differences in the frontal lobe, insula, cingulate cortex, amygdala, fusiform, occipital lobe, parietal lobe, temporal lobe, cerebellum and vermis were significantly correlated with some microexpression SDs. It can be seen that many of the relevant brain areas in the ALFFs-difference were the same as in the eyes-closed and eyes-open resting states. Still, in the ALFFs-difference there were new relevant brain areas, such as the amygdala and fusiform.

It was found that ALFFs in both eyes-closed and eyes-open resting states and the ALFFs-difference could predict microexpression SD. Their predictability was similar, but there were also differences. For the EMERT, Zhang et al. [6] and Yin, Tian, Hua, Zhang, & Liu [8] defined the microexpression SD as the fluctuation of ecological microexpression recognition, in order to quantify the background effect, but they did not investigate the relevant brain areas. The current study comprehensively investigated the brain areas relevant to this quantification of the background effect. Of course, further research is needed to determine which functions these brain areas are responsible for.

4.4 The similarities and differences of the relevant brain areas of microexpression M and SD

The microexpression M is the recognition index of a microexpression type, and the microexpression SD is the background effect index of that type's recognition [6, 8]. The former is a kind of ability, while the latter is the degree to which this ability changes across contexts, which can in turn be regarded as the stability of this ability. Therefore, there should be both similarities and differences in the brain mechanisms between them.

In the eyes-closed resting state, ALFFs in the frontal lobe, insula, cingulate cortex, caudate nucleus, thalamus and vermis were significantly correlated with both some microexpression Ms and some microexpression SDs, which indicates that both indexes involve emotional perception and feeling. However, ALFFs in the hippocampus were significantly correlated only with some microexpression Ms, which indicates that the microexpression recognition ability relies more on memory; and ALFFs in the occipital lobe, parietal lobe, precuneus, putamen, temporal lobe and cerebellum were significantly correlated only with some microexpression SDs, which indicates that the stability of this ability relies more on cognitive control, consciousness and motion.

In the eyes-open resting state, ALFFs in the frontal lobe, insula, cingulate cortex, parietal lobe, caudate nucleus, thalamus, temporal lobe, cerebellum and vermis were significantly correlated with both some microexpression Ms and some microexpression SDs, which indicates that both indexes involve emotional perception and feeling. However, ALFFs in the hippocampus were significantly correlated only with some microexpression Ms, which indicates that the microexpression recognition ability relies more on memory; and ALFFs in the cuneus, occipital lobe, precuneus and putamen were significantly correlated only with some microexpression SDs, which indicates that the stability of this ability relies more on vision, consciousness and motion.

In the difference of eyes-open minus eyes-closed resting states, ALFFs-differences in the frontal lobe, insula, amygdala, occipital lobe, fusiform, temporal lobe, cerebellum and vermis were significantly correlated with both some microexpression Ms and some microexpression SDs, which indicates that both indexes involve emotional perception and feeling. However, the ALFFs-differences in the cingulate cortex and parietal lobe were significantly correlated only with some microexpression SDs, which indicates that the stability of the microexpression recognition ability relies more on cognitive control.

Taken together, both microexpression M and microexpression SD involve emotional perception and feeling, but the former relies more on memory, and the latter more on cognitive control and consciousness. This is logical: an ability requires its related brain areas and memory, whereas the stability of that ability additionally requires cognitive control and consciousness. All these relevant brain areas can be trained to enhance ecological microexpression recognition ability.

5 Conclusion

The current study used white, black and yellow models' expressions as microexpressions and backgrounds to improve the ecological validity of the EMERT materials, and used eyes-closed and eyes-open resting-state fMRI to detect the relevant resting-state brain activity of the EMERT. The results showed:

    Two new recapitulative indexes of EMERT were adopted: microexpression M and microexpression SD. The participants could effectively identify almost all the microexpressions, and each microexpression type had a significant background effect. The EMERT had good retest reliability and criterion validity.

    ALFFs in both eyes-closed and eyes-open resting-states and ALFFs-difference could predict microexpression M. The relevant brain areas of microexpression M were some frontal lobes, insula, cingulate cortex, hippocampus, parietal lobe, caudate nucleus, thalamus, amygdala, occipital lobe, fusiform, temporal lobe, cerebellum and vermis.

    ALFFs in both eyes-closed and eyes-open resting-states and ALFFs-difference could predict microexpression SD, and the ALFFs-difference was more predictive. The relevant brain areas of microexpression SD were some frontal lobes, insula, cingulate cortex, cuneus, amygdala, fusiform, occipital lobe, parietal lobe, precuneus, caudate nucleus, putamen, thalamus, temporal lobe, cerebellum and vermis.

    There were many similarities and some differences in the relevant brain areas between microexpression M and SD. All these brain areas can be trained to enhance ecological microexpression recognition ability.

References

1. Ekman, P., & Friesen, W. V. (1975). Unmasking the face: A guide to recognizing the emotions from facial cues. Englewood Cliffs, NJ: Prentice Hall.

2. Porter, S., ten Brinke, L., & Wallace, B. (2012). Secrets and lies: Involuntary leakage in deceptive facial expressions as a function of emotional intensity. Journal of Nonverbal Behavior, 36(1), 23–37.

3. Matsumoto, D., LeRoux, J., Wilson-Cohn, C., Raroque, J., Kooken, K., Ekman, P., et al. (2000). A new test to measure emotion recognition ability: Matsumoto and Ekman's Japanese and Caucasian Brief Affect Recognition Test (JACBART). Journal of Nonverbal Behavior, 24(3), 179–209.

4. Zhang, M., Fu, Q., Chen, Y. H., & Fu, X. (2014). Emotional context influences micro-expression recognition. PLoS ONE, 9(4), e95018. doi:10.1371/journal.pone.0095018

5. Yin, M., Zhang, J. X., Shi, A. Q., & Liu, D. Z. (2016). Characteristics, recognition, training of microexpressions and their influence factors. Advances in Psychological Science, 24(11), 1723–1736.

6. Zhang, J. X., Lu, L., Yin, M., Zhu, C. L., Huang, C. L., & Liu, D. Z. (2017). The establishment of ecological microexpression recognition test (EMERT): An improvement on JACBART microexpression recognition test. Acta Psychologica Sinica, 49(7), 886–896.

7. Zhu, C. L., Chen, X. Y., Zhang, J. X., Liu, Z. Y., Tang, Z., Xu, Y. T., et al. (2017). Comparison of ecological micro-expression recognition in patients with depression and healthy individuals. Frontiers in Behavioral Neuroscience, 11. doi:10.3389/fnbeh.2017.00199

8. Yin, M., Tian, L. C., Hua, W., Zhang, J. X., & Liu, D. Z. (2019). The establishment of weak ecological microexpressions recognition test (WEMERT): An extension on EMERT. Frontiers in Psychology, 10. doi:10.3389/fpsyg.2019.00275

9. Shen, X. B. (2012). The temporal characteristics and mechanisms of microexpression recognizing.

10. Zhang, M. (2014). The effect of emotional context on micro-expression recognition and its mechanism.

11. van den Heuvel, M. P., & Hulshoff Pol, H. E. (2010). Exploring the brain network: A review on resting-state fMRI functional connectivity. European Neuropsychopharmacology, 20(8), 519–534. doi:10.1016/j.euroneuro.2010.03.008

12. Jiang, Q., Hou, L. L., Qiu, J., Li, C. R., & Wang, H. Z. (2018). The relationship between the caudate nucleus-orbitomedial prefrontal cortex connectivity and reactive aggression: A resting-state fMRI study. Acta Psychologica Sinica, 50(6), 655–666.

13. Li, W. F., Tong, D. D., Qiu, J., & Zhang, Q. L. (2016). The neural basis of scientific innovation problems solving. Acta Psychologica Sinica, 48(4), 331–342.

14. Liu, J., Liao, X., Xia, M., & He, Y. (2017). Chronnectome fingerprinting: Identifying individuals and predicting higher cognitive functions using dynamic brain connectivity patterns. Human Brain Mapping, 39(2). doi:10.1002/hbm.23890

15. Nakano, T., Kato, M., Morito, Y., Itoi, S., & Kitazawa, S. (2012). Blink-related momentary activation of the default mode network while viewing videos. Proceedings of the National Academy of Sciences of the United States of America, 110(2), 702–706. doi:10.1073/pnas.1214804110

16. Nakano, T. (2015). Blink-related dynamic switching between internal and external orienting networks while viewing videos. Neuroscience Research, 96, 54–58. doi:10.1016/j.neures.2015.02.010

17. Tottenham, N., Tanaka, J. W., Leon, A. C., McCarry, T., Nurse, M., Hare, T. A., et al. (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 168(3), 242–249. doi:10.1016/j.psychres.2008.05.006

18. Yan, C. G., Wang, X. D., Zuo, X. N., & Zang, Y. F. (2016). DPABI: Data Processing & Analysis for (Resting-State) Brain Imaging. Neuroinformatics, 14, 339–351. doi:10.1007/s12021-016-9299-4

19. Song, X. W., Dong, Z. Y., Long, X. Y., Li, S. F., Zuo, X. N., Zhu, C. Z., et al. (2011). REST: A toolkit for resting-state functional magnetic resonance imaging data processing. PLoS ONE, 6(9), e25031. doi:10.1371/journal.pone.0025031

20. Xia, M., Wang, J., & He, Y. (2013). BrainNet Viewer: A network visualization tool for human brain connectomics. PLoS ONE, 8, e68910. doi:10.1371/journal.pone.0068910

21. Hu, X. Q., Fu, G. Y., & Shi, Z. Y. (2009). Review and prospect of mirror neuron system. Advances in Psychological Science, 17(1), 118–125.

22. Dehaene, S., Changeux, J. P., & Naccache, L. (2011). The global neuronal workspace model of conscious access: From neuronal architectures to clinical applications. In Characterizing consciousness: From cognition to the clinic? (pp. 55–84). Springer Berlin Heidelberg.

23. Huang, P., Li, Y. L., Zhang, J. X., Wang, X. P., Huang, C. L., Chen, A. T., & Liu, D. Z. (2017). fMRI investigation on gradual change of awareness states in implicit sequence learning. Scientific Reports, 7(1), 1–8. doi:10.1038/s41598-016-0028-x

24. Penhune, V. B., & Steele, C. J. (2012). Parallel contributions of cerebellar, striatal and M1 mechanisms to motor sequence learning. Behavioural Brain Research, 226(2), 579–591. doi:10.1016/j.bbr.2011.09.044