The Effects of Synchronization With Either Joyful or Angry People on Perception of an Emotionally Neutral Person
Mikołaj Biesaga‡, Paweł Motyka‡, Andrzej Nowak§
‡ University of Warsaw, Warsaw, Poland
§ University of Social Sciences and Humanities in Warsaw, Warsaw, Poland

Abstract

Synchronization has been shown to play an important role in social life through its effects on interactions between people and the quality of these interactions. However, little is known about how observing synchronization affects perceptions of the synchronized individuals. This paper examines how observed synchronization influences perceptions of a neutral person depending on the emotional valence of the faces with which they are synchronized. Two different forms of synchronization were used in these studies: synchronous flashing of faces and faces moving in a common direction. We hypothesized that observed synchronization biases the perception of emotions expressed by a neutral person and an observer’s attitude towards this person. These effects are expected to be congruent with the valence of the synchronizing faces. The results showed a divergent pattern of effects for different forms of synchronization. In Study 1, synchronous flashing biased only the perceived emotions. In Study 2, synchrony of movement affected participants’ attitudes towards the observed person. Our findings suggest that the form of observed synchrony is an important factor in drawing inferences about individuals.

Keywords

synchronization, emotion recognition, attitude, social cognition

Synchronization is ubiquitous in social life as it occurs in almost every interaction. Individuals spontaneously and often unconsciously synchronize in simple activities such as walking together (van Ulzen, Lamoth, Daffertshofer, Semin, & Beek, 2008; Zivotofsky & Hausdorff, 2007), swinging hand-held pendulums (Richardson, Marsh, & Schmidt, 2005), rocking in rocking chairs (Richardson, Marsh, Isenhower, Goodman, & Schmidt, 2007), or swinging their legs (Schmidt, Carello, & Turvey, 1990). Although people appear to fall into such imitation unwittingly and effortlessly, its effects on the interaction are beneficial for both parties. Movement synchrony increases perceptual sensitivity and facilitates success in joint actions by fostering social cohesion (Valdesolo, Ouyang, & DeSteno, 2010). Synchronization is also present at higher levels of interaction. When people are engaged in a discussion, they automatically converge upon a dialect (Giles, 1973), speaking rate (Street, 1984), speaking speed (Giles, Coupland, Coupland, Williams, & Nussbaum, 1992; Sacks, Schegloff, & Jefferson, 1974), vocal intensity (Natale, 1975), pausing frequency (Cappella & Planalp, 1981), and speech rhythm (Condon, 1976; Condon & Ogston, 1971; Newtson, 1994). They also mimic each other’s facial expressions (Kulesza, Dolinski, Wicher, & Huisman, 2016; McHugo, Lanzetta, Sullivan, Masters, & Englis, 1985; Riehle, Kempkensteffen, & Lincoln, 2017). Such nonverbal synchronization cues facilitate the unfolding of the interaction. Being synchronized has been shown to lead to more positive evaluations of the interaction partner (Kulesza & Nowak, 2007; Miles, Nind, & Macrae, 2009). Furthermore, it increases the chances of establishing rapport – defined as an affective state of mutual attention and positivity – in daily life situations (Bernieri, 1988; Isabella & Belsky, 1991; Tickle-Degnen & Rosenthal, 1990).

While most studies in this field focus on effects occurring within the synchronizing partners, behavioral synchronization can also be observed by third parties and used to form judgments about the interacting people. In real-life situations, we tend to spontaneously draw inferences about relations between people from the observed levels of behavioral coordination between them. Although the presence of such effects may seem obvious, evidence for them has been obtained only relatively recently. In a study conducted by Lakens and Stel (2011), participants had to infer the degree to which observed individuals constituted a social unit and shared feelings of rapport. Beforehand, they were shown video clips of two confederates rhythmically waving their arms either in synchrony or in asynchrony. The results showed that attributed rapport and perceived unity of the group were rated higher in the synchrony condition. This finding is in line with formerly reported effects of movement synchrony on judged rapport and unity of nonhuman objects – i.e., stick figures (Lakens, 2010; Miles et al., 2009). Taken together, these findings suggest that observation of coordinated behavioral patterns serves as a cue in judgments concerning the degree of attachment among group members. Yet, the question remains whether observed synchronization also affects how an observer perceives a given individual, depending on the characteristics of the partners with whom that individual is synchronized.

This question was partially addressed with respect to dyadic interaction in a study by Kavanagh, Suhler, Churchland, and Winkielman (2011), in which participants observed an interview where the interviewee either mimicked or did not mimic the mannerisms of a cordial or unfriendly interviewer. The results revealed that the friendliness of the interviewer led to the interviewee being judged as more competent only in the mimicking condition. This finding suggests that the evaluation of individuals may depend not only on whom they are interacting with, but more importantly on how they are doing it. It refines the idea of contextual influences on a person’s evaluation by showing that it is not merely the spatial proximity between partners that plays a role but rather the observed level of behavioral synchronization between them.

In our work, we address the question of the role of observed synchrony in drawing inferences about an individual in the context of a group. To this end, we presented a neutral person either synchronized or not synchronized with a group of surrounding people. Given that in real-life situations first-glance evaluation of others takes place along the emotional valence dimension, we narrowed our research to how the emotions of the synchronizing people influence the perception of a neutral person.

The valence dimension of affect is a salient property of emotions that can be easily signaled and received via nonverbal cues. Emotional expressions are thought to have evolved as a pre-linguistic form of communication that allows group members to track the intentions and attitudes of others. The basic facial expressions of emotion (anger, happiness, fear, surprise, disgust, and sadness) are universal, meaning that they are easily recognized across different cultures (Ekman, 1992; Ekman & Friesen, 1971). Emotional facial expressions are also processed very rapidly by the brain (Batty & Taylor, 2003) and effectively recognized even when presented in the peripheral field of vision (Goren & Wilson, 2006), which supports the thesis that their analysis is automatic. The ability to decode other people’s emotions might have played an important role in survival, since it provides information about the possible outcomes of an interaction and facilitates approach-avoidance behavior (Strack & Deutsch, 2004). In the current studies, we aim to investigate how observed synchronization affects inferences about emotionally neutral people depending on the valence of the faces with which they were synchronized.

In two studies, we explore how different modes of synchronization influence emotional responses towards an emotionally neutral person. In the first experiment, synchronization was operationalized as synchronous flashing of the observed faces. In the second study, the faces synchronized through the emergence of movement in a common direction. Depending on the experimental condition, the neutral face was synchronized with either joyful or angry faces. In both studies, we assumed that the consequences of such observations could be principally twofold: first, an emotionally neutral face may be perceived as expressing positive or negative emotions more strongly, and second, the observer’s attitude towards this person may be biased, as reflected in higher or lower declared willingness to interact with them. The first hypothesized effect refers to the phenomenon that the perception of faces can be situationally biased by contextual information, including visual, verbal, and auditory cues (Anderson, Siegel, Bliss-Moreau, & Barrett, 2011; Wieser & Brosch, 2012; Wieser et al., 2014). The second hypothesized effect goes beyond momentary perception and is intended to demonstrate that synchronization with other emotional faces may also influence the formation of more generalized attitudes towards the person, which depend less on situational cues and develop on a larger time scale. In both studies, we hypothesized that observed synchronization would bias the perception of emotions expressed by a neutral face, as well as the declared attitude towards them, in the direction congruent with the emotional valence of the faces with which they were synchronized.

Study 1

In this study we operationalized synchronization as the synchronous flashing of faces. Participants watched a presentation with a neutral face in a red frame surrounded by faces expressing emotions of either anger or joy (surrounding conditions), which were presented either synchronously flashing or statically exposed (exposure conditions). Afterwards, participants were asked to report the perceived emotional expressions of the observed neutral person and choose whether they would like to further interact with them.

We expected that synchronization would cause a shift in the ratings of the neutral target face in a direction congruent with the emotions expressed by surrounding faces. Furthermore, we predicted that synchronization would also affect the attitude expressed towards the target neutral face. We hypothesized that synchronization would bias willingness for further interaction, i.e., participants would indicate a desire to interact with the neutral target more frequently in the synchronization condition than in the static condition when surrounded by faces expressing joy, while they would indicate such a desire less frequently in the synchronization condition than in the static condition when surrounded by faces expressing anger.

Procedure and Design

The experiment followed a 2 × 2 × 2 factorial design with three between-subjects variables, each with two levels: valence of the surrounding faces, exposure condition, and gender of the assessed neutral target face. Figure 1 shows the scheme of the procedure. Each participant was exposed to only one presentation. All stimuli were presented using Keynote for Mac on a 13.3-inch monitor with a 1200 × 800 pixel resolution and a 60 Hz refresh rate. All analyses were performed using R (R Core Team, 2018). The packages “dplyr” (version 0.7.4), “magrittr” (version 1.5), and “ggplot2” (version 2.2.1) were used for data manipulation, processing, and visualization (Bache & Wickham, 2014; Wickham, 2009; Wickham, Francois, Henry, & Müller, 2017). The pictures used in the experiment originated from the AR Face Database (Martinez & Benavente, 1998).
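To make the design and analysis environment concrete, the following is a minimal R sketch (matching the packages listed above) of how the Study 1 data could be organized and summarized. All variable names and values here are hypothetical placeholders, not taken from the published analysis scripts.

```r
# Hypothetical layout of the Study 1 data: one row per participant, with the three
# two-level between-subjects factors and the positive-negative rating of the target face.
library(dplyr)    # data manipulation
library(ggplot2)  # visualization

set.seed(1)  # placeholder data, for illustration only
study1 <- tibble(
  exposure    = factor(rep(c("synchronization", "static"), each = 80)),
  surrounding = factor(rep(c("joy", "anger"), times = 80)),
  target_sex  = factor(sample(c("female", "male"), 160, replace = TRUE)),
  rating      = sample(1:7, 160, replace = TRUE)  # 7-point Likert response
)

# Cell means of the 2 x 2 x 2 design
study1 %>%
  group_by(exposure, surrounding, target_sex) %>%
  summarise(mean_rating = mean(rating)) %>%
  ungroup()

# Quick visual check of the exposure x surrounding pattern
ggplot(study1, aes(x = surrounding, y = rating, colour = exposure)) +
  geom_boxplot()
```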

Figure 1.

Flow chart of the study design. The top left panel shows the initial arrangement of faces, with the target neutral face located in the middle. In each exposure condition (synchronization or static), subjects were exposed to one of four prepared presentations (differing in the gender of the target face and the valence of the surrounding faces). Depending on the surrounding condition, the target face was surrounded by four faces expressing either joy or anger. Participants were instructed to visually track the framed face throughout the entire presentation. In the synchronization condition, the faces flashed simultaneously at 2 Hz for ten seconds (top middle panel), while in the static condition the faces were presented statically for ten seconds (top right panel). After seeing the presentation, participants responded on a 7-point Likert scale to six questions concerning the characteristics of the observed face (bottom panel). The first two questions were about the general emotional valence of the face: expressing positive/negative emotions and eliciting positive/negative emotions in the observer. The other four questions addressed the expression of more specific emotions: joy, fear, trust, and anger. Afterwards, participants were shown three simultaneously displayed pictures of people – two neutral faces and the target face observed during the presentation – and were asked to choose with whom they would like to have a future interaction. The presentation was originally displayed in color.

Participants

The participants were students from the Psychology Department at the University of Warsaw. A total of 160 subjects (113 females and 47 males), aged from 18 to 32 (M = 21.62, SD = 2.06), were randomly assigned to one of the presentations. The procedure was approved by the ethics committee of the Robert Zajonc Institute for Social Studies at the University of Warsaw. All participants gave informed consent before taking part in the study.

Results

Ratings of Perception of Emotions Expressed by the Neutral Face

To test the hypothesis that synchronization would cause a shift in the assessment of the neutral target face in a direction congruent with the emotions expressed by surrounding faces, we computed an indicator denoting differences in the average ratings of emotions expressed by the neutral face between surrounding conditions. The obtained values indicate the disparity in ratings between surrounding conditions for each mode of exposure (static and synchronization), which allowed us to test for a synchronization effect. The greater the indicator, the more the rating of the neutral face shifted in a direction congruent with the emotions expressed by the surrounding faces. Therefore, we expected a significant difference between exposure conditions: for ratings of perceived emotions (on the positive-negative scale), the indicator should be greater in the synchronization condition than in the static condition.
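The exact formula for this indicator is defined in the analysis scripts available in the supplementary materials rather than in the text; the snippet below therefore shows only one plausible construction, under our own assumption that the indicator is a valence-signed deviation of each rating from the scale midpoint, compared between exposure conditions with a Wilcoxon rank-sum test (variable names continue the hypothetical layout sketched above).

```r
# Assumed construction of the congruence indicator: deviation of the rating from the
# scale midpoint (4), signed so that positive values mean a shift towards the valence
# of the surrounding faces (joy -> more positive, anger -> more negative).
study1 <- study1 %>%
  mutate(congruence = ifelse(surrounding == "joy", rating - 4, 4 - rating))

# Compare the indicator between exposure conditions; R reports the
# Mann-Whitney-Wilcoxon rank sum statistic as W.
wilcox.test(congruence ~ exposure, data = study1)
```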

The Mann-Whitney-Wilcoxon rank sum test revealed a significant difference between the synchronization and static conditions, W = 517, p < .01. The difference in average ratings of emotions expressed by neutral faces between different surrounding conditions was significantly higher in the synchronization condition (M = .23, SD = 1.23) than in the static condition (M = -.55, SD = 1.6; see Figure 2).

Figure 2.

The differences in average ratings of emotions expressed by neutral faces between surrounding conditions (faces expressing joy and faces expressing anger) as a function of exposure condition (synchronization and static exposure). In the synchronization condition, perceived emotions expressed by neutral faces were more congruent with the valence of surrounding faces than in the static condition.

Note. W = 517, p = .006, n = 80

This suggests that synchronization causes ratings of the neutral target face to shift in a direction congruent with the emotions expressed by the surrounding faces. However, to further examine the origins of this shift, we performed a three-way analysis of variance (ANOVA) with three between-subjects factors (gender of the target face, valence of the surrounding faces, and exposure condition). The ANOVA showed only the expected interaction between the valence of the surrounding faces and the exposure condition to be significant, F(1, 152) = 6.24, p < .05, ηp² = .04. The post hoc pairwise analysis, using the Tukey method to adjust p values for comparing a family of four estimates, revealed a significant difference between exposure conditions only in the case of negative surrounding emotions, t(152) = 3.07, p = .0131, d = 0.71 (see Figure 3). A neutral face surrounded by faces expressing negative emotions was rated significantly more positively when exposed in the static condition (M = 4.2, SD = 1.04) than in the synchronization condition (M = 3.53, SD = .72).
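For reference, the sketch below shows how such a three-way ANOVA with Tukey-adjusted follow-up comparisons could be run on the hypothetical data frame introduced earlier. The paper does not state which packages were used for the effect sizes or post hoc tests; effectsize and emmeans here are our assumptions.

```r
# 2 x 2 x 2 between-subjects ANOVA on the expressed-emotion ratings
fit <- aov(rating ~ exposure * surrounding * target_sex, data = study1)
summary(fit)

# Partial eta squared for each effect (effectsize is an assumed helper package)
effectsize::eta_squared(fit, partial = TRUE)

# Tukey-adjusted pairwise comparisons among the four exposure x surrounding cell means
library(emmeans)
emm <- emmeans(fit, ~ exposure * surrounding)
pairs(emm, adjust = "tukey")
```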

Figure 3.

Interaction between exposure conditions (synchronization and static exposure) and emotional valence of surrounding faces expressing joy or anger in determining the perceived emotions expressed by the neutral face. Error bars represent 95% confidence intervals.

Note. F(1, 152) = 6.24, p = .0135, ηp² = .04, n = 160

Additionally, we performed exploratory analyses of ratings of elicited emotions and expression of specific emotions (joy, trust, anger, and fear). We found differences only in the ratings of elicited emotions, for which we performed a similar analysis to the one used for the expressed emotions ratings.

The Mann-Whitney-Wilcoxon rank sum test revealed a significant difference between the synchronization and static conditions, W = 552, p < .05. The difference in average ratings of emotions elicited by neutral faces between surrounding conditions was significantly higher in the synchronization condition (M = .35, SD = 1.49) than in the static condition (M = -.4, SD = 1.65). This implies that synchronization causes the emotions elicited by the target neutral face to shift in a direction congruent with the emotions expressed by the surrounding faces (see Figure 4).

Figure 4.

The differences in average ratings of emotions elicited by neutral faces between surrounding conditions (faces expressing joy and faces expressing anger) as a function of exposure condition (synchronization and static exposure). In the synchronization condition, emotions elicited by neutral faces were more congruent with the valence of the surrounding faces than in the static condition.

Note. W = 552, p = .017, n = 80

However, to further examine the origins of this shift, we performed a three-way analysis of variance (ANOVA) with three between-subjects factors (gender of the target face, valence of the surrounding faces, and exposure condition). The three-way ANOVA yielded a significant effect of exposure, F(1, 152) = 6.84, p < .01, ηp² = .04. In the static exposure condition (M = 4.5, SD = 1.16), participants reported more positive emotions than in the synchronization condition (M = 4.04, SD = 1.03). Furthermore, the interaction between the valence of the surrounding faces and the exposure condition on elicited emotion ratings for the target neutral face was significant, F(1, 152) = 4.75, p < .05, ηp² = .03. The post hoc pairwise analysis, using the Tukey method to adjust p values for comparing a family of four estimates, revealed a significant difference between the exposure conditions only in the case of negative surrounding emotions, t(152) = 3.39, p < .001, d = .8 (see Figure 5). The neutral face surrounded by faces expressing negative emotions was rated as eliciting more positive emotions in the static exposure condition (M = 4.7, SD = 1.11) than in the synchronization condition (M = 3.88, SD = .76).

Figure 5.

Interaction between exposure conditions (synchronous flashing and static exposure) and emotional valence of surrounding faces expressing joy or anger in determining the emotions elicited by the neutral face. Error bars represent 95% confidence intervals.

Note. F(1, 152) = 4.75, p = .03, ηp² = .03, n = 160

Declared Willingness to Interact With the Neutral Person

A chi-square test was performed to determine whether synchronization biased willingness for further interaction with the neutral person. First, we expected that the neutral target face would be chosen more frequently in the synchronization condition than in the static condition when surrounded by faces expressing joy. Although the target neutral face was chosen 18 times in the synchronization condition and only 9 times in the static condition, the one-sample chi-square test did not reveal a significant difference, χ2(1) = 2.37, p = .12. Second, we hypothesized that the neutral target face would be chosen less frequently in the synchronization condition than in the static condition when surrounded by faces expressing anger. Again, even though the direction was consistent with the hypothesis (14 choices in the static condition, 12 in the synchronization condition), the one-sample chi-square test did not yield a significant difference, χ2(1) = .04, p = .84.
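As the text does not spell out the expected frequencies used in these one-sample (goodness-of-fit) chi-square tests, the sketch below illustrates the general form of such a test under the assumption that the target face is one of three equally likely choices; the counts and cell size are illustrative only, and the exact contrasts are in the published analysis scripts.

```r
# One-sample (goodness-of-fit) chi-square test on choice counts, assuming the target
# face is one of three pictures and would be chosen 1/3 of the time by chance.
choice_test <- function(n_target, n_total) {
  chisq.test(x = c(n_target, n_total - n_target), p = c(1/3, 2/3))
}

# e.g., target chosen 18 times in a hypothetical cell of 40 participants
choice_test(18, 40)
```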

Taken together, the results of Study 1 supported the hypothesis that the perception of emotions expressed by the neutral face is biased by synchronization with the surrounding faces. The judgments were indeed shifted in the direction congruent with their emotional valence, even though this effect seems to stem from attunement to negative rather than positive emotions. On the other hand, no support for the hypothesis of biased willingness to interact with the neutral face was found. Participants chose faces that had been synchronized with the surrounding faces as often as faces that had not been seen before.

Study 2

In this study we operationalized synchronization as the emergence of movement in a common direction. As in Study 1, we asked participants to visually track a red-framed neutral target face throughout the experiment. In two between-subjects conditions, after a period of independent movement of all faces, the target face began to move in the same direction as a specific group of emotional faces. In the first condition, the target face was temporarily synchronized with joyful faces, while in the second it was temporarily synchronized with faces expressing anger. We expected that the neutral target face would be rated as more positive after being synchronized with joyful faces and as more negative after being synchronized with angry faces. Furthermore, we hypothesized that participants would more frequently indicate a desire to interact with the target face after it synchronized with positive faces than with negative ones.

Procedure and Design

In contrast to Study 1, different types of emotional faces were embedded into a single presentation. In Study 1, the target face was surrounded by faces expressing either joy or anger, while in the current study the target face was simultaneously surrounded by positive (joyful), negative (angry), and emotionally neutral faces. The study followed a between-group design with one factor: the emotional valence of faces synchronizing with the target face. After a period of independent movement of all faces, the neutral target face began to move at the same angle together with either joyful or angry faces while the others continued moving chaotically. The study design is illustrated in Figure 6. The animations were presented using Microsoft PowerPoint 2007 on a 15.4-inch monitor with a 1200 × 800 pixel resolution and 60 Hz refresh rate.

Figure 6.

Flow chart of the study design. The top left panel shows the initial arrangement of faces in both conditions, with the target neutral face located in the middle. The surrounding faces included four joyful, four angry, and four emotionally neutral faces. The initial distances between the target face and the groups of joyful and angry faces were balanced by a symmetrical arrangement relative to the target face. Participants were instructed to visually track the framed face throughout the entire presentation. The presentation started with a twelve-second period of independent movement of all faces (top middle panel). Afterwards, the target face began to move across the screen at the same angle together with a group of either joyful or angry faces, depending on the experimental condition (top right panel). Meanwhile, the other (non-synchronizing) faces continued moving chaotically. The synchronization phase lasted for three seconds, until the target face reached the border of the screen (in the present example, one of the joyful faces has already moved beyond the frame of the screen). This procedure was repeated eight times. The angular direction of movement during the synchronization phases varied from cycle to cycle, following the sequence 315°, 135°, 45°, and 225° (with 0° corresponding to the true vertical), repeated twice. After the presentation was complete (it lasted 120 seconds), participants responded to the same questions as in Study 1 (bottom panel). The presentation was originally displayed in color.
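As a rough aid to reconstructing the animation, the snippet below converts the reported movement angles into per-frame screen displacements. The step size and the clockwise angle convention are our assumptions; only the angle sequence itself comes from the description above.

```r
# The eight synchronization cycles used the angle sequence 315, 135, 45, 225 (twice),
# with 0 degrees taken as "true vertical" (straight up) and angles read clockwise.
angles <- rep(c(315, 135, 45, 225), times = 2)
step   <- 5  # hypothetical displacement (e.g., pixels) per animation frame

rad <- angles * pi / 180
data.frame(
  cycle = seq_along(angles),
  angle = angles,
  dx    = round(step * sin(rad), 2),  # horizontal component
  dy    = round(step * cos(rad), 2)   # vertical component (positive = up)
)
```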

Participants

The participants were 72 undergraduates from different departments of the University of Warsaw (48 female, 24 male; ages 18 to 34; M = 21.9 years, SD = 3.06 years). They were randomly assigned to one of two experimental conditions, with 36 participants in each. The procedure was approved by the ethics committee of the Robert Zajonc Institute for Social Studies at the University of Warsaw. All participants gave informed consent before taking part in the study.

Results

Ratings of Perception of Emotions Expressed by the Neutral Face

As the data did not fit a normal distribution, we used the Mann-Whitney test to examine the hypothesis that the target neutral face would be rated as expressing more positive emotions after synchronizing with joyful faces than after synchronizing with angry faces. The results indicated that there were no significant differences in ratings of the emotions of the neutral face between these two conditions, U = 580, p = .399. In contrast to Study 1, where a short episode of synchronization via synchronous flashing influenced ratings of the emotions of the neutral target face, repeated phases of synchronization implemented as movement in a common direction did not bias the perception of emotions expressed by the neutral face. Additionally, we conducted exploratory analyses of ratings of elicited emotions and the expression of specific emotions (joy, trust, anger, and fear). We found no significant differences for these measures between the experimental conditions.

Declared Willingness to Interact With the Neutral Person

A chi-square test was performed to examine the relationship between the emotional valence of the synchronizing faces and the declared willingness for further interaction with the neutral person. We expected that the observed neutral person would be more likely to be chosen for a hypothetical interaction after synchronizing with joyful faces than with angry ones. The relationship between the emotional valence of the synchronizing faces and the declared willingness for interaction with the neutral person was significant, χ2(1) = 4.59, p = .032. When the target face was temporarily synchronized with faces expressing joy, it was chosen in 69% of cases, while one of the other two neutral faces was chosen in 31% of cases (see Figure 7). In the condition involving synchronization with faces expressing anger, the neutral person was chosen for a hypothetical future interaction in 44% of cases, while the other faces were chosen in 56% of cases. This suggests that observed periods of synchronization via movement in a common direction with emotional faces may bias attitudes towards an emotionally neutral person.
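The reported statistic is consistent with a 2 × 2 chi-square test of independence (synchronization valence × choice) computed without continuity correction; the cell counts below are back-calculated from the reported percentages with 36 participants per condition and should be treated as approximate.

```r
# Choice of the target face vs. one of the two other neutral faces, by condition.
# Counts reconstructed from the reported 69% / 44% with n = 36 per condition.
choices <- matrix(c(25, 11,   # synchronized with joyful faces: target, other
                    16, 20),  # synchronized with angry faces:  target, other
                  nrow = 2, byrow = TRUE,
                  dimnames = list(valence = c("joy", "anger"),
                                  choice  = c("target", "other")))

chisq.test(choices, correct = FALSE)  # approx. chi-square(1) = 4.59
```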

Figure 7.

The declared willingness to interact with the neutral person as a function of the emotional valence of the faces with which it was synchronizing. The neutral person was more likely to be chosen for a hypothetical interaction after being synchronized with faces expressing joy than with faces expressing anger.

Note. χ2(1) = 4.59, p = .032, n = 72

Taken together, in Study 2 we found that repeated phases of movement synchronization between the observed neutral face and emotional faces biased desire for further interaction with the observed person in a way congruent with our hypothesis but did not influence the perceived emotions expressed by that person.

Discussion

The aim of the present studies was to examine how different modes of observed synchronization influence emotional responses towards an emotionally neutral person. Previous research on the topic of observed synchronization showed how it might affect perception of synchronized actors. In a study conducted by Lakens and Stel (2011), the observed movement synchrony increased attribution of entitativity and level of rapport between group members. Following the assumption that synchronized agents tend to be perceived as sharing the same psychological state, here we intended to go a step further and address the question of how synchronization biases the perception of an individual synchronized with others depending on the valence of their facial expressions. Therefore, we developed a method that allows us to estimate the shift in emotional responses towards a neutral person depending on with whom they had been previously synchronized.

In Study 1, participants were asked to observe an emotionally neutral face that was presented either statically or flashing at the same rate as the surrounding faces, which expressed either joy or anger. We expected that synchronization would bias emotional responses to the neutral target face in a direction congruent with the valence of the surrounding faces. The results showed that the effect was present only for the perception of expressed emotions and not for attitudes towards the neutral person. In the synchronization condition, the neutral target face was rated as expressing emotions more congruent with those of the surrounding faces than in the static condition. In-depth analysis revealed that this effect was driven by lower ratings in the synchronization condition than in the static condition when the surrounding faces expressed anger. This result might be interpreted in terms of the concept of negative-positive asymmetry (Peeters & Czapinski, 1990), which suggests that a negative context usually has a stronger influence than a positive one. This may explain why the effect of synchronization on perceived emotions was observed only in the context of negative emotions. Furthermore, contrary to our hypothesis, there was no effect of the observed synchronization on attitudes towards the neutral person as measured by the choice of partner for a hypothetical future interaction.

The results of Study 2, employing repeated synchronization via emergence of movement in a common direction, revealed a different pattern of effects on the same measures. While the observed synchronization with emotional faces did not influence the perceived emotions expressed by the neutral face, it did bias willingness for further interaction in a way congruent with our hypothesis. The observed neutral person was more likely to be chosen for a hypothetical interaction after synchronizing with faces expressing joy than after synchronizing with faces expressing anger. Taken together, our results indicate that perception of and attitudes towards the neutral person are not equally affected by different forms of synchronization with emotional faces. Although we expected the observed synchronization to influence both of these measures, we did not assume that they would necessarily converge.

Our interpretation of the obtained results relies on the assumption that effects at the level of perception and at the level of attitude normally occur under different situational conditions. The perception of faces’ emotional expressions can be biased by a brief exposure to semantically relevant and internally coherent contextual information – including visual, verbal, and auditory cues (Wieser & Brosch, 2012; Wieser et al., 2014). In Study 1, we used only a 10-second exposure interval in which the target face was surrounded by either positive or negative faces flashing at the same rate or remaining static. Under such conditions, we observed the effects of synchronization only for the perception of expressed emotions, but not for attitudes towards the person. We believe that these results may be interpreted in terms of a lack of the conditions that usually underlie attitude formation: attitudes develop over multiple interactions and on larger time scales than in this experiment. Such conditions were, however, present in Study 2, where the neutral target face alternately synchronized with emotionally loaded faces (expressing joy in one condition and anger in the other) in between phases of independent movement of all faces. Our supposition is that this situation enabled the formation of an attitude towards the neutral face, which was not possible in Study 1. Furthermore, the absence of any effect on the perception of expressed emotions in Study 2 could be ascribed to procedural differences. In Study 2, the total time of synchronization with emotional faces was four times shorter than the total time of independent movement. In most studies on neutral face processing, the contextual cues are internally coherent; this was the case in Study 1 (either positive or negative faces and an unchanging mode of exposure), but not in Study 2 (which employed multiple types of emotional faces and alternating phases of synchrony and desynchrony). We believe that the observed episodes of synchronizing with emotional faces in Study 2 could be sufficient to form an attitude towards the neutral face, but the lack of a continually presented and internally coherent perceptual context hindered the effect on the perception of expressed emotions.

Taken together, the results of these studies suggest that the characteristics of those with whom we synchronize might affect how we are perceived. A similar conclusion was reached by Kavanagh et al. (2011); however, our results generalize beyond dyadic interaction and mimicry. Regardless of the form of synchronization, the contextual cues – the emotions expressed by the synchronizing faces – served as an interpretive frame towards which the neutral face was drawn. This might be due to the fact that synchronization facilitates the perceived entitativity of individuals (Lakens, 2010; Lakens & Stel, 2011), which in turn could lead to an assimilation effect (Dasgupta, Banaji, & Abelson, 1999). Therefore, the perceived similarity between an individual and the surrounding people seems to be facilitated by the observed synchrony between them. Our results add to the literature by showing that the emotional characteristics of group members may modulate the perception of a neutral person depending on whether that person synchronizes with the group or not. Yet, it remains to be determined to what extent the effects of observed synchronization generalize beyond inferences about expressed emotions.

It is important to acknowledge the limitations of this research. First, we did not investigate the role of certain presentation parameters, such as the number of emotional faces surrounding the observed neutral face and the duration of particular phases (i.e., synchronous and asynchronous movements in Study 2). Second, we measured the dependent variables only once per condition, whereas employing multiple trials would allow the randomization of stimulus effects and, hence, the estimation of intra-individual variation in responses. Third, including the measurement of reaction times could also provide information about the reliability of participants’ assessments and serve as a criterion for the exclusion of outliers. Additionally, we restricted our attention to the role of the emotional valence of the surrounding faces (joyful and angry) and the two forms of implemented synchronization (synchronous flashing and movement in a common direction). While both studies were designed to test whether emotional responses towards a neutral person would be biased in a direction congruent with the valence of the synchronizing faces, we did not make specific predictions regarding the effects induced by the different forms of synchronization. Future research in this field should take into account the possible factors underlying the different patterns of effects between the two studies. This would help to evaluate the post-hoc interpretation of the obtained results discussed above. Furthermore, we did not include a control condition with non-synchronized flashing faces in Study 1. Such a condition could further clarify whether it is the shared flashing rate that leads to the observed effects (as we assumed) or rather the shared type of action (i.e., flashing, with its temporal parameters being of secondary importance).

In conclusion, we have found that it is not merely spatial proximity that leads to biased evaluation of a person in the context of an accompanying group but rather the observed synchronization between them. Thus, synchrony may serve as a cognitive heuristic that allows us to draw inferences about individuals in a dynamically changing social environment.

Funding

This project was supported by the Polish National Science Centre under grant agreement No. 2011/03/B/HS6/05084 and by the University of Warsaw, Faculty of Psychology (DSM 1167-2017).

Competing Interests

The authors have declared that no competing interests exist.

Acknowledgements

The authors wish to thank Lan Bui-Wrzosińska for her insightful comments on the early stages of the research. We are also grateful to Corinne Gilad and Aliza Sloan for their help in editing text in English.

Data Availability

The data analyzed in this paper are freely available via the PsychArchives repository. For further information see the "Supplementary Materials" section.

Supplementary Materials

Data, codebooks and analysis code (R scripts) are accessible via the PsychArchives repository.

Index of Supplementary Materials

Biesaga, M., Motyka, P., & Nowak, A. (2018). Supplementary materials to: The effects of synchronization with either joyful or angry people on perception of an emotionally neutral person. https://doi.org/10.23668/psycharchives.927

References

  • Bernieri, F. J. (1988). Coordinated movement and rapport in teacher-student interactions. Journal of Nonverbal Behavior, 12(2), 120–138. https://doi.org/10.1007/BF00986930
  • Condon, W. S., & Ogston, W. D. (1971). Speech and body motion synchrony of the speaker–hearer. In D. L. Horton, & J. J. Jenkins (Eds.), Perception of language (pp. 150–184). Columbus, OH, USA: Charles E. Merrill.
  • Dasgupta, N., Banaji, M. R., & Abelson, R. P. (1999). Group entitativity and group perception: Associations between physical features and psychological judgment. Journal of Personality and Social Psychology, 77(5), 991–1003. https://doi.org/10.1037/0022-3514.77.5.991
  • Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2), 124–129. https://doi.org/10.1037/h0030377
  • Giles, H., Coupland, N., Coupland, J., Williams, A., & Nussbaum, J. (1992). Intergenerational talk and communication with older people. International Journal of Aging and Human Development, 34(4), 271–297. https://doi.org/10.2190/TCMU-0U65-XTEH-B950
  • Isabella, R. A., & Belsky, J. (1991). Interactional synchrony and the origins of infant‐mother attachment: A replication study. Child Development, 62(2), 373–384. https://doi.org/10.2307/1131010
  • Kavanagh, L. C., Suhler, C., Churchland, P., & Winkielman, P. (2011). When it’s an error to mirror: The surprising reputational costs of mimicry. Psychological Science, 22(10), 1274–1276. https://doi.org/10.1177/0956797611418678
  • Kulesza, W., & Nowak, A. (2007). Dlaczego Ciebie lubię? Bo się koordynujemy. In K. Winkowska-Nowak, A. Nowak, & A. Rychwalska (Eds.), Modelowanie matematyczne i symulacje komputerowe w naukach społecznych. Warszawa, Poland: Wydawnictwo SWPS Academica.
  • Kulesza, W., Dolinski, D., Wicher, P., & Huisman, A. (2016). The conversational chameleon: An investigation into the link between dialogue and verbal mimicry. Journal of Language and Social Psychology, 35(5), 515–528. https://doi.org/10.1177/0261927X15601460
  • Lakens, D., & Stel, M. (2011). If they move in sync, they must feel in sync: Movement synchrony leads to attributions of rapport and entitativity. Social Cognition, 29(1), 1–14. https://doi.org/10.1521/soco.2011.29.1.1
  • McHugo, G. J., Lanzetta, J. T., Sullivan, D. G., Masters, R. D., & Englis, B. G. (1985). Emotional reactions to a political leader’s expressive displays. Journal of Personality and Social Psychology, 49(6), 1513–1529. https://doi.org/10.1037/0022-3514.49.6.1513
  • Miles, L. K., Nind, L. K., & Macrae, C. N. (2009). The rhythm of rapport: Interpersonal synchrony and social perception. Journal of Experimental Social Psychology, 45(3), 585–589. https://doi.org/10.1016/j.jesp.2009.02.002
  • Natale, M. (1975). Convergence of mean vocal intensity in dyadic communication as a function of social desirability. Journal of Personality and Social Psychology, 32(5), 790–804. https://doi.org/10.1037/0022-3514.32.5.790
  • Newtson, D. (1994). The perception and coupling of behavior waves. In R.R. Vallacher, & A. Nowak (Eds.), Dynamical systems in social psychology (pp. 139–167). San Diego, CA, USA: Academic Press.
  • Peeters, G., & Czapinski, J. (1990). Positive-negative asymmetry in evaluations: The distinction between affective and informational negativity effects. European Review of Social Psychology, 1(1), 33–60. https://doi.org/10.1080/14792779108401856
  • Richardson, M. J., Marsh, K. L., & Schmidt, R. C. (2005). Effects of visual and verbal interaction on unintentional interpersonal coordination. Journal of Experimental Psychology: Human Perception and Performance, 31(1), 62–79. https://doi.org/10.1037/0096-1523.31.1.62
  • Richardson, M. J., Marsh, K. L., Isenhower, R. W., Goodman, J. R., & Schmidt, R. C. (2007). Rocking together: Dynamics of intentional and unintentional interpersonal coordination. Human Movement Science, 26(6), 867–891. https://doi.org/10.1016/j.humov.2007.07.002
  • Riehle, M., Kempkensteffen, J., & Lincoln, T. M. (2017). Quantifying facial expression synchrony in face-to-face dyadic interactions: Temporal dynamics of simultaneously recorded facial EMG signals. Journal of Nonverbal Behavior, 41(2), 85–102. https://doi.org/10.1007/s10919-016-0246-8
  • Sacks, H., Schegloff, E., & Jefferson, G. (1974). A simplest systematics for the organization of turn-taking for conversation. Language, 50(4), 696–735. https://doi.org/10.1353/lan.1974.0010
  • Schmidt, R. C., Carello, C., & Turvey, M. T. (1990). Phase transitions and critical fluctuations in the visual coordination of rhythmic movements between people. Journal of Experimental Psychology: Human Perception and Performance, 16(2), 227–247. https://doi.org/10.1037/0096-1523.16.2.227
  • Valdesolo, P., Ouyang, J., & DeSteno, D. (2010). The rhythm of joint action: Synchrony promotes cooperative ability. Journal of Experimental Social Psychology, 46(4), 693–695. https://doi.org/10.1016/j.jesp.2010.03.004
  • van Ulzen, N. R., Lamoth, C. J., Daffertshofer, A., Semin, G. R., & Beek, P. J. (2008). Characteristics of instructed and uninstructed interpersonal coordination while walking side-by-side. Neuroscience Letters, 432(2), 88–93. https://doi.org/10.1016/j.neulet.2007.11.070
  • Wieser, M. J., & Brosch, T. (2012). Faces in context: A review and systematization of contextual influences on affective face processing. Frontiers in Psychology, 3, Article 471. https://doi.org/10.3389/fpsyg.2012.00471
  • Wieser, M. J., Gerdes, A. B., Bungel, I., Schwarz, K. A., Muhlberger, A., & Pauli, P. (2014). Not so harmless anymore: How context impacts the perception and electrocortical processing of neutral faces. NeuroImage, 92, 74–82. https://doi.org/10.1016/j.neuroimage.2014.01.022
  • Zivotofsky, A. Z., & Hausdorff, J. M. (2007). The sensory feedback mechanisms enabling couples to walk synchronously: An initial investigation. Journal of Neuroengineering and Rehabilitation, 4(1), Article 28. https://doi.org/10.1186/1743-0003-4-28