“…Raw videos from the experiment were rendered into split-screen presentations showing both P1 and P2. Eleven categories of synchrony were manually coded via C-BAS by trained coders (n = 5): laughing along with the partner, gestural mirroring, postural mirroring, eye synchrony (e.g., both participants rolling their eyes, looking up, or looking down), head nods/shakes, verbal repetition, vocal synchrony (i.e., matching of vocal features), lower facial mimicry (e.g., synchrony in smiling or frowning), upper facial mimicry (e.g., eyebrow synchrony), synchrony of temporal behaviors, and other instances of synchrony (see also Dunbar et al., 2020). Coders assessed these types of local synchrony, which indicate the occurrence of matching for micro-units of behavior (Dunbar et al., 2014).…”