A fundamental characteristic of social exchanges is the synchronization of individuals' behaviors, physiological responses, and neural activity. However, how the emotional content and associative knowledge expressed during communication influence interpersonal synchrony has so far been scarcely investigated. This study addresses this gap by bridging recent advances in cognitive neuroscience, affective computing, and cognitive data science. Using functional near-infrared spectroscopy (fNIRS) hyperscanning, we collected prefrontal neural data during social interactions involving 84 participants (i.e., 42 dyads) aged 18–35 years. Wavelet transform coherence was used to assess interpersonal neural synchrony between participants. Dialogues were transcribed manually and then codified with automated methods into emotional levels and syntactic/semantic networks. Our quantitative findings reveal above-chance levels of interpersonal neural synchrony in the superior frontal gyrus (p = 0.020) and the bilateral middle frontal gyri (p < 0.001; p = 0.002). Stepwise models based solely on the dialogues' emotional content significantly predicted interpersonal neural synchrony in the medial (R² = 14.13%) and left prefrontal cortex (R² = 8.25%). Conversely, models relying on semantic features were more effective for the right prefrontal cortex (R² = 18.30%). Overall, emotional content emerged as the most accurate predictor of synchrony. However, we found an interplay between emotions and associative knowledge during role reversal (i.e., a clinical technique involving perspective-taking), providing quantitative support for accounts of empathy in social interactions that emphasize both affective and cognitive computations. Our study identifies a mind-brain duality in which emotions and associative knowledge reflect levels of neural synchrony, opening new avenues for investigating human interactions.
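For readers who want to see the core synchrony metric in computational form, the sketch below illustrates wavelet transform coherence between two simulated fNIRS-like time series. This is a minimal, self-contained numpy illustration, not the authors' analysis pipeline: the Morlet mother wavelet (ω0 = 6), the smoothing operators, the 7.8 Hz sampling rate, and the scale range are all illustrative assumptions, and cone-of-influence handling and significance testing (e.g., the permutation baseline behind the above-chance comparisons) are omitted.

```python
import numpy as np


def morlet_cwt(x, dt, scales, omega0=6.0):
    """Continuous wavelet transform with a complex Morlet mother wavelet."""
    W = np.empty((scales.size, x.size), dtype=complex)
    for i, s in enumerate(scales):
        m = int(min(10 * s / dt, x.size))  # truncate the wavelet support
        t = np.arange(-m, m + 1) * dt / s
        psi = np.pi ** -0.25 * np.exp(1j * omega0 * t - t ** 2 / 2)
        # Correlation of the signal with the scaled, normalized wavelet.
        W[i] = np.convolve(x, np.conj(psi[::-1]), mode="same") * np.sqrt(dt / s)
    return W


def smooth(P, dt, scales):
    """Scale-dependent Gaussian smoothing in time, boxcar across scales."""
    S = np.empty_like(P)
    for i, s in enumerate(scales):
        sigma = s / dt  # smoothing window widens with scale
        k = np.arange(-int(3 * sigma), int(3 * sigma) + 1)
        g = np.exp(-0.5 * (k / sigma) ** 2)
        S[i] = np.convolve(P[i], g / g.sum(), mode="same")
    out = np.empty_like(S)
    for i in range(S.shape[0]):  # average over adjacent scales
        out[i] = S[max(0, i - 1):i + 2].mean(axis=0)
    return out


def wtc(x, y, dt, scales):
    """Wavelet transform coherence between two equally sampled signals."""
    Wx, Wy = morlet_cwt(x, dt, scales), morlet_cwt(y, dt, scales)
    s = scales[:, None]
    Sxy = smooth(Wx * np.conj(Wy) / s, dt, scales)
    Sxx = smooth(np.abs(Wx) ** 2 / s, dt, scales)
    Syy = smooth(np.abs(Wy) ** 2 / s, dt, scales)
    return np.abs(Sxy) ** 2 / (Sxx * Syy)  # values in [0, 1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs = 7.8                    # Hz; an illustrative fNIRS sampling rate
    dt = 1.0 / fs
    t = np.arange(0, 300, dt)   # five minutes of signal
    shared = np.sin(2 * np.pi * t / 20)         # shared 20 s rhythm
    x = shared + rng.normal(size=t.size)        # "participant A" channel
    y = shared + rng.normal(size=t.size)        # "participant B" channel
    scales = dt * 2.0 ** np.arange(3, 10, 0.5)  # roughly 1 s to 90 s
    C = wtc(x, y, dt, scales)
    band = (scales > 10) & (scales < 40)        # scales around the 20 s rhythm
    print(f"Mean coherence near the shared rhythm: {C[band].mean():.2f}")
    print(f"Mean coherence elsewhere: {C[~band].mean():.2f}")
```

In an actual hyperscanning analysis, an established implementation (e.g., the pycwt Python package or MATLAB's wcoherence) would be preferred over a hand-rolled version such as this one.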