Neural mechanisms that arbitrate between integrating and segregating multisensory information are essential for complex scene analysis and for resolving the multisensory correspondence problem. However, these mechanisms and their dynamics remain largely unknown, partly because classical models of multisensory integration are static. Here, we used the Multisensory Correlation Detector, a model that provides good explanatory power for human behavior while incorporating dynamic computations. Participants judged whether sequences of auditory and visual signals originated from the same source (causal inference) or whether one modality led the other (temporal order) while their brain activity was recorded with magnetoencephalography. To test the match between the dynamics of the Multisensory Correlation Detector and the magnetoencephalographic recordings, we developed a novel dynamic encoding-model approach for electrophysiological activity based on temporal response functions. First, we confirmed that the Multisensory Correlation Detector explains causal inference and temporal order patterns well. Second, we found strong fits of brain activity to the two outputs of the Multisensory Correlation Detector in temporo-parietal cortices, a region with known multisensory integrative properties. Finally, we found an asymmetry in the goodness of fit, which was more reliable during the causal inference task than during the temporal order judgment task. Overall, our results suggest the plausible existence of multisensory correlation detectors in the human brain, which would explain why and how causal inference is strongly driven by the temporal correlation of multisensory signals.
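To make the two model outputs concrete, the following is a minimal, illustrative sketch of a Reichardt-style multisensory correlation detector, not the authors' implementation: it assumes first-order exponential low-pass filters, an arbitrary time constant `tau`, and impulse-train inputs; the function and variable names are hypothetical. Each modality is filtered once (fast branch) and twice (slow branch); the two subunit products are multiplied to yield a correlation output (driving causal inference) and subtracted to yield a lag output (driving temporal order).

```python
import numpy as np

def lowpass(x, tau=0.1, dt=0.001):
    """First-order exponential low-pass filter (simple IIR), applied sample by sample."""
    alpha = dt / (tau + dt)
    y = np.zeros_like(x, dtype=float)
    acc = 0.0
    for i, xi in enumerate(x):
        acc += alpha * (xi - acc)
        y[i] = acc
    return y

def mcd(audio, video, tau=0.1, dt=0.001):
    """Illustrative multisensory correlation detector (tau and filter form are assumptions).

    Returns (mcd_corr, mcd_lag): a correlation output and a lag output over time.
    """
    # Fast branch: each modality filtered once.
    a_fast = lowpass(audio, tau, dt)
    v_fast = lowpass(video, tau, dt)
    # Slow (delayed) branch: filtered a second time.
    a_slow = lowpass(a_fast, tau, dt)
    v_slow = lowpass(v_fast, tau, dt)
    # Two Reichardt-like subunits multiply opposite fast/slow branches.
    u_av = a_fast * v_slow   # larger when video leads audio's slow trace
    u_va = v_fast * a_slow   # larger when audio leads video's slow trace
    mcd_corr = u_av * u_va   # correlation output: high for temporally correlated signals
    mcd_lag = u_va - u_av    # lag output: sign indicates which modality led
    return mcd_corr, mcd_lag
```

Feeding the detector a pair of impulses, the integrated lag output changes sign with the leading modality, while the integrated correlation output shrinks as the audiovisual asynchrony grows, which is the behavior the abstract links to temporal order and causal inference judgments, respectively.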