The Event Causality Identification (ECI) task aims to identify causal relations between events in text, which is beneficial to understanding the precise logical meaning a text expresses. Although existing ECI methods based on fine-tuned Pre-trained Language Models (PLMs) have achieved promising results, they suffer from prohibitive computation costs, catastrophic forgetting of distributional knowledge, and poor interpretability. In low-resource and cross-lingual scenarios in particular, existing multilingual models are generally confronted with the so-called curse of multilinguality and language bias, resulting in low accuracy and weak generalization. In this paper, we propose a paradigm, termed Pre-training with Event Knowledge of ConceptNet (PTEKC), to couple Multilingual Pre-trained Language Models (MPLMs) with external event knowledge for cross-lingual event causality identification. Specifically, we develop a parameter-sharing adapter plugin that integrates event knowledge into the frozen MPLM, which substantially reduces both the number of trainable parameters and the risk of catastrophic forgetting. Our adapter injects multilingually aligned event knowledge into the MPLM through two designed pre-training tasks, namely span masking and self-supervised link prediction. Extensive experiments on the benchmark dataset MECI show that PTEKC is parameter-efficient and effectively incorporates knowledge to improve cross-lingual event causality identification.