Today, most applications continuously produce information in the form of streams, owing to the proliferation of data-collection mechanisms. Sensors and social networks gather an immense variety and volume of data, from different real-life situations and at considerable velocity. Increasingly, applications require processing heterogeneous data streams from different sources together with large volumes of background knowledge; using only the information carried on the data stream is not sufficient for many use cases. Semantic Complex Event Processing (SCEP) systems have evolved from classical rule-based CEP systems by integrating high-level knowledge representation and RDF stream processing, exploiting both the data stream and static background knowledge. Moreover, classical CEP approaches lack the capability to semantically interpret and analyze data, which SCEP attempts to address. SCEP nevertheless has several limitations, one of which is its high processing time. This paper provides a conceptual model and an implementation of an infrastructure for distributed SCEP, in which each SCEP operator can process part of the data and forward its results to other SCEP operators in order to compute an answer. We show that by splitting the RDF stream processing and the background knowledge across SCEP operators, it is possible to considerably reduce processing time.