Today, organizations capture and store vast amounts of data from their interactions with clients, internal information systems, technical systems, and sensors. Data captured this way holds many useful insights that can be uncovered by various analytical procedures and methods. Discovering regular and irregular data sequences in the captured data can reveal the processes an organization performs, which can then be assessed, measured, and optimized to improve overall performance, lower costs, resolve bottlenecks, detect potentially fraudulent activities, and more. Beyond process discovery, captured data sequences can provide additional behavioral and trend insights into various aspects of the organization, such as sales dynamics and customer behavior. The challenge is that most captured data intertwines multiple processes, customers, cases, and products in a single data log or data stream. In this article, we propose an evolving tokenized transducer (ETT) capable of learning data sequences from a multi-contextual data log or stream. The proposed ETT is a semi-supervised relational learning method that can be used as a classifier on an unknown data log or stream, revealing previously learned data sequences. The ETT was tested on multiple synthetic and real-life cases and datasets, including a dataset of retail sales sequences, a hospital process log of septic patient treatment, and the BPI Challenge 2019 dataset. The test results are positive, showing the ETT to be a promising process discovery method.