Virtual laboratories are increasingly used in tertiary education for the natural and applied sciences, a trend accelerated by the COVID-19 pandemic, and have attracted substantial investment in the corresponding software, including simulated experiments and procedures. However, analyzing, understanding, modeling, and implementing virtual experiments is expensive and time-consuming, both when new experiments must be created from scratch and when existing ones must be redesigned for an audience in a different educational setting. We use UML Activity Diagrams and Petri nets to model experimental procedures and then apply conformance checking to detect possible nonconformities between expected model behavior and actual model execution. As a result, we provide an estimate of the conceptual proximity between experiments performed in different educational settings with the same virtual laboratory software, offering educators and developers a systematic, formal way to evaluate software applicability and thus to make informed decisions about reuse and redesign. A virtual microscopy experiment served as a case study for validation. The results revealed that this virtual lab software can be ported, without modification, from tertiary to secondary education and still achieve learning outcomes relevant to that level, even though it was originally designed for a distance education university. The proposed framework has potential applications beyond virtual laboratories: as a general approach that combines process modeling with conformance checking, it can evaluate the similarity between the specification of an experimental procedure and actual execution logs in various domains.
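To make the conformance-checking step concrete, the following is a minimal sketch of token-based replay, a standard conformance technique: each logged trace of an experiment is replayed on a Petri-net model of the procedure, and token mismatches are counted. This is an illustrative toy, not the tooling used in the study; the three-step net, its place and transition names, and the example traces are invented for illustration, while the fitness formula follows the usual token-replay definition.

```python
def replay_fitness(transitions, trace, initial, final):
    """Token-based replay fitness of one trace against a Petri net.

    transitions: label -> (input places, output places)
    initial/final: place -> token count of the initial/final marking
    Returns a value in [0, 1]; 1.0 means the trace fits the model.
    """
    marking = dict(initial)
    produced = sum(initial.values())      # initial tokens count as produced
    consumed = missing = 0
    for act in trace:
        inputs, outputs = transitions[act]
        for place in inputs:              # consume inputs, noting missing tokens
            if marking.get(place, 0) > 0:
                marking[place] -= 1
            else:
                missing += 1              # token had to be inserted artificially
            consumed += 1
        for place in outputs:             # produce output tokens
            marking[place] = marking.get(place, 0) + 1
            produced += 1
    for place, n in final.items():        # finally, consume the end marking
        available = marking.get(place, 0)
        missing += max(0, n - available)
        marking[place] = max(0, available - n)
        consumed += n
    remaining = sum(marking.values())     # leftover tokens signal deviation
    return 0.5 * (1 - missing / consumed) + 0.5 * (1 - remaining / produced)


# Hypothetical linear microscopy procedure: place slide -> focus -> observe.
NET = {
    "place_slide": (["p0"], ["p1"]),
    "focus":       (["p1"], ["p2"]),
    "observe":     (["p2"], ["p3"]),
}
INITIAL, FINAL = {"p0": 1}, {"p3": 1}

ok = replay_fitness(NET, ["place_slide", "focus", "observe"], INITIAL, FINAL)
skipped = replay_fitness(NET, ["place_slide", "observe"], INITIAL, FINAL)
print(ok, skipped)   # the conforming trace scores 1.0; skipping "focus" scores lower
```

Averaging such fitness scores over the logs of two student cohorts gives one simple way to quantify how closely their executions follow the same specified procedure, which is the kind of proximity estimate the framework relies on.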