Parallel independence between transformation steps is a basic and well-understood notion of the algebraic approaches to graph transformation, and typically guarantees that the two steps can be applied in either order, yielding the same resulting graph up to isomorphism. The concept has been redefined for several algebraic approaches as variations of a classical "algebraic" condition, requiring that each matching morphism factorizes through the context graph of the other transformation step. However, looking at some classical papers on the double-pushout approach, one finds that the original definition of parallel independence was formulated in set-theoretical terms, requiring that the intersection of the images of the two left-hand sides in the host graph is contained in the intersection of the images of the two interface graphs. This position paper discusses the relationship between this definition and the standard algebraic one, both for left-linear and for non-left-linear rules.
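For concreteness, the two conditions can be sketched under standard DPO notation, which is assumed here rather than fixed by the abstract: rules $p_i = (L_i \xleftarrow{l_i} K_i \xrightarrow{r_i} R_i)$, matches $m_i \colon L_i \to G$, and context graphs $D_i$ with morphisms $g_i \colon D_i \to G$ from the corresponding pushout complements.
% Sketch only: notation (p_i, m_i, D_i, g_i) is assumed, not taken from the abstract.
\[
  \text{(set-theoretic)}\qquad
  m_1(L_1) \cap m_2(L_2) \;\subseteq\; m_1(l_1(K_1)) \cap m_2(l_2(K_2))
\]
\[
  \text{(algebraic)}\qquad
  \exists\, h_1 \colon L_1 \to D_2,\; h_2 \colon L_2 \to D_1
  \ \text{such that}\
  g_2 \circ h_1 = m_1 \ \text{and}\ g_1 \circ h_2 = m_2 .
\]
The first condition compares subgraphs of the host graph $G$; the second asks each match to factorize through the context graph of the other step, as in the standard algebraic formulation recalled above.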