In a network of dynamical systems, concurrent synchronization is a regime where multiple groups of fully synchronized elements coexist. In the brain, concurrent synchronization may occur at several scales, with multiple "rhythms" interacting and functional assemblies combining neural oscillators of many different types. Mathematically, stable concurrent synchronization corresponds to convergence to a flow-invariant linear subspace of the global state space. We derive a general condition for such convergence to occur globally and exponentially. We also show that, under mild conditions, global convergence to a concurrently synchronized regime is preserved under basic system combinations such as negative feedback or hierarchies, so that stable concurrently synchronized aggregates of arbitrary size can be constructed. Robustness of stable concurrent synchronization to variations in individual dynamics is also quantified. Simple applications of these results to classical questions in systems neuroscience and robotics are discussed.
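To make the geometric picture concrete, a minimal illustration (notation ours, not taken from the paper): if the network has $n$ elements with states $x_1,\dots,x_n \in \mathbb{R}^d$ and the concurrently synchronized regime partitions them into groups, the corresponding subspace can be written as
\[
\mathcal{M} \;=\; \bigl\{\, \mathbf{x} = (x_1,\dots,x_n) \in \mathbb{R}^{nd} \;:\; x_i = x_j \ \text{whenever elements } i \text{ and } j \text{ belong to the same group} \,\bigr\},
\]
a linear subspace of the global state space. Flow-invariance means that any trajectory starting on $\mathcal{M}$ remains on $\mathcal{M}$; stable concurrent synchronization, in the sense used above, means that all trajectories converge to $\mathcal{M}$ globally and exponentially.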