Previous studies have shown that neural machine translation (NMT) models can benefit from explicitly modeling translated (PAST) and untranslated (FUTURE) source contents as recurrent states (Zheng et al., 2018). However, this recurrent process is less interpretable and limits the model's ability to capture how the PAST and FUTURE contents are dynamically updated during decoding. In this paper, we propose to model these dynamics explicitly by separating source words into groups of translated and untranslated contents through a parts-to-wholes assignment. The assignment is learned through a novel variant of the routing-by-agreement mechanism (Sabour et al., 2017), namely Guided Dynamic Routing, in which the translation status at each decoding step guides the routing process to assign each source word to its associated group (i.e., translated or untranslated content), each represented by a capsule, enabling translation to be made from a holistic context. Experiments show that our approach achieves substantial improvements over both RNMT and Transformer baselines by producing more adequate translations. Extensive analysis demonstrates that our method is highly interpretable: it recognizes the translated and untranslated contents as expected.
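To make the routing idea concrete, the following is a minimal NumPy sketch of a guidance-conditioned routing-by-agreement loop under simplifying assumptions: only two capsules (translated vs. untranslated), no learned transformation matrices, and a toy guidance term that initializes the routing logits from the agreement between each source word and the current decoder state. The function name and parameterization are illustrative, not the paper's actual implementation.

```python
import numpy as np

def squash(v, eps=1e-8):
    # Capsule non-linearity: keeps direction, maps the norm into [0, 1).
    n2 = np.sum(v * v, axis=-1, keepdims=True)
    return (n2 / (1.0 + n2)) * v / np.sqrt(n2 + eps)

def guided_dynamic_routing(H, s_t, n_iter=3):
    """Toy sketch: route n source word vectors H (n, d) into two capsules
    (translated vs. untranslated), guided by decoder state s_t (d,).
    Returns assignment probabilities c (n, 2) and capsule outputs (2, d)."""
    # Guidance (hypothetical form): initialize logits from word/state agreement,
    # so words aligned with the current decoding step lean toward one capsule.
    b = np.stack([H @ s_t, -(H @ s_t)], axis=1)          # (n, 2) routing logits
    for _ in range(n_iter):
        b_shift = b - b.max(axis=1, keepdims=True)       # stable softmax
        c = np.exp(b_shift) / np.exp(b_shift).sum(axis=1, keepdims=True)
        # Each capsule is a squashed, assignment-weighted sum of source words.
        caps = squash((c[:, :, None] * H[:, None, :]).sum(axis=0))  # (2, d)
        # Routing-by-agreement update: reinforce word-capsule pairs that agree.
        b = b + H @ caps.T
    return c, caps
```

In the full model, the two capsule outputs would serve as holistic representations of translated and untranslated contents that the decoder attends to at each step; here the loop only shows how the guided assignment converges by agreement.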