Predicting timing behaviour is essential for the design of embedded real-time systems that can switch between different operational modes at runtime. The settling time of a mode change, called the mode change transition latency, is an important system parameter. Known approaches to timing analysis for multi-mode real-time systems are restricted to applications without communicating tasks. Moreover, they assume that transitions are initiated only during a steady state, yet do not indicate when a system actually executes in a steady state. In this paper, we present an analysis algorithm that computes an upper bound on each mode change transition latency of multi-mode distributed applications, thereby overcoming the limitations of previous work. We explain the algorithm, prove its correctness, illustrate its steps, and provide experimental data demonstrating its usefulness.