The problem of distributed data compression for function computation is considered, where (i) the function to be computed is not necessarily a symbol-wise function and (ii) the information source has memory and may be neither stationary nor ergodic. We introduce the class of smooth sources and give a sufficient condition on functions such that the achievable rate region for computing coincides with the Slepian-Wolf region (i.e., the rate region for reproducing the entire source) for any smooth source. Moreover, for symbol-wise functions, a necessary and sufficient condition for the coincidence is established. Our result for the full side-information case generalizes the result by Ahlswede and Csiszár to sources with memory; our dichotomy theorem differs from Han and Kobayashi's dichotomy theorem, which reveals an effect of memory in distributed function computation. All results are given not only for fixed-length coding but also for variable-length coding in a unified manner. Furthermore, for the full side-information case, the error probability in the moderate deviation regime is also investigated.
Index Terms—distributed computing, information-spectrum method, Slepian-Wolf coding
I. INTRODUCTION

We study the problem of distributed data compression for function computation described in Fig. 1 and Fig. 2, where the function to be computed is not necessarily a symbol-wise function. In [1], Körner and Marton revealed that the achievable rate region for computing the modulo-sum is strictly larger than the rate region that can be achieved by first applying Slepian-Wolf coding [2] and then computing the function.1 Since then, distributed coding schemes tailored to certain classes of functions have been studied (e.g., see [3, Chapter 21]). These are cases in which the structure of the function can be exploited for distributed coding. However, not all functions have such nice structure, and even for some classes of functions, it is known that the Slepian-Wolf region cannot be improved at