In distributed, cooperative Internet of Things (IoT) settings, sensing devices must communicate in a resource-aware fashion to achieve a diverse set of tasks (e.g., event detection, image classification). In such settings, we continue to see a shift from cloud-centric to edge-centric architectures for data processing, inference and actuation. Distributed edge inference techniques address the real-time, connectivity, network-bandwidth and latency challenges of spatially distributed IoT applications. Achieving efficient, resource-aware communication in such systems is a longstanding challenge, and many current approaches require complex, hand-engineered communication protocols. In this paper, we present a novel, scalable, data-driven and communication-efficient Convolutional Recurrent Neural Network (C-RNN) framework for distributed tasks. We provide empirical and systematic analyses of model convergence, node scalability and communication cost on dynamic network graphs. Furthermore, we show that our framework solves distributed image classification tasks via automatically learned communication.