Personalized predictive analytics are changing the way people live, with applications such as recommendation, e-health, and chatbots. With the rapid development of edge devices, predictive models based on deep neural networks (DNNs) can be conveniently hosted on-device to deliver accurate results with better privacy and timeliness. Due to the inevitable data sparsity on a single device, collaborative learning (e.g., federated learning) is regarded as an integral approach for learning performant personalized deep models. However, traditional decentralized learning protocols communicate knowledge by sharing model parameters (i.e., weights or gradients). Consequently, such methods strictly require homogeneity of all participant models, and thus cannot handle applications where heterogeneous deep models with different architectures are built for the same task to suit a variety of device capacities. Furthermore, to support model personalization, a common practice is to pick a curated subset of similar users (i.e., neighbors) for knowledge aggregation, but the neighbor selection process potentially threatens user privacy because it requires exchanging highly sensitive personal information or model parameters. In this paper, we propose a Similarity-based Decentralized Knowledge Distillation (SD-Dist) framework for collaboratively learning heterogeneous deep models on decentralized devices. By introducing a preloaded reference dataset, SD-Dist enables all participant devices to identify similar users and distil knowledge from them without any assumption of a fixed model architecture. In addition, none of these operations reveals sensitive information such as personal data or model parameters. Extensive experiments on three real-life datasets show that SD-Dist achieves competitive performance with fewer compute resources, while ensuring model heterogeneity and privacy. As revealed in our experiments, our framework also enhances the robustness of the resultant models when users' data is sparse and diverse.
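To make the core idea concrete, the following is a minimal illustrative sketch of similarity-based knowledge distillation over a shared reference dataset: each device runs its own (possibly heterogeneous) model on the preloaded reference set, estimates neighbor similarity from the resulting soft predictions rather than from raw data or parameters, and distils from a similarity-weighted mixture of neighbor predictions. The helper names, the cosine similarity metric, the temperature, and the KL-based loss are assumptions made for illustration only and are not the paper's exact protocol.

```python
# Hypothetical sketch (not the SD-Dist specification): similarity estimation and
# knowledge distillation via soft predictions on a shared reference dataset.
import torch
import torch.nn.functional as F


def reference_soft_predictions(model, reference_x, temperature=2.0):
    """Soft predictions of a local model on the shared reference set;
    only these predictions (never weights or personal data) would be exchanged."""
    model.eval()
    with torch.no_grad():
        logits = model(reference_x)
    return F.softmax(logits / temperature, dim=-1)


def neighbour_similarity(own_preds, peer_preds):
    """Cosine similarity between flattened reference-set predictions,
    usable as a privacy-preserving proxy for user similarity."""
    a = own_preds.flatten().unsqueeze(0)
    b = peer_preds.flatten().unsqueeze(0)
    return F.cosine_similarity(a, b).item()


def distillation_loss(student_logits, neighbour_preds, similarities, temperature=2.0):
    """KL distillation from a similarity-weighted mixture of neighbours'
    soft predictions on the same reference batch."""
    weights = torch.tensor(similarities)
    weights = weights / weights.sum()
    # Aggregate neighbour predictions according to similarity weights.
    teacher = sum(w * p for w, p in zip(weights, neighbour_preds))
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_student, teacher, reduction="batchmean") * temperature ** 2
```

In such a setup, each device would keep only its top-k most similar peers as neighbors and add the distillation term to its local training objective, so knowledge flows between architecturally different models without any parameter sharing.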