Recent developments in deep learning have contributed to numerous success stories in healthcare. The performance of a deep learning model generally improves with the size of the training data. However, privacy, ownership, and regulatory issues prevent combining medical data into traditional centralized storage. Decentralized learning approaches enable collaborative model training by distributing the learning process among several nodes or devices. Conceptually, decentralized learning builds on earlier work in distributed optimization, but the focus of this paper is on recent and emerging techniques such as Federated Learning (FL), Split Learning (SL), and hybrid Split-Federated Learning (SFL). FL overcomes the difficulties of centralized training by having clients collaboratively train a common, global deep learning model whose updates are combined by a centralized aggregator server. Additionally, patient data remains with the local party, upholding the security and anonymity of the data. SL enables machine learning without directly accessing data on clients or end devices; it further enhances privacy in a decentralized setting and mitigates clients' storage constraints. In this paper, we first provide a contemporary survey of FL, SL, and SFL approaches. Second, we discuss their state-of-the-art applications in healthcare, particularly in medical image analysis. Third, we review these emerging decentralized learning approaches under challenging conditions such as statistical and system heterogeneity, privacy preservation, communication efficiency, and fairness, and we address existing approaches to tackling these challenges. We then detail complications unique to healthcare applications, including data, privacy and security, and communication challenges.
Finally, we outline potential areas for further research on emerging decentralized learning techniques in healthcare, including developing personalized models, reducing bias, handling hybrid non-IID features, hyperparameter tuning, designing effective incentive mechanisms, and incorporating domain expert knowledge.
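The FL aggregation step mentioned above, in which a central server combines locally trained model updates while patient data stays on each client, is commonly realized with federated averaging (FedAvg). The sketch below is a minimal, illustrative implementation under simplifying assumptions (model weights as flat lists of floats); the function name and data are hypothetical and not taken from the survey.

```python
# Minimal sketch of federated averaging (FedAvg): the server combines
# per-client weight vectors, weighted by each client's local dataset
# size, so raw patient data never leaves the client. Illustrative only.

def fed_avg(client_weights, client_sizes):
    """Weighted average of client weight vectors.

    client_weights: list of equal-length lists of floats (one per client)
    client_sizes:   number of local training samples at each client
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    aggregated = [0.0] * dim
    for weights, n in zip(client_weights, client_sizes):
        for i in range(dim):
            aggregated[i] += (n / total) * weights[i]
    return aggregated

# Example: two hypothetical hospitals with 30 and 10 local samples.
w_global = fed_avg([[1.0, 2.0], [3.0, 4.0]], [30, 10])
# → [1.5, 2.5]
```

In practice each round also includes broadcasting the global model, several local SGD epochs per client, and often secure aggregation; the weighting by dataset size is what keeps the global model from being dominated by small, unrepresentative cohorts.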