The recent surge in popularity of real-time RGB-D sensors has encouraged research into combining colour and depth data for tracking. Results from several recent works in RGB-D tracking have demonstrated that state-of-the-art RGB tracking algorithms can be outperformed by approaches that fuse colour and depth, for example [1,3,4,5].

In this paper, we propose a real-time RGB-D tracker which we refer to as the Depth Scaling Kernelised Correlation Filters (DS-KCF) tracker. It is based on, and improves upon, the RGB Kernelised Correlation Filters (KCF) tracker from [2]. KCF uses the 'kernel trick' to extend correlation filters for very fast RGB tracking. The KCF tracker has important characteristics, in particular its ability to combine high accuracy and processing speed, as demonstrated in [2,6]. It is based on a simple processing chain comprising training, detection, retraining and model update by linear interpolation. The key to KCF is that it exploits the properties of circulant matrices to achieve efficient learning, implicitly encoding convolution and allowing it to operate in the Fourier domain using mainly element-wise operations.

The proposed DS-KCF tracker extends the RGB KCF tracker in three ways: (i) an efficient combination of colour and depth features; (ii) a novel and efficient handling of scale changes; and (iii) occlusion handling. These improvements provide higher accuracy while still operating at better than real-time frame rates (35 fps on average). In particular, depth data in the target region is segmented with a fast K-means approach to extract the relevant features of the target's depth distribution. Modelled as a Gaussian distribution, this data allows us to identify scale changes and efficiently model them in the Fourier domain. The advantage of the proposed approach is that only a single target model is kept and updated.
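The Fourier-domain machinery of KCF described above can be sketched in a few lines. This is a minimal, illustrative reading of [2], not the authors' implementation: the function names, the Gaussian kernel bandwidth and the toy patch are our own choices, and real trackers add a cosine window, multi-channel features and model interpolation.

```python
import numpy as np

def gaussian_kernel_correlation(x, z, sigma=0.5):
    # The core KCF trick: the Gaussian kernel evaluated between z and
    # ALL cyclic shifts of x at once, via element-wise products of
    # FFTs instead of explicit circulant matrices.
    xf = np.fft.fft2(x)
    zf = np.fft.fft2(z)
    cross = np.real(np.fft.ifft2(np.conj(xf) * zf))
    d = (np.sum(x ** 2) + np.sum(z ** 2) - 2.0 * cross) / x.size
    return np.exp(-np.maximum(d, 0.0) / sigma ** 2)

def train(x, y, lam=1e-4):
    # Kernel ridge regression solved in the Fourier domain:
    # alpha_hat = y_hat / (k_hat + lambda), an element-wise division.
    k = gaussian_kernel_correlation(x, x)
    return np.fft.fft2(y) / (np.fft.fft2(k) + lam)

def detect(alpha_hat, x_model, z):
    # Response map over all cyclic shifts of the search patch z;
    # its peak gives the target's translation.
    k = gaussian_kernel_correlation(x_model, z)
    return np.real(np.fft.ifft2(alpha_hat * np.fft.fft2(k)))

# Toy usage: a Gaussian regression target peaked at the patch centre.
np.random.seed(0)
h, w = 32, 32
yy, xx = np.mgrid[0:h, 0:w]
y = np.exp(-((yy - h // 2) ** 2 + (xx - w // 2) ** 2) / (2 * 2.0 ** 2))
patch = np.random.randn(h, w)
alpha_hat = train(patch, y)
response = detect(alpha_hat, patch, patch)
peak = np.unravel_index(np.argmax(response), response.shape)
```

Because every step is an FFT or an element-wise operation, training and detection both run in O(n log n) in the patch size, which is what gives KCF its speed.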
Furthermore, the region's depth distribution enables the detection of possible occlusions, identified as sudden changes in the target region's depth histogram, and the recovery of lost tracks by searching for the unoccluded object in specifically identified key areas. During an occlusion, the model is not updated, and the occluding object is tracked to guide the target's search space.
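An occlusion test of this kind can be sketched as follows. This is a simplified stand-in, not the paper's exact procedure: the paper segments the region's depth with fast K-means and keeps the cluster nearest the camera, whereas this sketch models the target's depth with a plain mean and standard deviation; the function names and the 0.35 threshold are illustrative assumptions.

```python
import numpy as np

def fit_target_depth(depth_patch):
    # Gaussian model (mean, std) of the target's depth distribution.
    # The paper first segments the region with fast K-means and keeps
    # the cluster nearest the camera; a plain mean/std over the patch
    # is used here as a simplified stand-in.
    return float(depth_patch.mean()), float(depth_patch.std()) + 1e-6

def occlusion_ratio(depth_patch, mu, sigma, k=3.0):
    # Fraction of pixels significantly CLOSER than the modelled target:
    # a sudden rise in this fraction signals an occluding object
    # entering the target region.
    return float(np.mean(depth_patch < mu - k * sigma))

def is_occluded(depth_patch, mu, sigma, thresh=0.35):
    # Hypothetical threshold: flag an occlusion when over a third of
    # the region sits in front of the target's depth model.
    return occlusion_ratio(depth_patch, mu, sigma) > thresh

# Toy usage: a target surface at about 2.0 m, then an occluder at
# about 1.0 m covering the left half of the region.
np.random.seed(1)
target = 2.0 + 0.02 * np.random.randn(40, 40)            # depths in metres
mu, sigma = fit_target_depth(target)
occluded = target.copy()
occluded[:, :20] = 1.0 + 0.01 * np.random.randn(40, 20)  # occluder enters
```

On a detected occlusion, following the text above, the target model would simply stop being updated while the occluder itself is tracked to constrain where the target is searched for on reappearance.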