Recent progress in the development of unmanned aerial vehicles (UAVs) has led to an increasing number of situations in which drones such as quadcopters or octocopters pose a serious potential threat or can be used as powerful tools for illegal activities. Counter-UAV systems are therefore required in many applications to detect approaching drones as early as possible. In this paper, an efficient and robust algorithm for UAV detection using static VIS and SWIR cameras is presented. Whereas high-resolution VIS cameras enable UAV detection at greater distances during the daytime, surveillance at night can be performed with a SWIR camera. First, a background estimation and structurally adaptive change detection step detects movements and other changes in the observed scene. Then, the local density of changes is computed; it is used both for background density learning and to build up the foreground model, and the two models are compared to obtain the final UAV alarm result. The density model serves two purposes: on the one hand, it filters out noise effects; on the other hand, moving scene parts such as leaves swaying in the wind or cars driving on a street can easily be learned, so that such areas are masked out and false alarms there are suppressed. This scene learning is performed automatically, simply by processing the scene without UAVs present in order to capture the normal situation. The given results document the performance of the presented approach in VIS and SWIR in different situations.
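The processing chain described above can be sketched in a few lines of Python. This is a minimal illustration under our own assumptions, not the paper's implementation: we assume a running-average background model, absolute-difference change detection, a box-filter mean as the local change density, and a per-pixel maximum of observed densities as the learned background density model; all class and parameter names are hypothetical.

```python
import numpy as np


class UAVDetector:
    """Illustrative sketch of the abstract's pipeline: background estimation,
    change detection, change-density computation, and a per-pixel background
    density model that masks out habitually moving scene parts."""

    def __init__(self, shape, alpha=0.05, change_thresh=25.0,
                 density_radius=2, density_margin=0.2):
        self.background = np.zeros(shape, np.float32)   # running-average background
        self.bg_density = np.zeros(shape, np.float32)   # learned normal change density
        self.alpha = alpha                  # background update rate (assumed value)
        self.change_thresh = change_thresh  # gray-value change threshold (assumed)
        self.radius = density_radius        # half-width of the density window
        self.margin = density_margin        # how far above normal density to alarm
        self.initialized = False

    def _density(self, mask):
        """Local density of changes: mean of the change mask in a box window."""
        k = 2 * self.radius + 1
        pad = np.pad(mask, self.radius, mode="edge")
        h, w = mask.shape
        out = np.zeros_like(mask, np.float32)
        for dy in range(k):
            for dx in range(k):
                out += pad[dy:dy + h, dx:dx + w]
        return out / (k * k)

    def learn(self, frame):
        """Process a UAV-free frame to capture the normal situation."""
        frame = frame.astype(np.float32)
        if not self.initialized:
            self.background[:] = frame
            self.initialized = True
            return
        changes = np.abs(frame - self.background) > self.change_thresh
        density = self._density(changes.astype(np.float32))
        # Remember the highest change density seen per pixel (normal motion
        # such as swaying leaves raises it, so those areas get masked out).
        np.maximum(self.bg_density, density, out=self.bg_density)
        self.background += self.alpha * (frame - self.background)

    def detect(self, frame):
        """Return a boolean alarm mask where the foreground change density
        clearly exceeds the learned background density."""
        frame = frame.astype(np.float32)
        changes = np.abs(frame - self.background) > self.change_thresh
        density = self._density(changes.astype(np.float32))
        return density > self.bg_density + self.margin
```

A short usage example: after learning a few UAV-free frames of a static scene, a compact bright object entering the field of view produces a dense cluster of changes and triggers the alarm mask, while the unchanged scene does not.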