Many pervasive computing applications rely on sensor data streams, whose accuracy can vary significantly. Depending on the application, deriving information (e.g., higher-level context) from low-quality sensor data might lead to wrong decisions or even critical situations. Thus, it is important to control quality throughout the entire data stream processing pipeline, from the raw sensor data up to the derived information, e.g., a complex event. In this paper, we present a uniform metadata model that represents sensor data and information quality at all levels of processing; we show how this metadata model can be integrated into a data stream processing engine to ease the development of quality-aware applications; and we present an approach for learning probability distributions of incoming sensor data that requires no prior knowledge. We demonstrate and evaluate our approach in a real-world scenario.