Due to the limitations of cooling methods such as fans and heat sinks, dynamic thermal management (DTM) is widely adopted to manage the temperature of computing systems. However, applying DTM can reduce system performance and thereby degrade the quality of real-time applications. Real-time video encoding, which has high computational demands and hard deadlines, is a widely used application that can be severely affected by DTM. We study the effect of DTM on a widely used H.264 video encoder and formulate a multidimensional optimization problem to maximize video quality and minimize bit rate while ensuring that the video encoder can run in real time despite DTM effects. We model the effects of adapting encoding parameters on video quality, bit rate, and encoder speed. We propose a dynamic application adaptation method that efficiently solves the optimization problem by optimally adapting the encoding parameters in response to DTM effects. In addition, we show that the proposed dynamic application adaptation method reduces the need for cooling methods such as forced convection cooling. We implement the proposed approach on an Intel® Core™ 2 Duo platform where dynamic voltage and frequency scaling (DVFS) is used for DTM. Our measurements with several videos reveal that applying DTM significantly degrades video quality. With the proposed adaptation algorithm, however, the encoder runs in real time, and the quality loss is minimized with only a marginal increase in bit rate.
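For illustration only (the symbols below are assumptions introduced here, not the paper's notation), the optimization can be sketched as choosing an encoding-parameter vector $\theta$ that trades off quality $Q(\theta)$ against bit rate $R(\theta)$ subject to a real-time constraint at the DTM-throttled operating frequency $f_{\mathrm{DTM}}$:
$$
\max_{\theta \in \Theta} \; Q(\theta) - \lambda\, R(\theta)
\quad \text{s.t.} \quad
T_{\mathrm{enc}}(\theta, f_{\mathrm{DTM}}) \le \frac{1}{F},
$$
where $T_{\mathrm{enc}}$ denotes the per-frame encoding time under the current DTM state, $F$ is the target frame rate, and $\lambda$ weights bit rate against quality.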