Contemporary camera-equipped wearable devices, such as smartphones and tablets, are now powerful enough to serve as ubiquitous remote sources of live video streams. However, because these devices may operate in various positions, including upside down or rotated, encoding the captured video into a data format suitable for live streaming also involves rotating the video frames to compensate for the device rotation. In this paper, we analyze the impact of video rotation procedures on the frame rate of the produced live video. MJPEG is used as the video encoding format. We compare the performance of two video rotation methods: classical pixel-level rotation based on a rotation matrix, and JPEG tagging using EXIF metadata. The results show that EXIF-based tagging significantly outperforms matrix-based rotation, and its advantage grows as the video resolution increases. However, since EXIF metadata is not yet widely supported in modern web browsers, matrix-based rotation remains the preferable option when compatibility with a broad user community is required.
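As an illustration of the two rotation methods compared here, the following is a minimal sketch in Python, assuming the Pillow and piexif libraries and a single MJPEG frame stored as frame.jpg; the library choice, file names, and timing code are illustrative assumptions, not the paper's implementation:

```python
import time
from PIL import Image   # assumed dependency (pixel-level rotation)
import piexif           # assumed dependency (EXIF segment rewriting)

SRC = "frame.jpg"       # hypothetical MJPEG frame captured 90 degrees rotated

# Method 1: pixel-level rotation -- decode the frame, transform every pixel
# according to the rotation matrix, and re-encode the result to JPEG.
t0 = time.perf_counter()
with Image.open(SRC) as img:
    img.rotate(-90, expand=True).save("rotated_pixels.jpg")  # 90 deg clockwise
t_pixels = time.perf_counter() - t0

# Method 2: EXIF tagging -- leave the compressed pixel data untouched and only
# rewrite the APP1 segment with Orientation = 6 ("rotate 90 deg CW"),
# delegating the actual rotation to the viewer.
t0 = time.perf_counter()
exif_bytes = piexif.dump({"0th": {piexif.ImageIFD.Orientation: 6}})
piexif.insert(exif_bytes, SRC, "rotated_exif.jpg")
t_exif = time.perf_counter() - t0

print(f"pixel-level rotation: {t_pixels * 1000:.1f} ms")
print(f"EXIF tagging:         {t_exif * 1000:.1f} ms")
```

Under these assumptions, the EXIF-based method avoids decoding and re-encoding the frame altogether, which is consistent with its advantage growing at higher resolutions; its drawback is that the receiving viewer must honor the Orientation tag, which not all web browsers do.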