With YouTube's growing importance as a news platform, its recommendation system has come under increased scrutiny. Recognizing YouTube's recommendation system as a broadcaster of media, we explore the applicability of laws that require broadcasters to give important political, ideological, and social groups adequate opportunity to express themselves in a service's broadcast programming. We present audits as an important tool to enforce such laws and to ensure that a system operates in the public's interest. To examine whether YouTube is enacting certain biases, we collected video recommendations on political topics by following chains of ten recommendations per video. Our findings suggest that YouTube's recommendation system is enacting important biases: YouTube recommends increasingly popular but topically unrelated videos, and along recommendation chains the sadness evoked by the recommended videos decreases while the happiness increases. We discuss the strong popularity bias we identified and analyze the link between the popularity of content and emotions. We also discuss how audits empower researchers and civic hackers to monitor complex machine learning (ML)-based systems like YouTube's recommendation system.
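The chain-following data collection described above can be sketched as follows. This is a minimal illustration, not the paper's actual crawler: `get_recommendations` is a hypothetical stand-in for fetching a video's "up next" list, and the video identifiers are placeholders.

```python
def get_recommendations(video_id, n=5):
    """Hypothetical stand-in for fetching the 'up next' list of a video.
    A real audit would scrape or query the live YouTube page here."""
    return [f"{video_id}/rec{i}" for i in range(n)]

def follow_chain(seed_video, depth=10):
    """Follow a chain of recommendations: starting from a seed video,
    repeatedly move to the top recommended video, `depth` times."""
    chain = [seed_video]
    for _ in range(depth):
        recs = get_recommendations(chain[-1])
        if not recs:
            break
        chain.append(recs[0])  # always follow the first recommendation
    return chain

chain = follow_chain("seed_video_A")
print(len(chain))  # the seed plus ten recommendation hops
```

Repeating this walk for many political seed videos yields the recommendation chains on which popularity and emotion trends can then be measured.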