Recommendation algorithms profoundly shape users’ attention and information consumption on social media platforms. This study introduces a computational intervention aimed at mitigating two key algorithmic biases by influencing the recommendation process. We tackle interest bias, whereby algorithms create narrow information diets dominated by non-news and entertainment content, and ideological bias, whereby algorithms steer more strongly partisan users toward like-minded content. Employing a sock-puppet experiment (N = 8,600 sock puppets) alongside a month-long randomized experiment involving 2,142 frequent YouTube users, we investigate whether nudging the algorithm, by playing videos from verified and ideologically balanced news channels in the background, increases recommendations to and consumption of news. We additionally test whether providing balanced news input to the algorithm promotes diverse and cross-cutting news recommendations and consumption. We find that nudging the algorithm significantly and durably increases both recommendations to and consumption of news, and also minimizes ideological biases in recommendations and consumption, particularly among conservative users. Indeed, recommendations have stronger effects on users’ subsequent exposure than users’ exposure has on subsequent recommendations. In contrast, nudging users directly has no observable effect on news consumption. Increased news consumption has no effect on a range of survey outcomes (e.g., knowledge, participation, polarization, misperceptions), adding to the growing evidence that on-platform exposure has limited attitudinal effects. The intervention does not adversely affect user engagement on YouTube, showcasing its potential for real-world implementation. These findings underscore the influence wielded by platform recommender algorithms over users’ attention and information exposure.
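The sock-puppet nudge described above can be emulated with browser automation: a fresh browser profile plays videos from an ideologically balanced set of verified news channels in the background, then records the resulting homepage recommendations. The Python sketch below, using Playwright, is a minimal illustration under assumed parameters; the seed video URLs, dwell time, and CSS selector are all hypothetical stand-ins, not the study's actual instrumentation.

```python
"""Minimal sketch of a sock-puppet "algorithmic nudge" on YouTube.

A fresh browser context (no watch history) plays a balanced set of news
videos, then scrapes the homepage recommendations. All URLs, selectors,
and timing parameters are illustrative assumptions.
"""
from playwright.sync_api import sync_playwright

# Hypothetical seed videos from verified, ideologically balanced news channels.
BALANCED_NEWS_VIDEOS = [
    "https://www.youtube.com/watch?v=VIDEO_ID_LEFT",
    "https://www.youtube.com/watch?v=VIDEO_ID_CENTER",
    "https://www.youtube.com/watch?v=VIDEO_ID_RIGHT",
]

WATCH_SECONDS = 30  # assumed per-video dwell time for the nudge


def run_sock_puppet() -> list[str]:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        context = browser.new_context()  # fresh profile: no prior history
        page = context.new_page()

        # Nudge phase: visit each balanced news video so the recommender
        # observes news-watching behavior. (Autoplay may require an extra
        # click on the player in some configurations.)
        for url in BALANCED_NEWS_VIDEOS:
            page.goto(url)
            page.wait_for_timeout(WATCH_SECONDS * 1000)

        # Measurement phase: collect homepage recommendation links.
        page.goto("https://www.youtube.com/")
        page.wait_for_timeout(5000)
        links = page.eval_on_selector_all(
            "a#video-title-link",  # assumed selector; YouTube markup changes
            "els => els.map(e => e.href)",
        )
        browser.close()
        return links


if __name__ == "__main__":
    for rec in run_sock_puppet():
        print(rec)
```

A real deployment would repeat this loop across thousands of fresh profiles and compare recommendation composition against un-nudged control puppets; the sketch only shows the mechanics of one nudge-and-measure cycle.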