Recommender systems have become one of the most important tools for streaming and marketplace platforms in recent years. Their increased use has revealed clear bias and unfairness against minorities and underrepresented groups. This paper investigates the origin of these biases and this unfairness. To this end, it analyzes the demographic characteristics of a gold-standard dataset and its prediction performance when used with a range of recommender systems. In addition, this paper proposes Soft Matrix Factorization (SoftMF), which tries to balance the predictions made for different types of users in order to reduce the existing inequality. The experimental results show that these biases and this unfairness are not introduced by the different recommendation models, but rather stem from the socio-psychological and demographic characteristics of the dataset used.
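As background for the matrix-factorization models the paper builds on, the following minimal sketch trains a plain matrix-factorization recommender on a toy rating matrix and then reports the prediction error separately per user group, mirroring the kind of per-group performance analysis described above. The rating matrix, the group labels, and all hyperparameters are illustrative assumptions; this is not the paper's SoftMF formulation.

```python
import numpy as np

# Hypothetical sketch: plain matrix factorization trained with SGD,
# followed by a per-group error breakdown. All data and parameters
# below are made up for illustration; this is NOT the paper's SoftMF.

rng = np.random.default_rng(0)

# Toy user-item rating matrix (0 = unobserved rating).
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 0, 4],
    [0, 1, 5, 4],
], dtype=float)
group = np.array([0, 0, 1, 1, 1])  # hypothetical demographic group per user

n_users, n_items = R.shape
k = 2  # number of latent factors
P = 0.1 * rng.standard_normal((n_users, k))  # user factors
Q = 0.1 * rng.standard_normal((n_items, k))  # item factors

lr, reg = 0.01, 0.02  # learning rate and L2 regularization
obs = [(u, i) for u in range(n_users) for i in range(n_items) if R[u, i] > 0]

for epoch in range(200):
    for u, i in obs:
        err = R[u, i] - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

def group_mae(g):
    """Mean absolute error over observed ratings of users in group g."""
    errs = [abs(R[u, i] - P[u] @ Q[i]) for u, i in obs if group[u] == g]
    return sum(errs) / len(errs)

print(group_mae(0), group_mae(1))
```

Comparing `group_mae(0)` and `group_mae(1)` is one simple way to surface the per-group performance gaps that the fairness analysis in the paper is concerned with.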
INDEX TERMS Recommender systems, collaborative filtering, fairness, MovieLens.

SANTIAGO ALONSO received the B.S. degree in software engineering and the Ph.D. degree in computer science and artificial intelligence from the Universidad Politécnica de Madrid, in 2015. He is currently an Associate Professor with the Universidad Politécnica de Madrid, teaching in master's and bachelor's degree programs and working on advanced databases. His main research interests include natural computing (P systems), and he has also worked on genetic algorithms. His current research interests include machine learning, data analysis, and artificial intelligence.