Evolutionary games on networks traditionally involve the same game at each interaction. Here we depart from this assumption by considering mixed games, where the game played at each interaction is drawn uniformly at random from a set of two different games. While in well-mixed populations the random mixture of the two games is always equivalent to the average single game, in structured populations this is not always the case. We show that the outcome is, in fact, strongly dependent on the distance separating the two games in parameter space. Effectively, this distance introduces payoff heterogeneity, and the average game is recovered only if the heterogeneity is small. For higher levels of heterogeneity the deviation from the average game grows, which often promotes cooperation. The presented results support preceding research that highlights the favorable role of heterogeneity regardless of its origin, and they also emphasize the importance of the population structure in amplifying facilitators of cooperation.
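To make the mixed-game setup concrete, the following is a minimal sketch of how such a simulation could be organized: at every pairwise interaction one of two payoff matrices is drawn uniformly at random, and the result can be compared against runs that use the single average game. The specific payoff values, the square lattice with periodic boundaries, the Fermi imitation rule with noise K, the system size, and the number of Monte Carlo steps are all illustrative assumptions not specified in the text above, not the authors' exact protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two games with R = 1, P = 0 and differing (T, S) values (illustrative choices);
# at every payoff evaluation one of the two matrices is picked uniformly at random.
# G_avg is the corresponding single "average game".
G1 = np.array([[1.0, -0.5],    # row strategy 0: cooperator
               [1.5,  0.0]])   # row strategy 1: defector
G2 = np.array([[1.0,  0.3],
               [1.1,  0.0]])
G_avg = 0.5 * (G1 + G2)

L = 50                                   # lattice side (assumed size)
strat = rng.integers(0, 2, size=(L, L))  # 0 = cooperator, 1 = defector
K = 0.1                                  # Fermi noise parameter (assumed value)

def payoff(x, y, mixed=True):
    """Accumulated payoff of site (x, y) against its four neighbours."""
    total = 0.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = (x + dx) % L, (y + dy) % L
        # Mixed game: draw G1 or G2 with equal probability; otherwise use the average game.
        G = (G1 if rng.random() < 0.5 else G2) if mixed else G_avg
        total += G[strat[x, y], strat[nx, ny]]
    return total

def monte_carlo_step(mixed=True):
    """One Monte Carlo step: L*L imitation attempts via the Fermi rule."""
    for _ in range(L * L):
        x, y = rng.integers(0, L, size=2)
        dx, dy = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(0, 4)]
        nx, ny = (x + dx) % L, (y + dy) % L
        p_x, p_n = payoff(x, y, mixed), payoff(nx, ny, mixed)
        # Player (x, y) imitates the neighbour with probability given by the Fermi function.
        if rng.random() < 1.0 / (1.0 + np.exp((p_x - p_n) / K)):
            strat[x, y] = strat[nx, ny]

for step in range(200):  # short relaxation, for illustration only
    monte_carlo_step(mixed=True)
print("cooperator fraction:", 1.0 - strat.mean())
```

Re-running the loop with `mixed=False` gives the average-game baseline; in this sketch payoffs are redrawn at every evaluation, whereas whether both interacting players see the same randomly drawn game is an implementation detail not fixed by the summary above.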