We show that the observed mass-to-light (M/L) ratio of galaxy clusters increases with cluster temperature, as expected from cosmological simulations. Contrary to previous observational suggestions, we find a mild but robust increase of M/L from poor (T ∼ 1–2 keV) to rich (T ∼ 12 keV) clusters; over this range, the mean M/L increases by a factor of about 2. The best-fit relation exhibits a large scatter. This trend confirms predictions from cosmological simulations that the richest clusters are antibiased, with a higher ratio of mass per unit light than average. The antibias increases with cluster temperature. The effect is caused by the relatively older age of the high-density clusters, in which the light has declined more than average since their earlier formation time.
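As a rough illustration only (assuming a simple power-law form, M/L ∝ T^α, and taking T ∼ 1.5 keV as representative of the poor-cluster end; neither is the paper's fitted relation), the quoted factor-of-2 increase between the poor and rich ends corresponds to an index of roughly 0.3:
\[
\frac{(M/L)_{\mathrm{rich}}}{(M/L)_{\mathrm{poor}}} \approx 2
\;\Rightarrow\;
\alpha \approx \frac{\ln 2}{\ln\!\left(12\,\mathrm{keV}/1.5\,\mathrm{keV}\right)}
\approx \frac{0.69}{2.08} \approx 0.33 .
\]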