Screen resolution and network conditions are among the main objective factors impacting user experience, in particular for video streaming applications. Terminals, for their part, feature increasingly advanced characteristics, resulting in different network requirements for a good visual experience [1]. Previous studies have attempted to link the MOS (Mean Opinion Score) to video bit rate for different screen types (e.g., CIF, QCIF, and HD) [2]. We leverage such studies and formulate a QoE-driven resource allocation problem to pinpoint the optimal bandwidth allocation that maximizes the QoE (Quality of Experience) over all users of a provider located behind the same bottleneck link, while accounting for the characteristics of the screens they use for video playout. For our optimization problem, QoE functions are built by curve fitting on data sets capturing the relationship between MOS, screen characteristics, and bandwidth requirements. We propose a simple heuristic based on Lagrangian relaxation and the KKT (Karush-Kuhn-Tucker) conditions for a subset of constraints. Numerical simulations show that the proposed heuristic increases overall QoE by up to 20% compared to an allocation with TCP-like strategies implementing max-min fairness. Finally, we use an MPEG/DASH implementation in ns-3 and show that coupling our approach with a rate adaptation algorithm (e.g., [3]) can help increase QoE while reducing both the number of resolution switches and the number of playback interruptions.
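To make the allocation idea concrete, the following is a minimal sketch, not the paper's actual method: it assumes hypothetical logarithmic MOS fits of the form MOS_i(b) = alpha_i + beta_i * log(b) per screen type (the real QoE functions are obtained by curve fitting on MOS data sets), hypothetical parameter values in SCREEN_FITS, and a single bottleneck capacity constraint. Under those assumptions, the KKT stationarity condition gives a closed-form allocation proportional to each user's beta.

```python
import numpy as np

# Hypothetical (alpha, beta) parameters per screen type; in the paper these
# would come from curve fitting MOS-vs-bandwidth data, not from these values.
SCREEN_FITS = {
    "QCIF": (1.0, 0.45),
    "CIF":  (0.8, 0.60),
    "HD":   (0.2, 0.90),
}

def allocate(capacity_mbps, users):
    """Allocate bottleneck bandwidth to maximize the sum of concave QoE
    functions MOS_i(b) = alpha_i + beta_i * log(b).

    KKT stationarity requires beta_i / b_i = lambda for every user with
    b_i > 0; enforcing sum_i b_i = capacity yields
    b_i = capacity * beta_i / sum_j beta_j.
    """
    betas = np.array([SCREEN_FITS[u][1] for u in users])
    return capacity_mbps * betas / betas.sum()

if __name__ == "__main__":
    users = ["QCIF", "CIF", "HD", "HD"]
    alloc = allocate(20.0, users)  # e.g., a 20 Mb/s bottleneck link
    for u, b in zip(users, alloc):
        alpha, beta = SCREEN_FITS[u]
        mos = min(5.0, alpha + beta * np.log(b))  # MOS capped at 5
        print(f"{u}: {b:.2f} Mb/s, MOS ~ {mos:.2f}")
```

The closed form only holds for this assumed logarithmic shape; with other fitted QoE curves, or with additional per-user constraints, the paper's Lagrangian-relaxation heuristic would instead search for the multiplier numerically.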