Touch is a ubiquitous way of interacting with smartphones. However, building mobile user interfaces (UIs) where touch interactions work well for all users is a difficult problem, because users have different motor abilities, skills, and even preferences [51]. For example, consider a right-handed user who wants to access a menu item on the left side of the screen. On larger screens, this menu item is more difficult for the user to touch with their right hand than it would be for a left-handed user.

UI adaptivity is a promising approach to improving touch interactions, because it allows systems to dynamically personalize the UI and tailor it to each user's needs. But for UI adaptivity to be practically useful for real-world apps, it must support two goals. First, the technique should be generally useful across a broad range of existing mobile applications. Second, the technique should apply adaptations in a way that respects the design intentions of the original applications. In other words, we expect that drastic UI adaptations are likely to make the user interface less familiar to the user and disruptive to the overall user experience [7].

To operationalize these goals, we built a system, Reflow, which automatically applies small UI adaptations, called refinements, to existing mobile app screens to improve touch efficiency. Toward the first goal of supporting broad applicability, Reflow is entirely pixel-based: the system does not need knowledge of an application's dependencies or view hierarchy to make its UI adaptations. Toward the second goal of respecting design intent and minimally disrupting the user experience, Reflow incorporates the theory of microstrategies in its model [12, 13, 17].
Microstrategies suggest that even small, principled adaptations to the user interface can significantly improve task efficiency, particularly over cumulative usage, and we postulate that the same principle applies when personalizing touch-based mobile applications.

Reflow supports personalized optimization by constructing a spatial map from usage data that identifies difficult-to-access areas of the screen (e.g., elements on the edges of the screen require users to reach and reposition their hand to select them). Reflow then (i) automatically detects the UI elements contained on the screen, (ii) uses a machine learning model to optimize the UI layout to better support the difficulty map, and (iii) re-renders the existing UI pixels to match the new layout (Figure 1). Reflow improves on existing approaches because it works with a range of existing mobile applications and enables an end-to-end pipeline from layout optimization to re-rendering application screens.

To evaluate Reflow, we first conducted a study with 10 participants, in which we found that it improved interaction speed by 9% on average, and by up to 17% for some UIs. Based on lessons learned, we further improved the model by detecting and applying an additional set of UI constraints (relative positioning, alignment). We then conducted a heuris...