With the overarching goal of developing user-centric Virtual Reality (VR) systems, a new wave of studies aimed at understanding how users interact in VR environments has recently emerged. Despite these intense efforts, however, the current literature still lacks a framework to fully interpret and predict users' trajectories while navigating VR scenes. This work advances the state of the art in both the study of users' behaviour in VR and user-centric system design. In more detail, we complement current datasets by presenting a publicly available dataset of navigation trajectories acquired for heterogeneous omnidirectional videos and different viewing platforms, namely head-mounted display, tablet, and laptop. We then present an exhaustive analysis of the collected data to better understand navigation in VR across users, across content, and, for the first time, across viewing platforms. The novelty lies in the user-affinity metric, proposed in this work to quantify the similarity among users navigating within the same content. The analysis reveals useful insights into the effect of device and content on navigation, which are valuable considerations from the system design perspective. As a case study of the importance of studying users' behaviour when designing VR systems, we finally propose a user-centric server optimisation. We formulate an integer linear program that seeks the best set of omnidirectional content to store, minimising encoding and storage costs while maximising the users' experience. The problem is posed taking into account network dynamics and the type of video content, but also the interactivity of the user population. Experimental results show that our solution outperforms common company recommendations in terms of experienced quality as well as encoding and storage costs, achieving savings of up to 70%. More importantly, we highlight a strong correlation between the storage cost and the user-affinity metric, showing the impact of the latter on system architecture design.
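For illustration only, the sketch below shows one schematic form such a storage-optimisation ILP could take; the notation (candidate representations $\mathcal{R}$, per-video subsets $\mathcal{R}_v$, binary selection variables $x_r$, storage and encoding costs $s_r$ and $e_r$, user classes $\mathcal{U}$ with weights $p_u$ and expected quality $q_{u,r}$, and trade-off parameter $\lambda$) is hypothetical and does not reproduce the exact formulation proposed in the paper.
\begin{align}
  \min_{x_r \in \{0,1\}} \quad & \sum_{r \in \mathcal{R}} \left(s_r + e_r\right) x_r \;-\; \lambda \sum_{u \in \mathcal{U}} \sum_{r \in \mathcal{R}} p_u\, q_{u,r}\, x_r \\
  \text{s.t.} \quad & \sum_{r \in \mathcal{R}} s_r\, x_r \le S_{\max}, \qquad \sum_{r \in \mathcal{R}_v} x_r \ge 1 \quad \forall v \in \mathcal{V},
\end{align}
where the objective trades off storage and encoding cost against the expected quality delivered to the user population, the first constraint bounds the total storage budget $S_{\max}$, and the second guarantees that at least one representation of every video $v \in \mathcal{V}$ is stored.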