The primary platform for personal computing has evolved steadily over the years, from desktops to mobile devices, and now to the emerging trend of wearable computing. Wearable computing is expected to become integral to consumer electronics, with a near-eye display often serving as the primary interface between the device and the user. To prepare for this shift, several limitations of current-generation near-eye displays must be addressed, including their inability to provide fully natural focus cues for all users, which can lead to visual discomfort. A core cause of these limitations lies in the optics of the systems themselves: current displays simply cannot reproduce the changes in focus that accompany natural vision. In addition, their form factor often prevents users with refractive errors from wearing their corrective eyewear. With two prototype near-eye displays employing computational optics, we demonstrate how these issues can be overcome using display modes that adapt to the user. These prototypes combine focus-tunable lenses, mechanically actuated displays, and gaze-tracking technology to customize the experience to each user: they correct common refractive errors and provide natural focus cues by dynamically adjusting the focal depth based on where the user looks in the scene. These and other advances in computational optics hint at a future in which some users experience better vision in the virtual world than in the real one.
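
To make the gaze-contingent adjustment concrete, the following minimal sketch (in Python) shows one way such a loop could be structured: a gaze tracker reports where the user is looking, the renderer's depth buffer supplies the scene depth at that point, and the focus-tunable lens is driven to place the virtual image at that depth, offset by the user's prescription. All interfaces here (`eye_tracker.gaze_point`, `depth_buffer.depth_at`, `tunable_lens.set_power_diopters`) are hypothetical placeholders, not the prototypes' actual APIs, and the thin-lens vergence arithmetic is a deliberate simplification.

```python
# A minimal sketch of a gaze-contingent adaptive-focus loop. This is NOT
# the authors' implementation: the hardware interfaces used below are
# hypothetical stand-ins for a gaze tracker, a renderer's depth buffer,
# and a focus-tunable lens driver.

def required_lens_power(fixation_distance_m: float,
                        screen_distance_m: float,
                        refractive_correction_d: float = 0.0) -> float:
    """Optical power (in diopters) that re-images a screen fixed at
    `screen_distance_m` to a virtual image at the viewer's fixation
    distance, offset by a per-user prescription (thin-lens approximation,
    vergences expressed in diopters, i.e., 1/meters)."""
    screen_vergence = 1.0 / screen_distance_m      # diopters
    fixation_vergence = 1.0 / fixation_distance_m  # diopters
    return screen_vergence - fixation_vergence + refractive_correction_d


def update_focus(eye_tracker, depth_buffer, tunable_lens,
                 screen_distance_m: float, refractive_correction_d: float):
    # 1. Ask the gaze tracker where on the display the user is looking.
    gx, gy = eye_tracker.gaze_point()
    # 2. Read the rendered scene depth (meters) at the gaze point.
    fixation_m = depth_buffer.depth_at(gx, gy)
    # 3. Drive the focus-tunable lens so the virtual image lands at that
    #    depth, shifted by the user's refractive correction.
    power = required_lens_power(fixation_m, screen_distance_m,
                                refractive_correction_d)
    tunable_lens.set_power_diopters(power)


if __name__ == "__main__":
    # Worked numbers only: a screen 5 cm from the lens needs 20 D to
    # collimate; fixating an object rendered at 2 m subtracts 0.5 D; a
    # -1.5 D (myopic) correction subtracts a further 1.5 D.
    print(required_lens_power(2.0, 0.05, -1.5))  # 20 - 0.5 - 1.5 = 18.0 D
```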