From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current-generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better-quality vision in the virtual world than in the real one.

virtual reality | augmented reality | 3D vision | vision correction | computational optics

Emerging virtual reality (VR) and augmented reality (AR) systems have applications that span entertainment, education, communication, training, behavioral therapy, and basic vision research. In these systems, a user primarily interacts with the virtual environment through a near-eye display. Since the invention of the stereoscope almost 180 years ago (1), significant developments have been made in display electronics and computer graphics (2), but the optical design of stereoscopic near-eye displays remains almost unchanged from the Victorian age. In front of each eye, a small physical display is placed behind a magnifying lens, creating a virtual image at some fixed distance from the viewer (Fig. 1A). Small differences in the images displayed to the two eyes can create a vivid perception of depth, called stereopsis.

However, this simple optical design lacks a critical aspect of 3D vision in the natural environment: changes in stereoscopic depth are also associated with changes in focus. When viewing a near-eye display, users' eyes change their vergence angle to fixate objects at a range of stereoscopic depths, but to focus on the virtual image, the crystalline lenses of the eyes must accommodate to a single fixed distance (Fig. 2A). For users with normal vision, this asymmetry creates an unnatural condition known as the vergence-accommodation conflict (3, 4). Symptoms associated with this conflict include double vision (diplopia), compromised visual clarity, visual discomfort, and fatigue (3, 5). Moreover, a lack of accurate focus also removes a cu...
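To make the vergence-accommodation conflict concrete, the short sketch below estimates both demands in diopters for a conventional fixed-focus magnifier design, using the thin-lens relation for the virtual image. It is an illustrative aid rather than a description of the prototypes in this work; the focal length, display-to-lens distance, and fixation distances are assumed values chosen only for the example.

```python
# Illustrative sketch (not the prototypes' parameters): quantifying the
# vergence-accommodation conflict in a fixed-focus magnifier design.
# All numerical values below are assumed example values.

def accommodation_demand_diopters(screen_dist_m: float, focal_length_m: float) -> float:
    """Thin-lens estimate of the (fixed) accommodation demand.

    With the microdisplay placed just inside the magnifier's focal length,
    the virtual image forms at 1 / (1/d_s - 1/f) meters in front of the eye;
    the reciprocal of that distance is the accommodation demand in diopters.
    """
    return 1.0 / screen_dist_m - 1.0 / focal_length_m

def vergence_demand_diopters(fixation_dist_m: float) -> float:
    """Vergence demand set by the stereoscopic depth of the fixated object."""
    return 1.0 / fixation_dist_m

if __name__ == "__main__":
    f = 0.045     # magnifier focal length in meters (assumed)
    d_s = 0.042   # display-to-lens distance in meters (assumed)
    accom = accommodation_demand_diopters(d_s, f)   # fixed for the whole scene

    # Virtual objects at a range of stereoscopic distances (assumed examples).
    for d_fix in (0.3, 0.5, 1.0, 2.0, 6.0):
        verg = vergence_demand_diopters(d_fix)
        conflict = verg - accom                     # mismatch in diopters
        print(f"object at {d_fix:4.1f} m: vergence {verg:5.2f} D, "
              f"accommodation {accom:5.2f} D, conflict {conflict:+5.2f} D")
```

Because the accommodation demand is set once by the optics while the vergence demand tracks the stereoscopic depth of whatever the user fixates, the mismatch grows as virtual content departs from the fixed virtual image distance; adaptive-focus approaches aim to drive that mismatch toward zero for the fixated object.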