Fig. 1. The IlluminatedFocus technique optically defocuses real-world appearances in a spatially varying manner, regardless of the distance from the user's eyes to the observed real objects, enabling various vision augmentation applications. (a) The proposed system consists of focal sweep eyeglasses (two Electrically Focus-Tunable Lenses, ETLs) and a high-speed projector. (b) Experimental proof of the proposed technique. (b-1) Experimental setup. Four objects (A, B, C, and D) are placed in front of the projector and the ETL. A camera stands in for the user's eye. (b-2, b-3, b-4) The objects are illuminated by the projector at different timings while the camera's focal length is periodically modulated by the ETL. As indicated by the yellow arrows, the objects that should appear focused (A, C, and D) are illuminated when they are in focus, and the object that should appear blurred (B) is illuminated when it is out of focus. (b-5) When the frequency of the focal sweep is higher than the critical fusion frequency (CFF), these appearances are perceived as integrated. The resulting appearance (only B is blurred) cannot be achieved by normal lens systems. Note that the brightness of (b-2) to (b-5) has been adjusted for better visibility.

Abstract: Aiming at novel vision augmentation experiences, this paper proposes the IlluminatedFocus technique, which spatially defocuses real-world appearances regardless of the distance from the user's eyes to the observed real objects. With the proposed technique, one part of a real object appears blurred while the fine details of another part at the same distance remain visible. We apply Electrically Focus-Tunable Lenses (ETLs) as eyeglasses and a synchronized high-speed projector as the illumination of the real scene. We periodically modulate the focal lengths of the glasses (focal sweep) at more than 60 Hz so that the wearer cannot perceive the modulation.
A part of the scene that should appear focused is illuminated by the projector when it is in focus for the user's eyes, while a part that should appear blurred is illuminated when it is out of focus. As the basis of our spatial focus control, we build mathematical models that predict the range of distances from the ETL within which real objects become blurred on the user's retina. Based on this blur range, we discuss a design guideline for effective illumination timing and focal sweep range. We also model the apparent size change of a real scene caused by the focal length modulation, which leads to an undesirable visible seam between focused and blurred areas. We solve this unique problem by gradually blending the two areas. Finally, we demonstrate the feasibility of our proposal by implementing various vision augmentation applications.

Index Terms: Vision augmentation, spatial defocusing, depth-of-field, focal sweep, high-speed projection, spatial augmented reality

• Daisuke Iwai is with Osaka University and JST, PRESTO.
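The core scheduling idea above, flashing each object only when the sweeping lens reaches (or is far from) that object's in-focus optical power, can be sketched as follows. This is a minimal illustrative model, not the paper's actual implementation: the triangular sweep waveform, the 1-5 diopter power range, and the thin-lens focus condition (an object at distance d is in focus when the swept power equals 1/d) are all assumptions made for this sketch.

```python
# Illustrative sketch (not the paper's implementation): schedule projector
# flashes against a triangular focal sweep of the ETL's optical power.
# Assumptions: thin-lens focus condition (in focus when power == 1/distance),
# distances in meters, optical powers in diopters.

SWEEP_HZ = 60.0           # sweep frequency; must exceed the CFF to be imperceptible
P_MIN, P_MAX = 1.0, 5.0   # swept power range [diopters]: focus from 1.0 m to 0.2 m

def focused_power(t):
    """Optical power of the ETL at time t for a triangular sweep."""
    phase = (t * SWEEP_HZ) % 1.0                          # position in period [0, 1)
    tri = 2 * phase if phase < 0.5 else 2 * (1 - phase)   # triangle wave in [0, 1]
    return P_MIN + (P_MAX - P_MIN) * tri

def illumination_time(distance, sharp=True):
    """Time within one sweep period to flash an object at `distance` so it
    appears sharp (flashed in focus) or blurred (flashed at the sweep
    extreme farthest from its in-focus power)."""
    target = 1.0 / distance                               # in-focus power [diopters]
    if not sharp:
        # choose the sweep extreme farthest from the in-focus power
        target = P_MIN if abs(target - P_MIN) > abs(target - P_MAX) else P_MAX
    # invert the rising half of the triangle wave (phase in [0, 0.5])
    phase = 0.5 * (target - P_MIN) / (P_MAX - P_MIN)
    return phase / SWEEP_HZ

# An object at 0.5 m is flashed when the sweep passes 1/0.5 = 2 diopters;
# to appear blurred instead, it is flashed near the 5-diopter extreme.
t_sharp = illumination_time(0.5, sharp=True)
t_blur = illumination_time(0.5, sharp=False)
```

In a real system the flash times would be quantized to the high-speed projector's frame slots and the sweep waveform measured rather than assumed, but the scheduling logic is the same.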