Near-eye light field displays based on integral imaging through a microlens array offer attractive features, such as ultra-compact volume and freedom from the vergence-accommodation conflict, to head-mounted displays with virtual or augmented reality functions. To enable optimal design and analysis of such systems, it is desirable to have a physical model that incorporates all factors affecting image formation, including diffraction, aberration, defocusing, and pixel size. Therefore, in this study, using the fundamental Huygens-Fresnel principle and the Arizona eye model with adjustable accommodation, we develop an image formation model that can numerically calculate the retinal light field image with near-perfect accuracy, and we verify it experimentally with a prototype system. Next, based on this model, the visual resolution is analyzed for different fields of view (FOVs). As a result, a rapid resolution decay with increasing FOV, caused by off-axis aberration, is demonstrated. Finally, resolution variations as a function of image depth are analyzed for systems with different central depth planes. Significantly, the resolution decay is revealed to plateau when the image depth is large enough, in contrast to real-image-type light field displays.
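To illustrate the kind of wave-optics calculation the abstract refers to, the sketch below numerically propagates a sampled complex field by direct summation of spherical wavelets, i.e., a discretized Huygens-Fresnel diffraction integral. It is only a minimal illustration under assumed parameters: the function `huygens_fresnel_propagate`, the grid sizes, the 550 nm wavelength, the pixel aperture, and the 2 mm propagation distance are all hypothetical choices for demonstration, not the authors' full model, which additionally includes the microlens array, the Arizona eye model with accommodation, aberrations, and pixel-size effects.

```python
import numpy as np

# Minimal sketch (assumed parameters): propagate a sampled field from a source
# plane to an observation plane by direct summation of Huygens-Fresnel wavelets.

def huygens_fresnel_propagate(u_src, x_src, y_src, x_obs, y_obs, z, wavelength):
    """Sum a spherical wavelet from every source sample at each observation
    point, with the obliquity factor z/r (Huygens-Fresnel principle)."""
    k = 2.0 * np.pi / wavelength
    dx = x_src[0, 1] - x_src[0, 0]            # source sample pitch in x
    dy = y_src[1, 0] - y_src[0, 0]            # source sample pitch in y
    u_obs = np.zeros(x_obs.shape, dtype=complex)
    for idx in np.ndindex(x_obs.shape):        # loop over observation samples
        r = np.sqrt((x_obs[idx] - x_src) ** 2 +
                    (y_obs[idx] - y_src) ** 2 + z ** 2)
        u_obs[idx] = np.sum(u_src * np.exp(1j * k * r) / r * (z / r)) \
                     * dx * dy / (1j * wavelength)
    return u_obs

# Illustrative use: diffraction of a single small display pixel.
wavelength = 550e-9                             # green light (assumed)
n = 64
x = np.linspace(-50e-6, 50e-6, n)               # 100 um source window (assumed)
xs, ys = np.meshgrid(x, x)
u_pixel = ((np.abs(xs) < 4e-6) & (np.abs(ys) < 4e-6)).astype(complex)  # ~8 um pixel
xo, yo = np.meshgrid(np.linspace(-200e-6, 200e-6, n),
                     np.linspace(-200e-6, 200e-6, n))
u_out = huygens_fresnel_propagate(u_pixel, xs, ys, xo, yo, 2e-3, wavelength)
intensity = np.abs(u_out) ** 2                  # diffracted intensity pattern
```

In a full retinal image formation model, this propagation step would be cascaded through the microlens array and the eye model rather than applied to free space alone; the direct summation shown here is chosen only for clarity over the FFT-based propagators typically used for speed.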