The sensitivity limits of space telescopes are imposed by uncalibrated errors in the point spread function, photon noise, background light, and detector sensitivity. These are typically calibrated with specialized wavefront sensor hardware and with flat fields obtained on the ground or with calibration sources, but these approaches leave vulnerabilities to residual time-varying or non-common-path aberrations and to variations in detector conditions. It is therefore desirable to infer these quantities from science data alone, despite the prohibitively high-dimensional problems posed by phase retrieval and pixel-level calibration. We introduce a new Python package for physical optics simulation, ∂Lux, which uses the machine learning framework JAX to achieve graphics processing unit (GPU) acceleration and automatic differentiation (autodiff), and we apply this to simulating astronomical imaging. In this first of a series of papers, we show that gradient descent enabled by autodiff can be used to simultaneously perform phase retrieval and detector-sensitivity calibration, scaling efficiently to inferring millions of parameters. This new framework enables high-dimensional optimization and inference for data analysis and hardware design in astronomy and beyond, which we explore in subsequent papers in this series.
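To illustrate the core idea, the following is a minimal sketch, not the ∂Lux API, of how JAX autodiff enables joint gradient-descent inference of a wavefront phase map and per-pixel detector sensitivities. The forward model, array sizes, loss, and step size here are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def forward_model(phase, flat_field):
    """Toy imaging model: Fraunhofer propagation of an aberrated pupil,
    modulated by a per-pixel detector sensitivity (flat field)."""
    pupil = jnp.exp(1j * phase)                       # unit-amplitude aberrated pupil
    psf = jnp.abs(jnp.fft.fftshift(jnp.fft.fft2(pupil)))**2
    return flat_field * psf / psf.sum()               # detector response times normalised PSF

def loss(params, data):
    """Simple least-squares objective between model and data."""
    model = forward_model(params["phase"], params["flat_field"])
    return jnp.sum((model - data)**2)

# Synthetic "truth" and data, purely for demonstration.
n = 64
true_phase = 0.1 * jax.random.normal(jax.random.PRNGKey(0), (n, n))
true_flat = 1.0 + 0.05 * jax.random.normal(jax.random.PRNGKey(1), (n, n))
data = forward_model(true_phase, true_flat)

# Initial guess: small random phase (to break symmetry), uniform sensitivity.
params = {
    "phase": 0.01 * jax.random.normal(jax.random.PRNGKey(2), (n, n)),
    "flat_field": jnp.ones((n, n)),
}

# Autodiff propagates gradients through the entire optical and detector model,
# so both parameter maps (2 * n**2 values) are optimized simultaneously.
grad_fn = jax.jit(jax.grad(loss))
lr = 1e2                                              # illustrative step size
for _ in range(100):
    grads = grad_fn(params, data)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

Because the gradient is computed through the full forward model, the same loop scales to millions of parameters on a GPU; in practice one would substitute a noise-aware likelihood and a more sophisticated optimizer.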