Background
Voxel‐based analysis (VBA) for population-level radiotherapy (RT) outcomes modeling requires topology-preserving inter‐patient deformable image registration (DIR) that preserves tumors on moving images while avoiding unrealistic deformations caused by tumors occurring on fixed images.

Purpose
We developed a tumor‐aware recurrent registration (TRACER) deep learning (DL) method and evaluated its suitability for VBA.

Methods
TRACER consists of encoder layers implemented with a stacked 3D convolutional long short-term memory network (3D‐CLSTM), followed by decoder and spatial transform layers that compute a dense deformation vector field (DVF). Multiple CLSTM steps are used to compute a progressive sequence of deformations. Input conditioning was applied by including tumor segmentations with the 3D image pairs as input channels. Bidirectional tumor rigidity, image similarity, and deformation smoothness losses were used to optimize the network in an unsupervised manner. TRACER and multiple DL methods were trained with 204 3D computed tomography (CT) image pairs from patients with lung cancers (LC) and evaluated using (a) Dataset I (N = 308 pairs) with DL-segmented LCs, (b) Dataset II (N = 765 pairs) with manually delineated LCs, and (c) Dataset III with 42 LC patients treated with RT.

Results
TRACER accurately aligned normal tissues. It best preserved tumors, indicated by the smallest tumor volume differences of 0.24%, 0.40%, and 0.13% and mean square errors in CT intensities of 0.005, 0.005, and 0.004, computed between the original and resampled moving-image tumors for Datasets I, II, and III, respectively. It also produced the smallest difference in planned RT tumor dose between the original and resampled moving images: 0.01 Gy and 0.013 Gy when using a female and a male reference, respectively.

Conclusions
TRACER is a suitable method for inter-patient registration involving LCs occurring in both fixed and moving images and is applicable to voxel-based analysis methods.
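The abstract gives only an architectural description, so the following is a minimal PyTorch sketch of the general pattern it outlines: an input-conditioned network predicts incremental DVFs over several recurrent steps, and the moving image is re-warped with the accumulated field by a spatial transform. All names (`warp`, `recurrent_register`, `step_net`), the simple additive accumulation of fields, and the single convolution standing in for the 3D-CLSTM encoder-decoder are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def warp(moving, dvf):
    """Warp a volume of shape (B, C, D, H, W) with a displacement field
    (B, 3, D, H, W) given in voxel units, using trilinear interpolation."""
    b, _, d, h, w = moving.shape
    # Identity sampling grid in normalized [-1, 1] coordinates: (B, D, H, W, 3).
    theta = torch.eye(3, 4, device=moving.device).unsqueeze(0).repeat(b, 1, 1)
    grid = F.affine_grid(theta, list(moving.shape), align_corners=True)
    # Convert voxel displacements to normalized coordinates, assuming channel
    # order (x, y, z) in the DVF to match grid_sample's convention.
    disp = torch.stack([
        2.0 * dvf[:, 0] / max(w - 1, 1),
        2.0 * dvf[:, 1] / max(h - 1, 1),
        2.0 * dvf[:, 2] / max(d - 1, 1),
    ], dim=-1)
    return F.grid_sample(moving, grid + disp, mode="bilinear", align_corners=True)

def recurrent_register(moving, fixed, tumor_mask, step_net, num_steps=3):
    """Predict a sequence of incremental DVFs and re-warp the moving image at
    each step, loosely mirroring TRACER's multiple CLSTM steps. `step_net`
    stands in for the CLSTM encoder-decoder; fields are accumulated by simple
    addition rather than true composition, which is a simplification."""
    total_dvf = torch.zeros(moving.shape[0], 3, *moving.shape[2:],
                            device=moving.device)
    warped = moving
    for _ in range(num_steps):
        # Input conditioning: image pair plus tumor segmentation as channels.
        inc = step_net(torch.cat([warped, fixed, tumor_mask], dim=1))
        total_dvf = total_dvf + inc
        warped = warp(moving, total_dvf)
    return warped, total_dvf

# Toy usage with a single convolution as a stand-in for the real network.
if __name__ == "__main__":
    net = torch.nn.Conv3d(3, 3, kernel_size=3, padding=1)
    moving = torch.rand(1, 1, 32, 64, 64)
    fixed = torch.rand(1, 1, 32, 64, 64)
    tumor = (torch.rand(1, 1, 32, 64, 64) > 0.99).float()
    warped, dvf = recurrent_register(moving, fixed, tumor, net)
```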
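Likewise, the exact loss formulations are not specified in the abstract. The sketch below shows one plausible composition of image similarity, deformation smoothness, and a tumor-rigidity term restricted to the tumor mask; the MSE similarity, first-order smoothness penalty, mask-weighted rigidity surrogate, and all weights are assumptions for illustration only, and the paper's bidirectional rigidity formulation is not reproduced here.

```python
import torch

def spatial_gradients(dvf):
    """First-order finite differences of a DVF of shape (B, 3, D, H, W)."""
    dz = dvf[:, :, 1:, :, :] - dvf[:, :, :-1, :, :]
    dy = dvf[:, :, :, 1:, :] - dvf[:, :, :, :-1, :]
    dx = dvf[:, :, :, :, 1:] - dvf[:, :, :, :, :-1]
    return dz, dy, dx

def smoothness_loss(dvf):
    """Penalize large spatial variation of the deformation field."""
    return sum(g.pow(2).mean() for g in spatial_gradients(dvf))

def tumor_rigidity_loss(dvf, tumor_mask):
    """Discourage local distortion of the tumor by penalizing DVF gradients
    only where the tumor mask (B, 1, D, H, W) is non-zero."""
    dz, dy, dx = spatial_gradients(dvf)
    mz = tumor_mask[:, :, 1:, :, :]
    my = tumor_mask[:, :, :, 1:, :]
    mx = tumor_mask[:, :, :, :, 1:]
    eps = 1e-6
    return ((dz.pow(2) * mz).sum() / (mz.sum() + eps)
            + (dy.pow(2) * my).sum() / (my.sum() + eps)
            + (dx.pow(2) * mx).sum() / (mx.sum() + eps))

def registration_loss(warped_moving, fixed, dvf, tumor_mask,
                      w_sim=1.0, w_smooth=0.1, w_rigid=1.0):
    """Composite unsupervised loss; the weights are placeholders."""
    similarity = torch.nn.functional.mse_loss(warped_moving, fixed)
    return (w_sim * similarity
            + w_smooth * smoothness_loss(dvf)
            + w_rigid * tumor_rigidity_loss(dvf, tumor_mask))
```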