Purpose: Sparse-data computed tomography (CT) arises frequently in applications such as breast tomosynthesis, C-arm CT, on-board four-dimensional cone-beam CT (4D CBCT), and industrial CT. However, sparse-data image reconstruction remains challenging because the data are highly undersampled. This work develops a data-driven image reconstruction method for sparse-data CT using deep neural networks (DNN).

Methods: The new method, called AirNet, is designed to combine the benefits of analytical reconstruction (AR), iterative reconstruction (IR), and DNNs. It is built upon fused analytical and iterative reconstruction (AIR), which synergizes AR and IR via the optimization framework of modified proximal forward-backward splitting (PFBS). By unrolling PFBS into alternating IR updates of CT data fidelity and DNN regularization with residual learning, AirNet employs an AR operator such as filtered back-projection (FBP) within the data-fidelity step, introduces dense connectivity into the DNN regularization, and learns the PFBS coefficients and DNN parameters that minimize the loss function during the training stage; the trained AirNet is then used for end-to-end image reconstruction.

Results: A CT atlas of 100 prostate scans was used to validate AirNet against state-of-the-art DNN-based postprocessing and image reconstruction methods. The validation loss of AirNet decreased fastest, owing to the fast convergence inherited from AIR. AirNet was robust to noise in the projection data and to content differences between the training set and the images to be reconstructed. The impact of image quality on radiotherapy treatment planning was evaluated for both photon and proton therapy, and AirNet achieved the best treatment plan quality, especially for proton therapy. For example, with limited-angle data, the maximum target dose for AirNet was 109.5% compared with 109.1% for the ground truth, whereas it was significantly elevated to 115.1% and 128.1% for FBPConvNet and LEARN, respectively.
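To make the unrolled PFBS structure described in the Methods concrete, the following is a minimal sketch in a PyTorch setting, not the published AirNet implementation. The operators `forward_project` and `fbp` are hypothetical stand-ins for a differentiable CT projector and an FBP-type back-mapping, and the layer widths, number of unrolled stages, and dense-connectivity scheme are illustrative assumptions.

```python
# Minimal sketch of unrolling PFBS-style updates with an FBP-type data-fidelity
# step and a densely connected DNN regularizer with residual learning.
# Toy operators are used so the script runs end to end; they are NOT real CT operators.
import torch
import torch.nn as nn

class DenseRegularizer(nn.Module):
    """Small CNN regularization block with residual learning.

    Receives the current image concatenated with all previous stage outputs
    (dense connectivity) and predicts a residual correction.
    """
    def __init__(self, n_inputs: int, width: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_inputs, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, 1, 3, padding=1),
        )

    def forward(self, x_current, features):
        # Residual learning: output = current image + learned correction.
        return x_current + self.net(torch.cat([x_current] + features, dim=1))

class UnrolledPFBSSketch(nn.Module):
    def __init__(self, forward_project, fbp, n_stages: int = 5):
        super().__init__()
        self.forward_project = forward_project   # hypothetical system operator A
        self.fbp = fbp                           # hypothetical FBP-type back-mapping
        # Learnable PFBS step-size coefficients, one per unrolled stage.
        self.step = nn.Parameter(torch.full((n_stages,), 1.0))
        # Dense connectivity: stage k sees the current image plus all earlier images.
        self.regularizers = nn.ModuleList(
            DenseRegularizer(n_inputs=k + 2) for k in range(n_stages)
        )

    def forward(self, sinogram):
        x = self.fbp(sinogram)        # AR initialization (FBP-type image)
        history = [x]
        for k, reg in enumerate(self.regularizers):
            # Data-fidelity (forward) step on ||A x - y||^2, with the adjoint
            # replaced by the FBP-type operator.
            residual = self.forward_project(x) - sinogram
            x = x - self.step[k] * self.fbp(residual)
            # Regularization (backward/proximal) step replaced by a learned DNN.
            x = reg(x, history)
            history.append(x)
        return x

if __name__ == "__main__":
    # Toy "projection"/"back-projection" pair on 64x64 images so the sketch runs;
    # a real application would plug in a differentiable projector and FBP.
    A = lambda img: img.mean(dim=2, keepdim=True)
    At = lambda sino: sino.expand(-1, -1, 64, -1)
    model = UnrolledPFBSSketch(forward_project=A, fbp=At, n_stages=3)
    y = A(torch.rand(1, 1, 64, 64))
    print(model(y).shape)  # torch.Size([1, 1, 64, 64])
```

In this sketch, each unrolled stage performs one gradient-type data-fidelity update followed by a learned regularization block, and the dense connectivity feeds every earlier stage output into the current regularizer, mirroring the dense-connectivity idea mentioned in the Methods.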