Ultra-high-density data storage has become increasingly important as the volume of data continues to grow, and many technologies have been proposed to achieve higher recording densities. Among them, bit-patterned media recording (BPMR) is a promising candidate. In BPMR systems, data are stored on discrete magnetic islands, so higher densities can be achieved by reducing the spacing between the islands. However, as the islands move closer together, the readback signal is distorted by two-dimensional (2D) interference, which comprises intersymbol interference (ISI) along the down-track direction and intertrack interference (ITI) along the cross-track direction. A simple and effective serial detection algorithm was recently proposed to mitigate this 2D interference. However, serial detection uses hard outputs in its inner detector, which degrades detection performance. To address this problem, a subsequent study added a feedback line to estimate the noise and used the estimated noise signal to produce soft outputs for the inner detector. Building on that work, in this paper we propose a model that employs a neural network for noise prediction. The proposed neural-network-based model was compared with the feedback-line model in terms of bit error rate (BER). The results show that the proposed model achieves a gain of approximately 1 dB at a BER of 10⁻⁶.
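As a concrete illustration of the 2D interference described above, the sketch below models the readback sample at each island as a weighted sum over a 3×3 neighbourhood of islands plus additive white Gaussian noise. The mask coefficients, array sizes, and noise level are illustrative assumptions only, not the channel parameters used in this paper.

```python
import numpy as np

def readback(bits, h, sigma, rng):
    """2D interference model: each read sample is a weighted sum of the
    target island and its eight neighbours (3x3 mask), plus AWGN."""
    pad = np.pad(bits, 1)  # zero-pad one island on every side
    r = np.zeros_like(bits, dtype=float)
    for i in range(bits.shape[0]):
        for j in range(bits.shape[1]):
            r[i, j] = np.sum(pad[i:i + 3, j:j + 3] * h)
    return r + sigma * rng.standard_normal(bits.shape)

rng = np.random.default_rng(0)
bits = rng.choice([-1.0, 1.0], size=(16, 64))  # bipolar island data

# Illustrative 3x3 response: the centre tap is the desired island;
# off-centre rows model ITI (cross-track) and columns model ISI (down-track).
h = np.array([[0.05, 0.10, 0.05],
              [0.20, 1.00, 0.20],
              [0.05, 0.10, 0.05]])

r = readback(bits, h, sigma=0.1, rng=rng)
```

Shrinking the island spacing corresponds to larger off-centre taps in `h`, which is what makes the 2D detection problem harder.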