Dynamic bit encoding and decoding in the magnetic recording process remain a challenge in that the process is constrained by the balance between reading and writing performance, reflected in the decoder's bit error rate (BER). Sequential neural networks offer a data-stream approach for reproducing recorded bits from the signal distribution, overcoming the limitation of codeword mappings designed for each specific bit-patterned magnetic recording (BPMR) channel. Here, we implement a vanilla long short-term memory (LSTM) network as an adaptive modulation decoder for various BPMR channel designs within a single network, which benefits multi-channel decoder calibration tools under the same standardization. Signal information from the media readback, a two-dimensional (2D) equalizer, a 2D Viterbi detector, and a 2D soft-output Viterbi algorithm (SOVA) detector is arranged as a tensor that enables sequence-to-sequence bit prediction even with a highly complex data arrangement. Our adaptive model can predict recorded bits from the readback with accuracies of approximately 97% for rate-4/5 decoding and 75% for cross-platform decoding, using a recently proposed single-reader/two-track reading (SRTR) system at an areal density of 4 Tb/in² over a signal-to-noise ratio range of 1 to 8 dB. We conducted a BER simulation comparing the LSTM model with the corresponding results from conventional decoders. Ultimately, our approach may demonstrate the limitations of supervised learning designed for BPMR systems and reveal the sequence-data focus of LSTM that paves the way for sequential, unsupervised, mechanism-based, next-generation magnetic recording.

INDEX TERMS Long short-term memory (LSTM), supervised learning, deep learning, bit-patterned media recording (BPMR), channel decoding