One of the main security requirements for symmetric-key block ciphers is resistance against differential cryptanalysis. This is commonly assessed by counting the number of active substitution boxes (S-boxes) using search algorithms or mathematical solvers that incur high computational costs. These costs increase exponentially with block cipher size and number of rounds, quickly becoming prohibitive. Conventional S-box enumeration methods also require niche cryptographic knowledge to perform. In this paper, we overcome these problems by proposing a data-driven approach that uses deep neural networks to predict the number of active S-boxes. Our approach trades exactness for real-time efficiency, as the bulk of the computational work is shifted to pre-processing (training). Active S-box prediction is framed as a regression task in which neural networks are trained on features such as the input and output differences, the number of rounds, and the permutation pattern. We first investigate the feasibility of the proposed approach by applying it to a reduced (4-branch) generalized Feistel structure (GFS) cipher. Apart from optimizing a neural network architecture for the task, we also explore the impact of each feature and its representation on prediction error. We then extend the idea to 64-bit GFS ciphers by first training neural networks on data from five different ciphers, before using them to predict the number of active S-boxes for TWINE, a lightweight block cipher. The best-performing model achieved the lowest root mean square error of 1.62 and an R² of 0.87, demonstrating the feasibility of the proposed approach.

INDEX TERMS Active S-boxes, block cipher, cryptanalysis, deep learning, differential cryptanalysis, lightweight cryptography, neural networks, TWINE
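
To make the regression framing concrete, the following is a minimal sketch, not the authors' implementation: a small multilayer perceptron regressor in PyTorch that maps a feature vector (bitwise input and output differences, round count, and permutation pattern) to a predicted active S-box count. The feature layout, layer sizes, and training loop are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's model): active S-box
# prediction as regression with a small MLP in PyTorch.
import torch
import torch.nn as nn

class ActiveSBoxRegressor(nn.Module):
    def __init__(self, in_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 1),   # scalar output: predicted active S-box count
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

# Hypothetical feature layout for a 64-bit GFS cipher: 64-bit input
# difference + 64-bit output difference (as 0/1 bits) + round count
# + 16-nibble permutation pattern.
in_features = 64 + 64 + 1 + 16
model = ActiveSBoxRegressor(in_features)
loss_fn = nn.MSELoss()            # reported RMSE would be sqrt of this loss
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on random stand-in data.
x = torch.rand(32, in_features)
y = torch.randint(0, 30, (32,)).float()   # stand-in active S-box counts
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```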