This paper deals with the recognition of unconstrained handwritten text lines. We compare two state-of-the-art segmentation-free recognition approaches. The first is the popular approach based on context-dependent Hidden Markov Models (HMMs). The second is the more recent BLSTM (Bidirectional Long Short-Term Memory) approach, built on recurrent neural networks with memory blocks. For a fair comparison, both recognizers use the same feature set and language model. They are compared along several dimensions: sliding-window parameters for feature extraction, training and decoding speed, and recognition accuracy with and without a language model. The evaluation is carried out on the publicly available Rimes database of French handwritten letters. Our main findings are that long frame sequences, obtained with specific window parameters, improve both recognizers, and that BLSTMs outperform HMMs in terms of word error rate (WER), at the expense of considerably longer training times.
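
To make the role of the sliding-window parameters concrete, the following minimal sketch illustrates how a text-line image can be turned into a frame sequence. It is not the paper's actual feature-extraction pipeline: the function name, the default window width and shift, and the illustrative features (per-row intensity profile and ink centre of gravity) are assumptions for illustration only. The key point it demonstrates is that a smaller window shift produces a longer frame sequence, which is the parameter effect referred to above.

```python
# Minimal sketch (assumed, not the paper's pipeline) of sliding-window
# feature extraction over a grayscale text-line image stored as a
# 2-D NumPy array of shape (height, width).
import numpy as np

def sliding_window_features(line_img: np.ndarray,
                            win_width: int = 9,   # assumed default
                            win_shift: int = 3    # assumed default
                            ) -> np.ndarray:
    """Return one feature vector per window position (frame)."""
    height, width = line_img.shape
    frames = []
    # Slide the window left to right; a smaller win_shift yields
    # more frames, i.e. a longer frame sequence.
    for x in range(0, max(width - win_width, 0) + 1, win_shift):
        window = line_img[:, x:x + win_width]
        # Illustrative features only: mean intensity per row, plus the
        # vertical centre of gravity of the ink in the window.
        row_profile = window.mean(axis=1)
        ink = 255.0 - window  # assume dark ink on a light background
        total_ink = ink.sum()
        cog = ((ink.sum(axis=1) * np.arange(height)).sum() / total_ink
               if total_ink > 0 else height / 2.0)
        frames.append(np.concatenate([row_profile, [cog / height]]))
    return np.stack(frames)  # shape: (num_frames, height + 1)
```

Under this sketch, the resulting frame sequence would be fed to either recognizer (HMM or BLSTM), so the window width and shift control the sequence length seen by both.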