This paper investigates the impact of word-based RNN language models (RNN-LMs) on the performance of end-to-end automatic speech recognition (ASR). In our prior work, we proposed a multi-level LM, in which character-based and word-based RNN-LMs are combined in hybrid CTC/attention-based ASR. Although this multi-level approach achieves significant error reduction on the Wall Street Journal (WSJ) task, two different LMs need to be trained and used for decoding, which increases the computational cost and memory usage. In this paper, we further propose a novel word-based RNN-LM that allows us to decode with only the word-based LM: it provides look-ahead word probabilities to predict next characters instead of the character-based LM, leading to competitive accuracy with less computation compared to the multi-level LM. We demonstrate the efficacy of the word-based RNN-LMs using a larger corpus, LibriSpeech, in addition to WSJ, which we used in the prior work. Furthermore, we show that the proposed model achieves 5.1% WER on the WSJ Eval'92 test set when the vocabulary size is increased, which is the best WER reported for end-to-end ASR systems on this benchmark.

Index Terms: end-to-end speech recognition, language modeling, decoding, connectionist temporal classification, attention decoder

1. The character-based LM can help correct hypotheses survive until they are rescored at word boundaries during the beam search. Before a hypothesis reaches a word boundary, the identity of the last word is unknown and its word probability cannot be applied. Hence, good character-level prediction is important to avoid pruning errors for hypotheses within a word.

2. The character-based LM can predict character sequences even for OOV words not included in the vocabulary of the word-based LM. Since the word-based LM basically cannot predict unseen character sequences, good character-level prediction is important for open-vocabulary ASR.

However, the multi-level LM approach has the problem that it requires two different RNN-LMs. To build the two LMs, we need to