Any sequential learning task relies on connecting information from previous time steps to the present task in order to predict the future. The underlying challenge is to uncover the hidden patterns in the sequence by analyzing short- and long-term dependencies and temporal differences. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM), have been widely used over the past few years in problem domains such as speech recognition, Natural Language Processing (NLP), fault prediction, and language translation modeling. Higher accuracy demands more complex LSTM network models, which leads to high computational cost, area overhead, and excessive power consumption. Reversible logic circuit synthesis, which in the ideal case dissipates zero heat, has emerged as a new research paradigm for low-power circuit design. In this paper, we propose a novel LSTM architecture built from reversible logic gates. To the best of our knowledge, the proposed approach is the first attempt to implement a complete feedforward LSTM circuit using only reversible logic gates. The hardware implementation of the proposed method is presented in VHDL on an Altera Arria 10 GX FPGA. The comparative analysis demonstrates that the proposed approach achieves approximately a 17% reduction in overall power dissipation compared to the traditional network design. The proposed approach also scales better than the classical design approach.
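For reference, the feedforward LSTM cell computation that any hardware realization (reversible or conventional) must implement can be summarized by the standard gate equations below; the notation is the usual textbook one and is not taken from this paper, and the specific reversible-gate decomposition of these operations is developed in the body of the work.

\begin{align}
f_t &= \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right) && \text{(forget gate)}\\
i_t &= \sigma\!\left(W_i x_t + U_i h_{t-1} + b_i\right) && \text{(input gate)}\\
o_t &= \sigma\!\left(W_o x_t + U_o h_{t-1} + b_o\right) && \text{(output gate)}\\
\tilde{c}_t &= \tanh\!\left(W_c x_t + U_c h_{t-1} + b_c\right) && \text{(candidate cell state)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell state update)}\\
h_t &= o_t \odot \tanh\!\left(c_t\right) && \text{(hidden state output)}
\end{align}

Here $\sigma$ denotes the logistic sigmoid and $\odot$ element-wise multiplication; it is these multiply-accumulate and nonlinear activation operations that the reversible-gate circuit must reproduce.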