The paper proposes a simple machine learning solution for hand-gesture classification based on processed mm-wave radar signals. It investigates the classification of up to 12 different intuitive and ergonomic gestures, which are intended to serve as a contactless user interface. The system is based on the AWR1642BOOST Frequency-Modulated Continuous-Wave (FMCW) radar, which allows capturing standardized data to support the scalability of the proposed solution. More than 4000 samples were collected from 4 different people, with all signatures extracted from the radar hardware made available in an open-access database accompanying the publication. The collected data were processed and used to train a long short-term memory (LSTM) artificial recurrent neural network (RNN) architecture. The work studies the impact of different input parameters, the number of hidden layers, and the number of neurons in those layers. The proposed LSTM network allows for classification of the different gestures with a total accuracy ranging from 94.4% to 100%, depending on the use-case scenario, using a relatively small architecture of only 2 hidden layers with 32 neurons each. The solution is also tested with additional data recorded from subjects not involved in the original training set, resulting in an accuracy drop of no more than 2.24%. This demonstrates that the proposed solution is robust and scalable, allowing quick and reliable creation of larger gesture databases to expand the use of machine learning with radar technologies.
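
The abstract specifies the network size (2 hidden LSTM layers of 32 neurons each, up to 12 gesture classes) but not the framework or input layout. The sketch below shows one way such a classifier could be assembled in Keras; the sequence length and per-frame feature count are placeholder assumptions, not values from the paper.

```python
# Minimal sketch of an LSTM gesture classifier matching the abstract's
# description: 2 hidden layers of 32 neurons each, 12 output classes.
# TIME_STEPS and NUM_FEATURES are assumed placeholders; the actual
# radar feature layout and training setup are defined in the paper.
import tensorflow as tf

NUM_CLASSES = 12      # up to 12 gestures reported in the abstract
TIME_STEPS = 60       # assumed number of radar frames per gesture sample
NUM_FEATURES = 5      # assumed number of features extracted per frame

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIME_STEPS, NUM_FEATURES)),
    # First hidden LSTM layer passes the full sequence to the next layer.
    tf.keras.layers.LSTM(32, return_sequences=True),
    # Second hidden LSTM layer keeps only its final hidden state.
    tf.keras.layers.LSTM(32),
    # Softmax over the gesture classes.
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```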