Due to the epidemic threat, more and more companies are deciding to automate their production lines. Given the lack of adequate safety measures or space, such companies in most cases cannot use classic industrial robots. A solution to this problem is the use of collaborative robots (cobots). However, the required equipment (force sensors) or alternative methods of detecting a threat to humans are usually quite expensive. This article presents the practical aspects of collision detection using a simple neural architecture. A virtual force and torque sensor, implemented as a neural network, may be useful in a team of collaborative robots. Four different approaches are compared: auto-regressive (AR), recurrent neural network (RNN), convolutional long short-term memory (CNN-LSTM), and mixed convolutional LSTM network (MC-LSTM). These architectures are analyzed at different levels of input regression (motor current, position, speed, control velocity). The sensor was tested on the original CURA6 robot prototype (Cooperative Universal Robotic Assistant 6) by Intema. The test results indicate that the MC-LSTM architecture is the most effective, with the regression level set at 12 samples (at 24 Hz). The mean absolute prediction error obtained by the MC-LSTM architecture was approximately 22 Nm. An external test conducted on 72 different signals containing collisions shows that the presented architecture can be used as a collision detector. The MC-LSTM collision detection F1 score with the optimal threshold was 0.85. A well-developed virtual sensor based on such a network can be used to detect various types of collisions of cobots or other mobile or stationary systems operating on the basis of human-machine interaction.
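
For illustration only, the sketch below shows one way a mixed convolutional LSTM virtual torque sensor and residual-based collision detector of the kind described above could be structured. It is not the authors' implementation: the joint count, layer sizes, feature ordering (motor current, position, speed, control velocity over a 12-sample window), and the detection threshold are all assumptions introduced for this example.

```python
# Minimal sketch (not the authors' code) of an MC-LSTM-style virtual torque sensor.
# Assumed: a 6-axis cobot, 4 drive signals per joint, a 12-sample input window (24 Hz),
# and collision detection by thresholding the torque prediction residual.
import torch
import torch.nn as nn

N_JOINTS = 6          # assumed joint count for a 6-axis cobot such as CURA6
N_FEATURES = 4        # current, position, speed, control velocity (per joint)
WINDOW = 12           # regression level: 12 past samples per prediction

class MCLSTMTorqueSensor(nn.Module):
    """Predicts joint torques from a short window of drive signals."""
    def __init__(self, hidden=64):
        super().__init__()
        in_ch = N_JOINTS * N_FEATURES
        # 1-D convolution mixes the input channels along the time axis
        self.conv = nn.Sequential(
            nn.Conv1d(in_ch, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM aggregates the convolved sequence into a single hidden state
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_JOINTS)  # one torque value per joint

    def forward(self, x):
        # x: (batch, WINDOW, N_JOINTS * N_FEATURES)
        z = self.conv(x.transpose(1, 2))        # -> (batch, 64, WINDOW)
        out, _ = self.lstm(z.transpose(1, 2))   # -> (batch, WINDOW, hidden)
        return self.head(out[:, -1])            # torque estimate at the last step

def detect_collision(predicted_tau, measured_tau, threshold_nm=22.0):
    """Flag a collision when the prediction residual exceeds a tuned threshold.

    The threshold value here is a placeholder; in practice it would be tuned
    on validation data (the paper reports an optimal-threshold F1 of 0.85).
    """
    residual = (predicted_tau - measured_tau).abs()
    return (residual > threshold_nm).any(dim=-1)

if __name__ == "__main__":
    model = MCLSTMTorqueSensor()
    window = torch.randn(1, WINDOW, N_JOINTS * N_FEATURES)  # placeholder drive signals
    measured = torch.zeros(1, N_JOINTS)                     # placeholder measured torques
    print(detect_collision(model(window), measured))
```

In this arrangement the network acts as a virtual sensor: it predicts the torque expected from the commanded motion, and a sustained discrepancy between prediction and measurement is treated as a possible collision.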