Data science has evolved over the past two decades, and its technical norms have been adapted to handle new classes of problems. While the technical challenges have changed, the need for text summarisation has remained constant. The foundations of automatic text summarisation were laid nearly a decade ago, and since then the techniques have been refined for large-scale big data handling, crime investigation, and cybersecurity, among other applications. Several text summarisation techniques exist, and the choice of technique influences the outcome. Another change over the last 20 years concerns the time required for summarisation. To obtain the findings, machine learning methods are applied to a core set of text phrases treated as a data set, and neural networks are now being used to improve text summarisation. Because an increasing share of the available data consists of short texts, the demand for short text summarisation is also growing, and a fast summarisation technique improves accuracy, precision, and recall. This work focuses on short text summarisation and on boosting accuracy using Bidirectional Encoder Representations from Transformers (BERT). BERT combined with a transformer produced outstanding results for short text summarisation. The model receives the input and performs a sequence-to-sequence analysis of the data down to the word level. The implementation is then compared with two earlier approaches, Word2Vec + recurrent neural network (RNN) and Word2Vec + long short-term memory (LSTM). The proposed strategy, which addresses both training time and accuracy for short text summarisation, produced its best results with BERT + LSTM and BERT + Transformer. Using a confusion matrix to monitor and analyse the improved findings, BERT + Transformer achieved an accuracy of 97%. The proposed model also outperforms existing models in terms of precision (46%) and recall (30%).
KEYWORDS: learning (artificial intelligence), natural language processing, neural nets, software engineering, software maintenance, text analysis
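As a rough illustration of the BERT-based sequence-to-sequence setup summarised above, the sketch below pairs a pretrained BERT encoder with a transformer decoder using the Hugging Face transformers library. This is a minimal sketch under stated assumptions, not the authors' implementation: the checkpoint names, generation parameters, and example text are placeholders, and the warm-started model would still need fine-tuning on (text, summary) pairs before producing useful summaries.

```python
# Minimal sketch of a BERT encoder + transformer decoder for short text
# summarisation (a BERT2BERT warm start); NOT the paper's exact implementation.
from transformers import AutoTokenizer, EncoderDecoderModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Warm-start both encoder and decoder from BERT checkpoints; the decoder
# is given cross-attention layers and a causal attention mask.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# BERT has no dedicated BOS/EOS tokens, so reuse [CLS] and [SEP].
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Hypothetical short input; in practice the model is fine-tuned on a
# short-text summarisation corpus before generation.
text = "Short text summarisation condenses brief documents such as news snippets."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)

summary_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=16,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

A BERT + LSTM variant, as compared in the paper, would replace the transformer decoder with an LSTM decoder over the BERT encoder outputs; the encoder-side tokenisation and evaluation via a confusion matrix (accuracy, precision, recall) remain the same.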