SWOT analysis is a widely adopted method for strategic planning across institutions, industries, and businesses. Its importance lies in providing a comprehensive assessment of an organization's internal and external factors. Despite these advantages, its implementation poses several challenges, such as identifying the four SWOT elements and assigning statements to the correct category: strength, weakness, opportunity, or threat. This study aims to determine the best-performing SWOT statement classifier from combinations that use BERT-based models as the feature extraction technique, and to compare them with the traditional TF-IDF method. Each SWOT statement is input to the model to obtain a vector as its sentence representation; more similar vector representations indicate closer sentence meanings. This similarity is the basis on which the classifier assigns a sentence to the S, W, O, or T domain. We examined two classification algorithms, namely Support Vector Machine (SVM) and Naïve Bayes Classifier (NBC). The data consist of 635 SWOT statements from study programs of a higher education institution. Five combinations of feature extraction techniques and classification algorithms were tested. The study finds that Sentence-BERT (SBERT) embeddings in conjunction with SVM classification yield the best performance, with an accuracy of 0.73 and an F1-score of 0.738, outperforming the traditional TF-IDF feature extraction method and the combinations using the Naïve Bayes Classifier.
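The pipeline described above can be sketched as follows. This is an illustrative example, not the authors' code: it shows the TF-IDF baseline combined with a linear SVM, using a handful of invented SWOT statements as toy training data. In the paper's best-performing variant, the TF-IDF step would be replaced by SBERT sentence embeddings (e.g. via the sentence-transformers library).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Toy SWOT statements (invented for illustration); labels are S/W/O/T.
statements = [
    "Experienced faculty with strong research records",          # S
    "Limited laboratory equipment for practical courses",        # W
    "Growing industry demand for graduates in this field",       # O
    "Competing programs offered by nearby universities",         # T
    "Well-established alumni network supporting the program",    # S
    "Outdated curriculum not aligned with industry needs",       # W
]
labels = ["S", "W", "O", "T", "S", "W"]

# TF-IDF turns each statement into a sparse term-weight vector;
# the linear SVM then separates the four SWOT classes in that space.
clf = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
clf.fit(statements, labels)

# Classify an unseen statement into one of the four SWOT domains.
prediction = clf.predict(["Strong partnerships with local industry"])[0]
print(prediction)
```

Swapping the vectorizer for SBERT embeddings keeps the rest of the pipeline unchanged, since the classifier only sees fixed-length vectors either way.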