This study investigates the performance of the transformer-based language models BERT, RoBERTa, and ALBERT on multiclass text classification within the context of the Universal Access to Quality Tertiary Education (UAQTE) program. The aim is to systematically categorize and analyze qualitative student responses to uncover domain-specific patterns in students' experiences. A rigorous evaluation of hyperparameter configurations shows that smaller batch sizes and more training epochs consistently improve model performance, and that a well-chosen learning rate further boosts accuracy. Balancing sequence length against model efficacy, however, remains a nuanced challenge, and overfitting emerges after a certain number of epochs. Notably, the findings underscore the effectiveness of the UAQTE program in addressing student needs, most evident in categories such as "Family Support" and "Financial Support," with RoBERTa emerging as a standout choice owing to its stable performance during training. Future research should focus on fine-tuning hyperparameter values and adopting continuous monitoring mechanisms to mitigate overfitting. Furthermore, ongoing review and refinement of educational initiatives, informed by evidence-based decision-making and stakeholder feedback, are critical to meeting students' evolving needs effectively.
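The hyperparameter evaluation described above can be sketched as a simple grid enumeration over batch size, epochs, learning rate, and sequence length. The specific value ranges below are illustrative assumptions, not the exact settings evaluated in the study:

```python
from itertools import product

# Hypothetical hyperparameter grid for fine-tuning BERT/RoBERTa/ALBERT;
# the candidate values here are assumptions for illustration only.
grid = {
    "batch_size": [8, 16, 32],
    "epochs": [3, 5, 10],
    "learning_rate": [1e-5, 2e-5, 5e-5],
    "max_seq_length": [128, 256, 512],
}

# Enumerate every configuration; each dict would parameterize one
# fine-tuning run, whose validation accuracy is then compared.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 3 * 3 * 3 * 3 = 81 candidate configurations
```

In practice, each configuration would drive one fine-tuning run per model, with validation metrics tracked per epoch so that overfitting can be detected early and training stopped.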