This paper presents a semantic search engine designed to improve query comprehension and deliver contextually relevant results. Traditional keyword-based search engines struggle to capture the nuanced meaning of user queries, leading to suboptimal results. To address this challenge, we integrate advanced natural language processing (NLP) techniques with Elasticsearch, with a specific focus on Bidirectional Encoder Representations from Transformers (BERT), a state-of-the-art pre-trained language model. Our approach leverages BERT's ability to analyze the contextual meaning of words via Sentence-BERT (SBERT), encoding both user queries and document content into dense vector embeddings that can be indexed and searched in the Elasticsearch server. By utilizing BERT's bidirectional attention mechanism, the search engine can discern the relationships between words, thereby capturing the contextual nuances that are crucial for accurate query interpretation. Through experimental validation and performance assessments, we demonstrate the efficacy of our semantic search engine in providing contextually relevant search results. This research contributes to the advancement of search technology by enhancing the intelligence of search engines, ultimately improving the user experience through context-aware retrieval.
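As a minimal illustration of the vector-similarity idea described above, the sketch below ranks documents by cosine similarity between a query embedding and document embeddings. The three-dimensional vectors are invented stand-ins for real SBERT output (actual SBERT embeddings typically have 384 or more dimensions), and in the full system Elasticsearch would perform this comparison over indexed embeddings rather than a Python loop.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for SBERT output; the document names
# and vector values are hypothetical, chosen only for illustration.
doc_embeddings = {
    "doc_nlp":     [0.9, 0.1, 0.1],
    "doc_cooking": [0.1, 0.9, 0.2],
    "doc_sports":  [0.1, 0.2, 0.9],
}
query_embedding = [0.8, 0.2, 0.1]  # e.g. an NLP-related query

# Rank documents by similarity to the query, highest first --
# the same scoring a vector search backend applies at scale.
ranked = sorted(
    doc_embeddings.items(),
    key=lambda kv: cosine_similarity(query_embedding, kv[1]),
    reverse=True,
)
print(ranked[0][0])  # prints "doc_nlp", the closest document
```

Because cosine similarity compares the direction of embedding vectors rather than exact keywords, a query and a document can match even when they share no surface terms, which is the core advantage over classic lexical search.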