Named entity recognition (NER) is a pivotal task in natural language processing (NLP), and recent advances have considerably improved its effectiveness. However, a gap remains in these systems' ability to fully exploit the recursive dynamics of linguistic structure. This study introduces a novel approach that intertwines entity recognition with a deeper modeling of syntax and its tree-like structure. Using a Tree-LSTM guided by dependency trees, we capture the intricate syntactic relationships between words. These representations are further refined through the dual application of relative and global attention mechanisms: relative attention zooms in on the words most critical to the context of each evaluated word, whereas global attention identifies keywords across the entire sentence. The attention-modulated features are projected into a tagging space, where a conditional random field (CRF) classifier determines the entity labels. We find that the model adeptly highlights verbs that, through their syntactic roles within sentences, reveal the types of entities. Our model achieves new state-of-the-art performance on two prominent datasets, substantiating our approach.
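
For concreteness, the following is a minimal PyTorch sketch of the pipeline the abstract describes, not the authors' implementation. A child-sum Tree-LSTM is run bottom-up over a dependency tree, two attention passes refine the hidden states, and a linear layer projects the result into the tagging space. Ordinary self-attention stands in for the paper's relative attention, a learned-query attention for its global attention, and the CRF decoder is omitted; all class and parameter names here are illustrative assumptions.

```python
# Minimal sketch (assumed names, simplified attention, CRF omitted):
# Tree-LSTM over a dependency tree -> relative + global attention
# -> projection into the tagging space (emission scores for a CRF).
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """Child-sum Tree-LSTM cell (Tai et al., 2015)."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.W_iou = nn.Linear(in_dim, 3 * hid_dim)
        self.U_iou = nn.Linear(hid_dim, 3 * hid_dim, bias=False)
        self.W_f = nn.Linear(in_dim, hid_dim)
        self.U_f = nn.Linear(hid_dim, hid_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # child_h, child_c: (num_children, hid_dim); empty for leaves.
        h_sum = child_h.sum(dim=0)
        i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_sum), 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # One forget gate per child, conditioned on that child's state.
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))
        c = i * u + (f * child_c).sum(dim=0)
        return o * torch.tanh(c), c


class TreeAttnNER(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim, num_tags):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.cell = ChildSumTreeLSTMCell(emb_dim, hid_dim)
        # Stand-in for relative attention: each word attends over the sentence.
        self.rel_attn = nn.MultiheadAttention(hid_dim, 1, batch_first=True)
        # Global attention: one learned query summarises the whole sentence.
        self.global_q = nn.Parameter(torch.randn(1, 1, hid_dim))
        self.glob_attn = nn.MultiheadAttention(hid_dim, 1, batch_first=True)
        self.tag_proj = nn.Linear(2 * hid_dim, num_tags)

    def encode_tree(self, x, heads):
        # heads[i] is the dependency head of token i; -1 marks the root.
        n, hid = x.size(0), self.global_q.size(-1)
        children, root = [[] for _ in range(n)], 0
        for i, h in enumerate(heads):
            if h < 0:
                root = i
            else:
                children[h].append(i)
        hs, cs = [None] * n, [None] * n

        def visit(i):  # post-order: children before their head
            for ch in children[i]:
                visit(ch)
            if children[i]:
                ch_h = torch.stack([hs[c] for c in children[i]])
                ch_c = torch.stack([cs[c] for c in children[i]])
            else:
                ch_h = ch_c = x.new_zeros(0, hid)
            hs[i], cs[i] = self.cell(x[i], ch_h, ch_c)

        visit(root)
        return torch.stack(hs)  # (n, hid_dim)

    def forward(self, tokens, heads):
        x = self.emb(tokens)                           # (n, emb_dim)
        h = self.encode_tree(x, heads).unsqueeze(0)    # (1, n, hid_dim)
        rel, _ = self.rel_attn(h, h, h)                # per-word context
        glob, _ = self.glob_attn(self.global_q, h, h)  # (1, 1, hid_dim)
        feats = torch.cat([rel, glob.expand_as(rel)], dim=-1)
        return self.tag_proj(feats.squeeze(0))         # (n, num_tags)


model = TreeAttnNER(vocab_size=100, emb_dim=32, hid_dim=64, num_tags=9)
tokens = torch.tensor([5, 17, 3, 42])  # toy token ids
heads = [1, -1, 1, 2]                  # toy dependency heads
emissions = model(tokens, heads)       # (4, 9) emission scores
```

In the full model, these per-token emission scores would feed the CRF classifier, which decodes the most likely label sequence jointly rather than labeling each token independently.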