This paper addresses the problem of reliable and automated analog circuit structure recognition (ACSR). It first presents a comprehensive critical review of the related state of the art. It then proposes and validates a novel approach to dependable structure recognition for analog circuits, built on a comprehensive ontology definition, a set of transformation tools, scene modeling through graph models, and a graph attention network (GAT) model. Identifying sub-circuits by brute force requires screening the numerous alternative substructures that can be constructed from the basic elements (i.e., the transistors) present in the circuit, particularly with respect to the actual connectivity topology among them; this is a complex and challenging task. The relevant state of the art has mostly relied on unsupervised learning approaches for this ACSR task, and these have so far been limited to roughly 90% to 93% accuracy/precision. In this work, we develop, for the first time, a comprehensive supervised learning approach whose clear superiority is demonstrated by reliably reaching a recognition accuracy/precision above 99%, and up to 100%, for each sub-block of the analog circuit. The supervised clustering approach developed and validated in this study consists of a comprehensive modeling and conceptual pipeline built around a graph attention network model, which ensures reliable recognition of analog circuit sub-blocks and of their internal adjacency (connectivity) topology. Besides the modeling pipeline, this study also develops a set of tools for mapping an analog circuit schematic into a graph model. To overcome the limited and unbalanced samples available for training the graph neural model, this study proposes a novel augmentation strategy based on a graph sub-cropping technique. This augmentation technique is embedded in a stepwise augmentation protocol whose iterative, dedicated additional training steps progressively increase the recognition accuracy of the graph neural model until it exceeds 99%, and ideally reaches 100%, for each sub-block of the analog circuits, as shown in the presented case study. Overall, this paper constitutes a significant advance that clearly outperforms the currently relevant state of the art.
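To make the core idea of the pipeline concrete, the following is a minimal sketch, not the authors' implementation: a two-layer GAT node classifier that assigns each device node of a circuit graph to a sub-block, together with a k-hop "sub-cropping" augmentation that cuts out local neighborhoods as extra training samples. The abstract does not specify the feature encoding, architecture, hop radius, or framework, so the use of PyTorch Geometric, the names SubBlockGAT and sub_crop, and all dimensions and hyperparameters below are illustrative assumptions.

```python
# Hedged sketch of GAT-based sub-block recognition with graph sub-cropping.
# All design choices here are assumptions; the paper's actual pipeline also
# includes ontology-based schematic-to-graph mapping and a stepwise protocol.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GATConv
from torch_geometric.utils import k_hop_subgraph


class SubBlockGAT(torch.nn.Module):
    """Two-layer GAT that predicts a sub-block label for every device node."""

    def __init__(self, in_dim, hidden_dim, num_subblocks, heads=4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=heads, concat=True)
        self.gat2 = GATConv(hidden_dim * heads, num_subblocks, heads=1, concat=False)

    def forward(self, x, edge_index):
        x = F.elu(self.gat1(x, edge_index))
        return F.log_softmax(self.gat2(x, edge_index), dim=-1)


def sub_crop(data: Data, center: int, num_hops: int = 2) -> Data:
    """Graph sub-cropping (assumed form): keep the k-hop neighborhood
    around one device node as an additional, relabeled training graph."""
    subset, edge_index, _, _ = k_hop_subgraph(
        center, num_hops, data.edge_index,
        relabel_nodes=True, num_nodes=data.num_nodes)
    return Data(x=data.x[subset], edge_index=edge_index, y=data.y[subset])


# Toy circuit graph: 4 devices with 3-dimensional features (e.g. one-hot
# device type), hypothetical connectivity, and per-device sub-block labels.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]])
y = torch.tensor([0, 0, 1, 1])
graph = Data(x=x, edge_index=edge_index, y=y)

# Augmented training set: original graph plus one sub-crop per device node.
augmented = [graph] + [sub_crop(graph, c, num_hops=1) for c in range(graph.num_nodes)]

model = SubBlockGAT(in_dim=3, hidden_dim=8, num_subblocks=2)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(50):  # iterative retraining over the augmented sample set
    for g in augmented:
        optimizer.zero_grad()
        loss = F.nll_loss(model(g.x, g.edge_index), g.y)
        loss.backward()
        optimizer.step()
```

The sketch only illustrates the recognition and augmentation steps summarized in the abstract; the paper's full approach additionally covers the ontology definition, the schematic-to-graph transformation tools, and the stepwise augmentation protocol evaluated in the case study.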