In this issue, Woodworth et al1 detail the multistep development of an ultrasound interpretation examination capable of distinguishing ultrasound-guided regional anesthesia (UGRA) providers of varying skill levels. Initially, they defined knowledge and skills relevant to the most salient regional anesthesia techniques, surveyed residency programs to confirm the importance of those skills, and narrowed their focus to the important content. Thereafter, the authors defined important anatomy and anatomic relationships using observations of faculty/resident teaching. They then created and refined an assessment tool, eliminating questions that regional anesthesia experts could not answer correctly. Finally, they administered the examination to participants across all skill levels, eliminating the few questions that did not help distinguish between test takers at different levels of training. Each study phase was cross-checked against external controls. Sound psychometric principles were applied to develop this assessment; in fact, the Methods section presents an excellent, concise review of how to establish assessment tool validity and reliability. The main oversight appears to be the exclusion of brachial plexus blocks below the clavicle, which could be added at a later date using the same test-design roadmap established in this initial study.

With this well-designed tool in hand, several questions arise. Where does this test fit within the context of other UGRA assessments? How could it be integrated into existing skill assessment regimens? What is the purpose of UGRA performance and knowledge assessments?

This research effort complements the existing literature well. Multiple performance assessment tools have been developed for UGRA, yet they attend only lightly to ultrasound interpretation.2-5 Of the more than 30 elements in 1 recent assessment tool, only 1 element focuses on recognizing anatomic structures, with 2 reserved for recognizing proper needle and injectate location.3,4 The remainder of the instrument centers on procedural steps such as patient positioning, probe selection, needle alignment, and injection technique. Mastery of UGRA requires reliable performance of all these steps and more, including adherence to sterile technique, anatomic knowledge, 2-handed 3-dimensional coordination, patient communication, and monitoring vigilance. No single instrument tests all of these skills.

Technical performance metrics should not serve as surrogates for anatomic knowledge; skills and knowledge are not necessarily linked. Other studies have shown that the disparate skills required for a procedure may not improve at the same rate or affect block success equally. Friedman et al6 showed that, even as trainees became facile with epidural catheterization, their adherence to sterile technique did not improve, suggesting that performance on these skills is not linked. In that study, a separate asepsis checklist was ultimately added to a technical skills checklist to scaffold good performance in both domains. It should not, therefore, be surprising...