We examined interrater reliability for Rorschach Performance Assessment System (R-PAS) scoring in a sample of 89 adolescents from Brazil (mean age = 13.2 years, SD = 1.01), using exact agreement intraclass correlation coefficients (ICCs) for the 60 protocol-level scores that are the focus of interpretation. The first author completed or reviewed all of the primary coding, and seven R-PAS-proficient psychologists trained at different sites independently produced the secondary coding. Overall agreement was excellent (M ICC = 0.89, SD = 0.09). When averaged across this study and three comparison studies, reliability was generally stronger for commonly coded variables (M = 0.87) than for rare or infrequently coded variables (M = 0.78). In addition, 78.3% of the variables showed excellent interrater reliability and a further 20.0% showed good reliability. The ICCs for most variables also varied little across studies, suggesting that their coding guidelines are clear; variables with higher ICC variability across studies point to domains where the guidelines could usefully be expanded with more detailed parameters. Overall, the findings indicate excellent interrater reliability for the great majority of codes and provide a solid foundation for future research on interrater reliability with R-PAS.
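For readers unfamiliar with exact agreement ICCs, the sketch below illustrates one common formulation, the two-way random-effects, absolute-agreement, single-measures coefficient (Shrout and Fleiss ICC[2,1]). It is a minimal illustration only: the function name, the two-rater layout, and the example scores are assumptions for demonstration and are not data or code from the study.

# Minimal sketch of an absolute-agreement ICC (Shrout & Fleiss ICC[2,1]).
# The ratings matrix and example values are illustrative, not study data.
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ratings: n_targets x k_raters matrix of scores for one variable."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-target (e.g., per-protocol) means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA sums of squares and mean squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # ICC(2,1): between-target variance relative to total variance,
    # penalizing both random error and systematic rater differences
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Illustrative use: primary and secondary coders scoring ten hypothetical protocols
primary = np.array([3, 5, 2, 6, 4, 5, 3, 7, 2, 4], dtype=float)
secondary = np.array([3, 4, 2, 6, 5, 5, 3, 6, 2, 4], dtype=float)
print(round(icc_2_1(np.column_stack([primary, secondary])), 2))

Because this form of the ICC counts systematic differences between coders as disagreement, it is a stricter criterion than a consistency ICC, which is why it is typically preferred when reporting interrater reliability for protocol-level scores.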