Background: Integrating assessment with education is vital and should be done regularly to enhance learning. Many assessment methods exist, such as multiple-choice questions (MCQs), the Objective Structured Clinical Examination, and the Objective Structured Practical Examination. The appropriate method is selected on the basis of the curriculum blueprint and the target competencies. Although MCQs can test students’ higher cognition, critical appraisal, problem-solving, and data interpretation, and can cover curricular content in a short time, their analysis poses constraints. The authors aim to highlight key points about psychometric item analysis, describing its roles, assessing its validity and reliability in discriminating among examinees’ levels of performance, and offering guidance to faculty members constructing their examination question banks.
Methods: Databases such as Google Scholar and PubMed were searched for freely accessible English-language articles published since 2010, using keywords and their synonyms. First, the abstracts were screened to select suitable matches; then the full articles were read and summarized. Finally, the relevant data were recapitulated to the best of the authors’ knowledge.
Results: The retrieved articles showed the capacity of MCQ item analysis to assess questions’ validity and reliability, discriminate among examinees’ levels of performance, and correct technical flaws for question bank construction.
Conclusion: Item analysis is a statistical tool used to assess students’ performance on a test, identify underperforming items, and determine the root causes of this underperformance so that items can be improved, ensuring effective and accurate judgment of students’ competency.
Keywords: assessment, difficulty index, discrimination index, distractors, MCQ item analysis
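The difficulty and discrimination indices named in the keywords are the core statistics of classical item analysis. A minimal sketch of how they are commonly computed follows; the function names and the example data are illustrative, not taken from the article, and the 27% upper/lower group split is one conventional choice.

```python
# Illustrative sketch of classical item analysis statistics.
# Assumptions: item responses are scored 0/1; the upper/lower groups
# for the discrimination index are the top and bottom 27% by total score.

def difficulty_index(item_responses):
    """Difficulty index (p-value): proportion of examinees answering
    the item correctly. Higher p means an easier item."""
    return sum(item_responses) / len(item_responses)

def discrimination_index(item_responses, total_scores, fraction=0.27):
    """Upper-lower discrimination index D: difference in the item's
    proportion correct between high- and low-scoring examinees.
    Positive D means the item separates strong from weak examinees."""
    ranked = sorted(zip(total_scores, item_responses),
                    key=lambda pair: pair[0], reverse=True)
    n = max(1, int(len(ranked) * fraction))
    upper = [resp for _, resp in ranked[:n]]   # top scorers
    lower = [resp for _, resp in ranked[-n:]]  # bottom scorers
    return sum(upper) / n - sum(lower) / n

# Hypothetical example: one item, ten examinees.
item = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
totals = [95, 90, 88, 85, 80, 75, 70, 65, 60, 55]
p = difficulty_index(item)               # 0.5 -> moderate difficulty
d = discrimination_index(item, totals)   # 1.0 -> discriminates strongly
```

In practice, items with very extreme difficulty or near-zero (or negative) discrimination are the "underperforming items" flagged for review, alongside an inspection of how often each distractor was chosen.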