Twenty-first-century skills require learners to master higher-order thinking skills (HOTS). Higher-order thinking is a cognitive process that involves complex understanding of information beyond mere memorization and recall. It encompasses complex problem-solving, critical thinking, the generation of new information, and synthesis. Traditional assessment methods are known to induce test anxiety and to ignore the learner's thinking processes. This low-order process of memorizing and understanding information is decontextualized and does not lead to deep knowledge. Student assessment should be concerned not only with what students know but also with what they can do with the knowledge they have acquired. These needed skills can be developed and improved through practical models such as game-based learning. Games can foster HOTS; however, it remains unclear how they assess and evaluate learner performance. Traditionally, students were given a test before and after gameplay to gauge how much they had learned. To assess HOTS, robust assessment mechanisms must be embedded within game applications to measure performance throughout gameplay. Therefore, research needs to explore how HOTS are measured in pedagogical games. The ability of games to quantify data for scoring cannot be doubted; however, little is known about grading, results analysis, and the implementation of assessment strategies, and empirical studies are still lacking. We reviewed the capacity of digital game-based learning assessment (DGBLA) to evaluate and improve higher-order thinking. Authoritative databases were analyzed and compared; Web of Science and Scopus emerged as the most suitable, yielding 32 literature sources. The study found that game assessment structured through Bloom's Taxonomy improves HOTS.