This study investigates the internal structure and construct validity of Complex Problem Solving (CPS), measured by a multiple-item approach. We tested (a) whether three facets of CPS can be empirically distinguished, namely rule identification (adequacy of exploration strategies), rule knowledge (generated knowledge), and rule application (ability to control a system); (b) how reasoning is related to these CPS facets; and (c) whether CPS shows incremental validity in predicting school grade point average (GPA) beyond reasoning. N = 222 university students completed MicroDYN, a computer-based CPS test, and Raven's Advanced Progressive Matrices. Analyses based on structural equation models showed that a two-dimensional model of CPS comprising rule knowledge and rule application fitted the data best. Furthermore, reasoning predicted performance in rule application only indirectly, through its influence on rule knowledge, indicating that learning during system exploration is a prerequisite for controlling a system successfully. Finally, CPS explained variance in GPA beyond reasoning, demonstrating its incremental validity. Thus, CPS measures important aspects of academic performance not assessed by reasoning and should be considered when predicting real-life criteria such as GPA.
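The incremental-validity claim above amounts to asking whether adding a CPS score to a regression of GPA on reasoning raises the explained variance. A minimal sketch of that hierarchical-regression logic, using simulated data (all variable names, effect sizes, and the sample itself are hypothetical and not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: standardized reasoning and CPS scores
# for n students, plus a GPA that depends on both (effect sizes invented).
n = 222
reasoning = rng.normal(size=n)
cps = 0.5 * reasoning + rng.normal(scale=np.sqrt(0.75), size=n)
gpa = 0.4 * reasoning + 0.3 * cps + rng.normal(scale=0.8, size=n)

def r_squared(X, y):
    """R^2 from an ordinary-least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Step 1: reasoning alone; Step 2: reasoning plus CPS.
r2_base = r_squared(reasoning[:, None], gpa)
r2_full = r_squared(np.column_stack([reasoning, cps]), gpa)
delta_r2 = r2_full - r2_base  # incremental validity of CPS over reasoning

print(f"R^2 reasoning only:   {r2_base:.3f}")
print(f"R^2 reasoning + CPS:  {r2_full:.3f}")
print(f"Delta R^2:            {delta_r2:.3f}")
```

A positive Delta R^2 (typically tested for significance, e.g. via an F-test on the R^2 change) is what "explained variance in GPA beyond reasoning" means operationally; the study's actual analyses used structural equation models rather than this simplified OLS comparison.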