Time on task has long been recognized as an important variable in academic learning, but self-report measures of study time are problematic. Therefore, this study employed an automated system for recording time spent reading a course textbook. College students in an introductory engineering course accessed their textbook online. The book contained pages of instructional text, worked examples, homework problems, and answers to homework problems. An instrumented document reader program called "STL Reader" recorded the time each student spent on each page, thus providing detailed measures of reading habits. Across the 10-week course, students spent an average of 1.9 hr reading instructional text, 1.4 hr on worked examples, 22.1 hr on homework problems, and 0.9 hr on homework answers, indicating a strong preference for practicing problem solving (i.e., self-testing) over being told (i.e., receiving direct instruction). Furthermore, course grade (based largely on solving problems on exams and quizzes) correlated significantly and positively with time spent viewing homework problems, but not with time spent viewing either instructional text or worked examples, indicating that achievement was related to time spent practicing for test problems but not to time spent being instructed. The results suggest a revision of the time-on-task hypothesis to include the value of spending time on tasks aligned with test requirements.