ABSTRACT

Each spring roughly 200 students, mostly nonmajors, enroll in the Introduction to Meteorology course at Iowa State University and are required to make at least 25 forecasts throughout the semester. The Dynamic Weather Forecaster (DWF) platform requires students to produce more than simple "numeric" forecasts; it includes questions on advection, cloudiness, and precipitation factors that are not part of the forecast contests often used in meteorology courses. The present study examines the evolution of forecasting skill for students enrolled in the class in spring 2010 and 2011 and compares student performance with that of an "expert forecaster." The expert forecasters were chosen from meteorology students in an advanced forecasting course who had shown exemplary forecasting skill throughout the previous semester. It is shown that these introductory students improve in forecast skill over only the first 10-15 days that they forecast, a shorter period than the 25 days found in an earlier study of meteorology majors in an upper-level course; the skill of both groups plateaus after that time. An analysis of two types of questions in the DWF reveals that students show skill slightly better than that of a persistence forecast when predicting parameters traditionally used in forecasting contests, but fail to outperform persistence when predicting more complex atmospheric processes such as temperature advection and factors influencing precipitation, including moisture content and instability. The introduction of a contest "with prizes" halfway through the semester in 2011 was found to have at best mixed impacts on forecast skill.