Effective writing skills are crucial for engineers, and engineering programs have always struggled with how to prepare their students for the writing they will do as professionals. Now, programs must also show the Accreditation Board for Engineering and Technology (ABET) that they have clear educational outcomes for engineering communication and have a process for assessing student performance on those outcomes. At the University of Washington, we have spent the last five years developing an outcomes‐based assessment program for engineering writing. In spring 2001, the first round of writing assessment was completed. The assessment indicated that most of our students are competent in the outcomes we have developed. It also uncovered several weak areas, particularly in regard to working with sources and to adequately stating and supporting the purpose of the writing. We will be addressing these areas with additional instruction in the stand‐alone technical writing courses taken by engineering students. The process described in this paper could be helpful for other engineering programs preparing for ABET accreditation visits.
Marra is a Professor of Learning Technologies at the University of Missouri. She is PI of the NSF-funded project Supporting Collaboration in Engineering Education and has studied and published on engineering education, women and minorities in STEM, and online learning and assessment. Marra holds a PhD in Educational Leadership and Innovation and worked as a software engineer before entering academe.
This paper presents a comparison of online and traditional face-to-face delivery of undergraduate digital systems material. Two components of digital content were compared and evaluated: a sophomore logic circuits course with no laboratory, and the microprocessor laboratory component of a junior-level computer systems course. For each, a baseline level of student understanding was established while the material was taught using traditional, face-to-face delivery. The course and lab component were then converted to fully online delivery, and student understanding was measured again. In both cases, the same purpose-developed assessment tools were used to measure understanding. This paper details how the course components were converted to online delivery, including a discussion of the technology used to provide remote access to the electronic test equipment used in the laboratory. A comparison between the control and experimental groups is then presented, including a statistical analysis of whether the delivery approach affected student learning. Finally, student satisfaction is discussed, and instructor observations are given for the successful remote delivery of this type of class and laboratory.