Objective – Research consultations can be challenging to assess because of their individualized nature and institutional constraints. At Texas A&M University Libraries, subject librarians use research consultations to teach information literacy to upper-division engineering student teams working on a technical paper project. This paper describes an action research project designed to evaluate which assessment method for consultations with student teams would provide the most actionable data about the instruction and the consultation logistics while making the best use of librarian time.
Methods – For three semesters, we simultaneously used up to four consultation assessment methods: one-minute papers, team process interviews, retrospective interviews, and questionnaires. Following the action research cycle, we planned the assessments, implemented them, reflected on the data collected and on our experiences administering the assessments, and revised the assessments for the next semester. Each assessment method was administered to students enrolled in an engineering course at a different point in the technical paper project: the one-minute paper was given immediately after the consultation, the team process interviews occurred after project deliverables, and the questionnaire was distributed in person on the last day of class. Focus groups were planned for after the assignment was completed, but low participation led us to conduct retrospective interviews instead. We compared the assessments using three criteria: the information provided about the effectiveness of the instruction, the information provided about the logistics of the consultation, and suitability as an assessment method in our context. After comparing the results of the assessment methods and reflecting on our experiences implementing them, we modified the consultation and the assessment methods for the next semester.
Results – Each assessment method had strengths and weaknesses. The one-minute papers provided the best responses about the effectiveness of the instruction when questions were framed positively, but required the most staff buy-in to distribute. The team process interviews were time-intensive, but provided an essential understanding of how students think about and prepare for each progress report. Recruiting for and scheduling the focus groups required more time and effort than the data collected about the instruction and logistics warranted. The questionnaire provided student perspectives on their learning after the assignment had been completed, collected feedback about the logistics of the consultations, was easy to modify each semester, and required minimal librarian time.
Conclusion – Using multiple assessment methods simultaneously allowed us to determine which would work best in our context. The questionnaire, which allowed us to collect data on both the instruction and the consultation logistics, was the most suitable assessment method for us. The description of our assessment methods and our findings can assist other libraries in planning and implementing consultation assessment.