2005
DOI: 10.1108/00907320510597372
Assessing Auburn University Library's Tiger Information Literacy Tutorial (TILT)

Abstract: Purpose – During Spring semester 2001, Auburn University Libraries launched the Tiger Information Literacy Tutorial (TILT), based on the Texas Information Literacy Tutorial designed by librarians at the University of Texas at Austin. This work assesses the effectiveness of the tutorial. Design/methodology/approach – An in‐depth analysis of data collected over three semesters is presented. Findings – Following the Texas model, Auburn's TILT comprises three modules covering searching, selecting, and evaluating informa…

Cited by 23 publications (15 citation statements) | References 13 publications
“…A selection of questions used in the briefer tests can be found in Houlson, 2007; Knight, 2002; and Helmke and Matthies, 2004. Examples of questions from longer tests include Burkhardt, 2007; Noe and Bishop, 2005; and Ondrusek, 2005. In summary, there is a real mix in the way librarians are using multiple-choice questions to assess information literacy. Even though it is a method often chosen for convenience, some groups have put a great deal of effort into producing tools which map question areas onto established information literacy competencies and for which they have tried to establish reliability and validity.…”
Section: Multiple Choice Questionnaire (mentioning, confidence: 99%)
“…The measurement of information literacy appears to be more advanced in English-speaking countries, as several tests have already been developed (e.g., Wise et al., 2009; Noe and Bishop, 2005; Ondrusek et al., 2005), most of them to evaluate information literacy courses. Many tests use a multiple-choice format for practical reasons (e.g., ease of use, reliability).…”
Section: Introduction (mentioning, confidence: 99%)
“…A general set of best practices has emerged from the literature on both learning object development and learning object assessment, concentrated in four key areas: content, structure, implementation, and assessment. In the area of content, multiple authors stress the importance of either avoiding or defining any library jargon used in learning objects (Reece 2005; Noe & Bishop 2005). Blummer and Kritskaya (2009) suggest aligning content to an existing set of standards, such as the ACRL (Association of College and Research Libraries) Information Literacy Competency Standards for Higher Education.…”
Section: Literature Review (mentioning, confidence: 99%)
“…When implementing learning objects, multiple authors suggest that course-related use will be more effective than stand-alone use (Dewald 1999a; Veldof & Beavers 2001; Noe & Bishop 2005). Several also recommend collaborating with faculty and other campus stakeholders (Blummer & Kritskaya 2009; Lo & Dale 2009).…”
Section: Literature Review (mentioning, confidence: 99%)