This paper presents a twelve-month evaluation of a Web-based
CALL Project at London Guildhall University, highlighting its rationale and
objectives within the design process, the adopted methodology and its findings,
shedding further light on students’ attitudes regarding CALL design,
multimedia delivery and language learning practices. The evaluated CALL
environment, stemming from previously identified user requirements, combines
hypertext navigation with Java-written interactive CALL exercises and
provides a built-in authoring mode for increased adaptability. Techniques used
involved peer evaluation and discussions in the formative stages of the design as
well as user walkthroughs, questionnaires and workshops in the summative phase of
the process. Data related to requirements and feedback are presented, emphasising
the user-centred approach and its integration within the design process itself.
Finally, changes brought about by the evaluation are shown to illustrate the
influence and impact the research undertaken has had on the original concepts and
the design.
Whilst the potential of online Computer Assisted Language Learning (CALL) in terms of access, immediacy and exposure is widely acknowledged, as yet remarkably little is known about its impact on the user population, be they language teachers or learners. Indeed, beyond its technological development, online CALL design is still affected by a dearth of reliable evaluative data, critically needed for improving interface design, implementation and integration. On this premise, this paper focuses on a range of appropriate evaluation methods applied to a recently authored web-based interactive CALL interface, comprising multimedia interactive exercises and self-testing, as part of ongoing research combining Human Computer Interaction (HCI), Java programming and language teaching expertise. The current evaluation adopts a novel approach insofar as its main objectives are not only to assess the usability and usefulness of the interface as well as the student experience, but also to compare the different methods used on the basis of their input and evaluative output. Ultimately, this paper strives to bring theory and practice closer together by shedding further light on the crucial role of evaluation, the appropriateness of the data it generates and its applicability to online CALL development.
It is widely accepted that, by providing access, data and new forms of literacy and communication practices,
networked technologies have done much to promote learner autonomy. However, in practical terms, the lack of resources, expertise and
research investigations into learner interaction have all too often meant that autonomous learning is conveniently likened to
teacher-independent learning, largely relying on the success and assumed intuitiveness of the World Wide Web (web) for its
learner driven delivery. This situation affecting foreign language teaching and learning has been further aggravated by the recent
trend, at least in UK universities, to conceive languages solely as communicative tools, further severing them from their academic
base and cultural roots, often reducing learner autonomy to poor repetitive interaction. On this premise, this paper proposes to
focus on how to make better use of the interactive potential of the web in order to maximise independent language learning online.
From a Human Computer Interaction (HCI) design perspective, it intends to shed further light on and increase our understanding of
hypermedia and multimedia structures through learner participation and evaluation. On the basis of evidence from an ongoing research
investigation into online CALL literacy, it will seek to identify crucial causalities between the user interface and learner
interaction affecting the learners’ focus and engagement within their own learning processes. The adopted methodology
combines a task analysis of a hypermedia prototype underpinned by an activity theory approach and participatory design based on
user walkthroughs and focus groups. By looking at the relationship between action and goal as well as between activities and motives,
it attempts to provide a framework for evaluating online hypermedia interactivity based on identified activities, design tasks
and design criteria.