Elsevier. Valor Miró, J.D.; Silvestre Cerdà, J.A.; Civera Saiz, J.; Turró Ribalta, C.; Juan Císcar, A. (2015). Efficiency and usability study of innovative computer-aided transcription strategies for video lecture repositories.

Abstract. Video lectures are widely used in education to support and complement face-to-face lectures. However, the utility of these audiovisual assets could be further improved by adding subtitles, which can be exploited to incorporate added-value functionalities such as searchability, accessibility, translatability, note-taking, and discovery of content-related videos, among others. Today, automatic subtitles are prone to error and need to be reviewed and post-edited in order to ensure that what students see on-screen is of acceptable quality. This work investigates different user interface design strategies for this post-editing task in order to discover the best way to incorporate automatic transcription technologies into large educational video repositories. Our three-phase study involved lecturers from the Universitat Politècnica de València (UPV) with videos available on the poliMedia video lecture repository, which currently holds over 10,000 video objects. Simply by post-editing automatic transcriptions in the conventional way, users almost halved the time that would be required to generate the transcription from scratch. As expected, this study revealed that the time spent by lecturers reviewing automatic transcriptions correlated directly with the accuracy of those transcriptions. It is also shown, however, that the average time required to perform each individual editing operation could be precisely derived and applied in the definition of a user model. In addition, the second phase of this study presents a transcription review strategy based on confidence measures (CM) and compares it to the conventional post-editing strategy. Finally, a third strategy, combining the CM-based approach with massive adaptation techniques for ASR, improved transcription review efficiency with respect to the two aforementioned strategies.
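The user model mentioned above relates total review time to the individual editing operations performed. As a minimal illustrative sketch only, the snippet below shows one way such a linear model could be expressed; the operation categories and timing constants are hypothetical and are not taken from the study.

```python
# Hypothetical linear user model of transcription review time.
# Assumption: total review time is approximated as a weighted sum of the
# edit operations performed plus a per-word listening/reading cost.

# Illustrative constants (seconds per operation); NOT values from the paper.
COST = {
    "insertion": 6.0,
    "deletion": 2.5,
    "substitution": 5.0,
    "per_word_read": 0.4,
}

def predicted_review_time(n_ins: int, n_del: int, n_sub: int, n_words: int) -> float:
    """Return the predicted review time in seconds for one transcription."""
    return (COST["insertion"] * n_ins
            + COST["deletion"] * n_del
            + COST["substitution"] * n_sub
            + COST["per_word_read"] * n_words)

# Example: a 1,000-word transcription with 20 insertions, 15 deletions
# and 40 substitutions -> about 757 seconds of predicted review time.
print(predicted_review_time(20, 15, 40, 1000))
```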
Abstract. In this paper we present the integration of a state-of-the-art ASR system into the Opencast Matterhorn platform, a free, open-source platform to support the management of educational audio and video content. The ASR system was trained on a novel large speech corpus, known as poliMedia, that was manually transcribed for the European project transLectures. This novel corpus contains more than 115 hours of transcribed speech that will be made available to the research community. Initial results on the poliMedia corpus are also reported in order to compare the performance of different ASR systems based on the linear interpolation of language models. To this end, the in-domain poliMedia corpus was linearly interpolated with an external large-vocabulary dataset, the well-known Google N-Gram corpus. The WER figures reported show a notable improvement over the baseline performance as a result of incorporating the vast amount of data represented by the Google N-Gram corpus.
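Linear interpolation of language models, as used above to mix the in-domain poliMedia data with the Google N-Gram corpus, combines the probabilities of two models with a tunable weight. The sketch below is a toy illustration under that assumption; the probabilities and the interpolation weight are invented for the example and do not come from the paper.

```python
# Minimal sketch of linear interpolation of two n-gram language models:
#   P(w | h) = lam * P_in(w | h) + (1 - lam) * P_out(w | h)
# where P_in is the in-domain model (e.g. poliMedia) and P_out the
# out-of-domain model (e.g. Google N-Gram). Toy values, not from the paper.

def interpolate(p_in_domain: float, p_out_domain: float, lam: float) -> float:
    """Combine two word probabilities with interpolation weight lam."""
    return lam * p_in_domain + (1.0 - lam) * p_out_domain

# Example: lam would normally be tuned on a development set,
# typically by minimising perplexity.
print(interpolate(0.012, 0.003, 0.7))  # 0.0093
```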
Abstract. Video lectures are a valuable educational tool in higher education to support or replace face-to-face lectures in active learning strategies. In 2007 the Universitat Politècnica de València (UPV) implemented its video lecture capture system, resulting in a high-quality educational video repository, called poliMedia, with more than 10,000 mini lectures created by 1,373 lecturers. In addition, in the framework of the European project transLectures, UPV has automatically generated transcriptions and translations in Spanish, Catalan and English for all videos included in the poliMedia video repository. transLectures responds to the widely recognised need for subtitles to be provided with video lectures, as an essential service for non-native speakers and hearing-impaired persons, and to enable advanced repository functionalities. Although high-quality automatic transcriptions and translations were generated in transLectures, they were not error-free. For this reason, lecturers need to manually review video subtitles to guarantee the absence of errors. The aim of this study is to evaluate the efficiency of reviewing automatic subtitles in comparison with the conventional generation of video subtitles from scratch. The reported results clearly indicate the convenience of providing automatic subtitles as a first step in the generation of video subtitles, with significant time savings of up to almost 75% when reviewing subtitles.
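To make the reported saving concrete, the figure of almost 75% compares the effort of reviewing automatic subtitles with that of producing them from scratch. The snippet below shows how such a percentage is computed from real-time factors (minutes of work per minute of video); the RTF values used are hypothetical and serve only to illustrate the calculation.

```python
# Illustrative computation of review-time savings relative to subtitling
# from scratch. The real-time factors below are hypothetical examples,
# not measurements from the study.

rtf_from_scratch = 10.0  # e.g. 10 minutes of work per minute of video
rtf_review = 2.6         # e.g. 2.6 minutes of post-editing per minute of video

savings = 100.0 * (1.0 - rtf_review / rtf_from_scratch)
print(f"Time savings: {savings:.0f}%")  # Time savings: 74%
```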
Educational videos are a widely accepted tool in higher education, integrated into active and innovative teaching methodologies and giving rise to platforms such as poliMedia, the Universitat Politècnica de València (UPV) platform for the creation, publication and dissemination of this type of multimedia content. In the framework of the European project transLectures, the UPV has automatically generated transcriptions and translations in Spanish, Catalan and English for all videos included in the poliMedia repository. The automatic transcriptions and translations generated are of high quality. However, they contain errors inherent to the automatic subtitling process employed, so a subsequent manual review may sometimes be necessary to guarantee the absence of errors. The aim of this work is to evaluate this manual review process in order to compare it, in terms of time cost, with a fully manual subtitling process. To that end, in the framework of the UPV's Docencia en Red 2013-2014 grants, we launched a campaign to evaluate the review of automatic transcriptions and translations by lecturers; its results unequivocally indicate the effectiveness of these techniques and the significant time savings they provide.
Usage of educational videos in higher education