Background: Periodic demographic health surveillance and surveys are the main sources of health information in developing countries. Conducting a survey requires extensive paper-and-pen manual work and lengthy processes to generate the required information. Despite the rising popularity of electronic data collection systems to alleviate these problems, sufficient evidence is not available to support the use of electronic data capture (EDC) tools in interviewer-administered data collection processes.

Objective: This study aimed to compare data quality parameters in the data collected using mobile electronic and standard paper-based data capture tools in one of the health and demographic surveillance sites in northwest Ethiopia.

Methods: A randomized controlled crossover health care information technology evaluation was conducted from May 10, 2016, to June 3, 2016, in a demographic and surveillance site. A total of 12 interviewers, working in pairs (one with a tablet computer and the other with a paper-based questionnaire) in 6 groups, were assigned to the 6 towns of the surveillance premises. Data collectors switched the data collection method based on a computer-generated random order. Data were cleaned using a MySQL program and transferred to SPSS (IBM SPSS Statistics for Windows, Version 24.0) and R statistical software (R version 3.4.3, the R Foundation for Statistical Computing Platform) for analysis. Descriptive and mixed ordinal logistic analyses were employed. The qualitative interview audio records from the system users were transcribed, coded, categorized, and linked to the International Organization for Standardization 9241-part 10 dialogue principles for system usability. The usability of this open data kit–based system was assessed using the quantitative System Usability Scale (SUS) and matching of qualitative data with the isometric dialogue principles.
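The crossover design described above, in which each pair of data collectors swaps tools in a computer-generated random order, can be sketched as follows. This is a minimal illustration only; the group labels, seed, and function name are assumptions, not details taken from the study.

```python
import random

def crossover_orders(n_groups, seed=None):
    """Generate a random tool order (which tool goes first) for each group.

    Each of the n_groups interviewer pairs has one EDC (tablet) and one
    PPDC (paper) collector; the pair swaps tools between periods, so the
    only randomized choice is which tool is used in the first period.
    """
    rng = random.Random(seed)
    orders = []
    for _ in range(n_groups):
        first = rng.choice(["EDC", "PPDC"])
        second = "PPDC" if first == "EDC" else "EDC"
        orders.append((first, second))
    return orders

# 6 groups, one per town of the surveillance site
for town, (p1, p2) in enumerate(crossover_orders(6, seed=42), start=1):
    print(f"Town {town}: period 1 = {p1}, period 2 = {p2}")
```

Seeding the generator makes the allocation sequence reproducible, which is useful when the randomization must be documented for the study protocol.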
Results: Of the 1246 complete questionnaire records submitted with each tool, 41.89% (522/1246) of the paper and pen data capture (PPDC) questionnaires and 30.89% (385/1246) of the EDC questionnaires had one or more types of data quality errors. The overall error rates were 1.67% and 0.60% for PPDC and EDC, respectively. The odds of more errors on the PPDC tool were multiplied by 1.015 for each additional question in the interview compared with EDC. The SUS score of the data collectors was 85.6. In the qualitative data response mapping, EDC had more positive suitability-of-task responses with few error tolerance characteristics.

Conclusions: EDC yielded significantly better data quality and efficiency compared with PPDC, explained by fewer errors, instant data submission, and easy handling. The EDC proved to be a usable data collection tool in the rural study setting. Before a full-fledged implementation of the system, the implementing organization needs to consider a consistent power source, decent internet connection, standby technical support, and security assurance for the device users.
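The results report a SUS score of 85.6. For readers unfamiliar with the scale, a minimal sketch of the standard SUS scoring formula (Brooke's published method, not code from the study) is shown below; the example responses are purely illustrative assumptions, not the study's raw data.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from 10 item responses (1-5).

    Standard SUS scoring: odd-numbered (positively worded) items
    contribute (score - 1), even-numbered (negatively worded) items
    contribute (5 - score); the sum is scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative respondent only:
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # → 90.0
```

A score above roughly 68 is conventionally read as above-average usability, which is why the reported 85.6 supports the conclusion that the EDC tool was usable in this setting.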