Since 2010, the WHO has recommended that clinical decision-making for malaria case management be based on the results of a parasitological test. Between 2015 and 2017, the U.S. President's Malaria Initiative-funded MalariaCare project supported the implementation of this practice in eight sub-Saharan African countries through 5,382 outreach training and supportive supervision visits to 3,563 health facilities. During these visits, trained government supervisors used a 25-point checklist to observe clinicians' performance in outpatient departments and then provided structured mentoring and action planning. At baseline, more than 90% of facilities demonstrated a good understanding of the WHO recommendations: when tests should be ordered, how to use test results to reach an accurate final diagnosis, how to assess severity, and how to provide the correct prescription. However, significant deficits were found in history taking, conducting a physical examination, and communicating with patients and their caregivers. After three visits, worker performance showed steady improvement, particularly in checking for factors associated with increased morbidity and mortality: at least one sign of severe malaria (72.9% to 85.5%), pregnancy (81.1% to 87.4%), and anemia (77.2% to 86.4%). A regression analysis predicted an overall improvement in clinical performance of 6.3% (P < 0.001) by the third visit. These findings indicate that most health facilities have good baseline knowledge of the processes of quality clinical management, but that further training and on-site mentoring are needed to improve the parts of the clinical interaction that involve second-order decision-making, such as assessing severity of illness, managing non-malarial fever, and completing the patient-provider communication loop.
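To illustrate how a per-visit change of the kind summarized by such a regression might be estimated, the sketch below fits an ordinary least squares regression of facility-level checklist scores on visit number. This is a generic sketch, not the project's actual statistical model: the data frame, column names, and score values are all hypothetical.

```python
# Illustrative sketch only: regress facility checklist scores on visit number.
# Column names and data are hypothetical, not MalariaCare's actual dataset.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical facility-level data: one row per facility per OTSS visit.
df = pd.DataFrame({
    "facility_id": ["A", "A", "A", "B", "B", "B"],
    "visit":       [1, 2, 3, 1, 2, 3],
    "score":       [0.78, 0.82, 0.85, 0.74, 0.79, 0.80],  # share of checklist steps done correctly
})

# Simple OLS: the coefficient on "visit" approximates the average per-visit change in score.
model = smf.ols("score ~ visit", data=df).fit()
print(model.summary())
```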
Between 2012 and 2017, the U.S. President's Malaria Initiative-funded MalariaCare project supported national malaria control programs in sub-Saharan Africa to implement a case management quality assurance (QA) system for malaria and other febrile illnesses. A major component of the system was outreach training and supportive supervision (OTSS), whereby trained government health personnel visited health facilities to observe health-care practices using a standard checklist, provide individualized feedback to staff, and develop facility-wide action plans based on observation and review of facility registers. Based on MalariaCare's experience facilitating visits to more than 5,600 health facilities in nine countries, we found that programs seeking to implement similar supportive supervision schemes should consider the following: 1) develop a practical checklist that balances information gathering and mentorship; 2) establish basic competency criteria for supervisors and periodically assess supervisor performance in the field; 3) conduct both technical skills training and supervision skills training; 4) establish criteria for selecting facilities for OTSS and determine the appropriate frequency of visits; and 5) use electronic data collection systems where possible. Cost is also a significant consideration: the average cost per OTSS visit ranged from $44 to $333, with much of the variation due to factors such as travel time, allowances for government personnel, length of the visit, and involvement of central-level officials. Because the cost of conducting supportive supervision prohibits regularly visiting all health facilities, internal QA measures could also be considered as alternative or complementary activities to supportive supervision.
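As a hedged illustration of how a per-visit cost in that range might be assembled from the cost drivers mentioned above, the sketch below sums hypothetical components (supervisor allowances, transport, and an optional central-level official fee). All category names and amounts are invented for illustration and do not represent MalariaCare's actual cost structure.

```python
# Hypothetical breakdown of the cost of one OTSS visit.
# All component names and amounts are invented for illustration;
# they are not MalariaCare's actual cost categories or values.

def otss_visit_cost(travel_days, supervisors, daily_allowance,
                    vehicle_cost_per_day, central_official_fee=0.0):
    """Return an illustrative total cost (USD) for a single OTSS visit."""
    allowances = travel_days * supervisors * daily_allowance
    transport = travel_days * vehicle_cost_per_day
    return allowances + transport + central_official_fee

# A short nearby visit vs. a longer visit involving a central-level official.
print(otss_visit_cost(travel_days=1, supervisors=2, daily_allowance=12, vehicle_cost_per_day=20))
print(otss_visit_cost(travel_days=3, supervisors=2, daily_allowance=20, vehicle_cost_per_day=35,
                      central_official_fee=60))
```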
Although on-site supervision programs are implemented in many countries to assess and improve the quality of care, few publications have described the use of electronic tools during health facility supervision. The President's Malaria Initiative-funded MalariaCare project developed the MalariaCare Electronic Data System (EDS), a custom-built, opensource, Java-based, Android application that links to District Health Information Software 2, for data storage and visualization. The EDS was used during supervision visits at 4,951 health facilities across seven countries in Africa. The introduction of the EDS led to dramatic improvements in both completeness and timeliness of data on the quality of care provided for febrile patients. The EDS improved data completeness by 47 percentage points (42-89%) on average when compared with paper-based data collection. The average time from data submission to a final data analysis product dropped from over 5 months to 1 month. With more complete and timely data available, the Ministry of Health and the National Malaria Control Program (NMCP) staff could more effectively plan corrective actions and promptly allocate resources, ultimately leading to several improvements in the quality of malaria case management. Although government staff used supervision data during MalariaCare-supported lessons learned workshops to develop plans that led to improvements in quality of care, data use outside of these workshops has been limited. Additional efforts are required to institutionalize the use of supervision data within ministries of health and NMCPs.
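The abstract does not describe the EDS internals, but DHIS2 exposes a standard Web API for importing aggregated data (the dataValueSets endpoint). The sketch below shows a minimal, hypothetical submission of facility supervision results to a DHIS2 instance; the server URL, credentials, and all dataSet, orgUnit, and dataElement identifiers are placeholders, and this is not a description of how the EDS itself is implemented.

```python
# Minimal sketch of posting supervision data values to a DHIS2 instance
# via its standard Web API (POST /api/dataValueSets). The URL, credentials,
# and all identifiers below are placeholders, not the MalariaCare EDS configuration.
import requests

DHIS2_URL = "https://dhis2.example.org/api/dataValueSets"  # hypothetical server

payload = {
    "dataSet": "otssChecklistDS",        # hypothetical data set identifier
    "period": "201706",                  # reporting period (June 2017)
    "orgUnit": "healthFacility001",      # hypothetical facility identifier
    "dataValues": [
        {"dataElement": "facilityScorePct", "value": "85"},  # hypothetical data elements
        {"dataElement": "hcwObserved", "value": "4"},
    ],
}

resp = requests.post(DHIS2_URL, json=payload, auth=("username", "password"), timeout=30)
resp.raise_for_status()
print(resp.json().get("status"))
```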
Rapid diagnostic tests (RDTs) are one of the primary tools used for parasitological confirmation of suspected cases of malaria. To ensure accurate results, health-care workers (HCWs) must perform the RDT correctly. Between 2015 and 2017, trained supervisors visited 3,603 facilities in eight African countries to assess RDT performance and conduct outreach training and supportive supervision activities, using a 12-point checklist to determine whether key steps were being performed. The proportion of HCWs performing each step correctly improved by between 1.1 and 21.0 percentage points between the first and third visits. HCW scores were averaged to calculate facility scores, which were found to be high: the average score across all facilities was 85% during the first visit and increased to 91% during the third visit. A regression analysis of these facility scores estimated that, holding key facility factors constant, facility performance improved by 5.3 percentage points from the first to the second visit (P < 0.001) but by only 0.6 percentage points (P = 0.10) between the second and third visits. Factors strongly associated with higher scores included the presence of a laboratory worker at the facility and the presence of at least one staff member with previous formal training in malaria RDTs. These findings confirm that a comprehensive quality assurance system of training and supportive supervision consistently, and often significantly, improves RDT performance.
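As a hedged sketch of the scoring approach described (a facility score computed as the average of the observed HCWs' checklist scores), the code below averages hypothetical per-worker results against a 12-step checklist; the observation data are invented for illustration.

```python
# Illustrative facility-score calculation for RDT performance.
# Each health-care worker (HCW) is observed against a 12-step checklist;
# the facility score is the mean of the HCW scores. Data are hypothetical.

def hcw_score(steps_correct: list[bool]) -> float:
    """Fraction of checklist steps performed correctly by one HCW."""
    return sum(steps_correct) / len(steps_correct)

def facility_score(observations: list[list[bool]]) -> float:
    """Average the HCW scores observed at one facility during a visit."""
    return sum(hcw_score(o) for o in observations) / len(observations)

# Two HCWs observed against a 12-step checklist (True = step done correctly).
visit_observations = [
    [True] * 10 + [False] * 2,   # HCW 1: 10/12 steps correct
    [True] * 11 + [False],       # HCW 2: 11/12 steps correct
]
print(round(facility_score(visit_observations), 3))  # ~0.875
```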
Although light microscopy is the reference standard for diagnosing malaria, maintaining skills over time can be challenging. Between 2015 and 2017, the U.S. President's Malaria Initiative-funded MalariaCare project supported outreach training and supportive supervision (OTSS) visits at 1,037 health facilities in seven African countries to improve performance in microscopy slide preparation, staining, and reading. During these visits, supervisors observed and provided feedback to health-care workers (HCWs) performing malaria microscopy using a 30-step checklist. Of the steps observed in facilities with at least three visits, the proportion of HCWs who performed each step correctly at baseline ranged from 63.2% to 94.2%, and the change in the proportion of HCWs performing steps correctly by the third visit ranged from 16.7 to 23.6 percentage points (n = 916 observations). To assess overall improvement, facility scores were calculated from the steps performed correctly during each visit, with "minimum standard" checklist steps weighted twice as much as the other steps. The mean score at baseline was 85.7%, demonstrating a high level of performance before OTSS. Regression analysis predicted an improvement in facility scores of 3.6 percentage points (P < 0.001) after three visits across all countries. In reference-level facilities with consistently high performance on microscopy procedures and parasite detection, quality assurance (QA) mechanisms could prioritize more advanced skills, such as proficiency testing for parasite counting and species identification. However, in settings with high staff turnover and declining use of microscopy in favor of rapid diagnostic tests, additional supervision visits and/or additional QA measures may be required to improve and maintain performance.
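A hedged sketch of that weighted scoring, under the stated convention that "minimum standard" steps count twice as much as the other steps, is shown below; the step labels, the choice of which steps are "minimum standard," and the observation data are all invented for illustration.

```python
# Illustrative weighted microscopy score: "minimum standard" checklist steps
# carry double weight relative to the other steps. Step labels and the
# observation data below are hypothetical.

def weighted_score(steps_correct: dict[str, bool], minimum_standard: set[str]) -> float:
    """Weighted fraction of checklist steps performed correctly."""
    total = 0.0
    earned = 0.0
    for step, correct in steps_correct.items():
        weight = 2.0 if step in minimum_standard else 1.0
        total += weight
        if correct:
            earned += weight
    return earned / total

observation = {
    "label_slide": True,
    "prepare_thick_film": True,
    "stain_slide": False,
    "read_100_fields": True,
    "record_result": True,
}
minimum_standard_steps = {"prepare_thick_film", "stain_slide"}  # hypothetical subset
print(round(weighted_score(observation, minimum_standard_steps), 2))  # 5/7 ≈ 0.71
```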