Background Health service data from Health Management Information Systems are important for decision-making at all health system levels. However, data quality issues in low- and middle-income countries hamper data use. Smart Paper Technology, a novel digital-hybrid technology, was designed to overcome quality challenges through automated digitization. Here we assessed the impact of the novel system on data quality dimensions, metrics and indicators as proposed by the World Health Organization's Data Quality Review Toolkit.

Methods This cross-sectional study was conducted between November 2019 and October 2020 in 13 health facilities sampled from the 33 facilities of one district in rural Tanzania, where we implemented Smart Paper Technology. We assessed the technology's data quality for maternal health care against the standard District Health Information System-2 applied in Tanzania.

Results Smart Paper Technology performed slightly better than the District Health Information System-2 regarding consistency between related indicators and outliers: we found a <10% difference between related indicators for 62% of facilities for the new system versus 38% for the standard system in the reference year. However, Smart Paper Technology was inferior to the District Health Information System-2 in terms of completeness. Data on first antenatal care visits were >90% complete in only 76% of facilities for the new system, against 92% for the standard system. For the indicator of internal consistency over time, only 73%, 59% and 45% of the client numbers recorded in the standard system for antenatal, labour and postnatal care, respectively, were documented in the new system. Smart Paper Technology forms were submitted in 83% of the months across all service areas.

Conclusion Our results suggest that not all client encounters were documented in Smart Paper Technology, affecting data completeness and, in part, consistency.
The novel system could not leverage the opportunities of automated processes because primary documentation was poor. Low buy-in from policymakers and a lack of internal quality assurance may have affected the new system's data quality. We emphasize the importance of including policymakers in evaluation planning to co-design a data quality monitoring system and to agree on a realistic way to ensure reporting of routine health data to the national level.