
Call for Integrated Data Quality & Assurance Approach

July 9, 2014

Guest post by David Boone, MEASURE Evaluation Epidemiologist 

Data quality assurance was a hot topic at the Technical Consultation meeting on Monitoring Results with Health Facility Data organized by USAID, WHO, and the University of Oslo in Glion-sur-Montreux, Switzerland from 11-12 June 2014.

In the past 10-12 years, much emphasis has been placed on assuring the quality of data reported from health facilities to the national level for program planning, monitoring, and evaluation. Whether for national health management information systems (HMIS) or for donor-funded projects like PEPFAR and The Global Fund, data quality assurance tools and methods have been developed and widely used to measure data quality (in terms of accuracy, timeliness, and completeness of data) and to identify gaps in reporting systems that inhibit the production of quality data.

Existing tools include the GAVI Immunization Data Quality Audit (IDQA), the WHO IVB Data Quality Self-Assessment (DQS) for immunization, the Global Fund/MEASURE Evaluation Data Quality Audit (DQA) for HIV, TB, and malaria indicators, and the Routine Data Quality Assessment Tool, an indicator-generic, capacity-building and self-assessment version of the DQA. The MEASURE Evaluation PRISM methodology also measures data quality (as a component of Routine Health Information System (RHIS) performance) in a similar fashion. Each tool focuses on specific diseases or reporting systems, though they all measure accuracy, timeliness, and completeness in the same way.

The ad hoc implementation of these tools in country by different disease programs and donors has often resulted in overlapping and redundant assessments that burden health workers and waste precious public health resources.

WHO and its partners, GAVI and the Global Fund, are promoting a more integrated and holistic approach for assessing and assuring good data quality. The proposed methodology includes an annual desk review of data quality that examines the following (a rough code sketch of such checks appears after the list):

  1. reporting timeliness and completeness, completeness of indicator data (i.e. identification of missing data)
  2. internal consistency of reported data (identification of outliers, evaluation of trends, etc.)
  3. external consistency of reported data (i.e. comparisons of routinely reported data to survey estimates)
  4. consistency of population data (i.e. the denominators used to calculate outcome indicators)
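
The meeting did not prescribe specific tooling, so the sketch below is purely illustrative: it shows how the four desk-review areas could be expressed as simple automated checks over a hypothetical pandas DataFrame of monthly facility reports. The column names, thresholds, and helper functions are assumptions for illustration, not part of the proposed WHO methodology.

```python
# Illustrative desk-review checks (assumed data layout, not an official tool).
import pandas as pd

def reporting_completeness(df, expected_reports):
    """1. Share of expected monthly reports received, and share received on time."""
    received = df["report_received"].sum()
    on_time = df["report_on_time"].sum()
    return received / expected_reports, on_time / expected_reports

def flag_outliers(series, k=3.0):
    """2. Internal consistency: flag values more than k median absolute
    deviations from the median (a simple, robust outlier rule)."""
    med = series.median()
    mad = (series - med).abs().median()
    if mad == 0:
        return series != med
    return (series - med).abs() / mad > k

def external_consistency(routine_coverage, survey_coverage):
    """3. Ratio of routine (HMIS) coverage to the survey estimate;
    values far from 1.0 suggest a problem with the numerator or denominator."""
    return routine_coverage / survey_coverage

def denominator_consistency(pop_this_year, pop_last_year, expected_growth=0.03):
    """4. Year-on-year growth of the population denominator compared with
    an expected national growth rate."""
    observed_growth = pop_this_year / pop_last_year - 1
    return observed_growth, abs(observed_growth - expected_growth)

# Hypothetical usage with made-up numbers (one row per facility-month):
reports = pd.DataFrame({
    "report_received": [1, 1, 0, 1],
    "report_on_time":  [1, 0, 0, 1],
    "anc1_visits":     [120, 115, 0, 480],
})
print(reporting_completeness(reports, expected_reports=4))          # (0.75, 0.5)
print(flag_outliers(reports["anc1_visits"]).tolist())               # flags the 480
print(external_consistency(routine_coverage=0.95, survey_coverage=0.78))
print(denominator_consistency(pop_this_year=103_000, pop_last_year=100_000))
```

An external consistency ratio well above or below 1.0, for example, would prompt a closer look at both the facility reports feeding the numerator and the population projections used as the denominator.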

This national desk review would be combined with data verification conducted at health facilities as part of a health facility survey, such as the Service Availability and Readiness Assessment (SARA). Health facility surveys of this type generally use samples of health facilities much larger than those used for data quality assessments and therefore would provide more robust estimates of data quality.
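
Data verification in DQA-style tools generally means recounting an indicator from facility source documents (registers, tally sheets) and comparing that recount with the value the facility reported upward. The snippet below is a minimal, hypothetical illustration of that comparison, not the SARA protocol itself; the example numbers are invented.

```python
def verification_factor(recounted, reported):
    """Ratio of the value recounted from source documents to the reported value.
    ~1.0 = consistent; <1.0 suggests over-reporting; >1.0 suggests under-reporting."""
    if reported == 0:
        return None  # cannot compute a ratio against a zero report
    return recounted / reported

# Example: a facility reported 120 ANC first visits, but the register shows 108.
print(verification_factor(108, 120))  # 0.9: the register supports 90% of the reported figure
```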

The results of data verification and the national-level desk review would feed into regular health planning events (e.g. the Health Sector Review) and provide important information on the adequacy of the data used for planning.

The proposed integrated data quality assessment, which combines health facility data verification with national-level analytics and links to the health sector planning process, was well received by the meeting participants. Stay tuned for more information on this promising approach as the methods and tools are finalized.
