201 An Evaluation of Outbreak Surveillance in North Carolina

Monday, June 15, 2015: 3:30 PM-4:00 PM
Exhibit Hall A, Hynes Convention Center
Heather R Dubendris, North Carolina Department of Health and Human Services, Raleigh, NC
Jean-Marie Maillard, North Carolina Department of Health and Human Services, Raleigh, NC
Jennifer K MacFarquhar, Centers for Disease Control and Prevention, Raleigh, NC
Zack Moore, North Carolina Department of Health and Human Services, Raleigh, NC

BACKGROUND: Describing and tracking the occurrence of communicable disease outbreak events provides data to inform outbreak response and control recommendations. Outbreak surveillance also provides a public record of investigations and ensures that outbreak data can be readily shared with national outbreak reporting systems. The North Carolina Division of Public Health (NC DPH) has maintained an outbreak surveillance system since 2011, collecting a standard set of information for each communicable disease outbreak and entering it into an aggregate database. We evaluated this system to assess its usefulness and identify areas for improvement.

METHODS: Key attributes of the system were assessed using CDC’s Updated Guidelines for Evaluating Public Health Surveillance Systems. We interviewed NC DPH ‘on-call’ epidemiologists using a 20-question online survey tool. We assessed 2012–2014 aggregate outbreak event data and the corresponding outbreak report forms for timeliness and completeness.
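For illustration, the completeness and timeliness measures described above could be computed from the aggregate outbreak database with a short script such as the sketch below. The file name and column names (e.g., initial_report_date, final_report_date) are assumptions for illustration only; the actual database structure may differ.

import pandas as pd

# Load aggregate outbreak event data (file and column names are assumptions).
events = pd.read_csv(
    "outbreak_events_2012_2014.csv",
    parse_dates=["initial_report_date", "final_report_date"],
)

# Completeness: percentage of non-missing values for each field.
completeness_by_field = events.notna().mean() * 100

# Timeliness: days from initial report to receipt of the final outbreak
# report form, summarized overall and by year of initial report.
events["days_to_final_report"] = (
    events["final_report_date"] - events["initial_report_date"]
).dt.days
overall_timeliness = events["days_to_final_report"].agg(["mean", "median"])
median_by_year = events.groupby(
    events["initial_report_date"].dt.year
)["days_to_final_report"].median()

print(completeness_by_field.round(1))
print(overall_timeliness)
print(median_by_year)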

RESULTS: The system was viewed as useful. Fifteen of 16 (94%) respondents indicated that outbreak surveillance contributes to prevention and control of diseases. Fourteen of 16 (88%) felt the system was easy to use, but only 3 of 16 (19%) indicated that the process was well defined. Acceptability was low; only 2 (13%) respondents reported satisfaction with the system. Data completeness for fields within the aggregate outbreak database ranged from 76% to 100% by event. Data were entered at different stages of the outbreaks, most commonly when the outbreak was first reported by the local health department. Most (63%) on-call staff entered outbreaks into the system only occasionally. During 2012–2014, a mean of 79 days (median = 30) elapsed between the initial report to NC DPH (typically by phone) and receipt of the local health department’s final report (submitted on the standard outbreak report form). Timeliness improved after 2012, with the median interval between initial report and final report decreasing from 33 to 27 days. Staff emphasized the importance of having a centralized source of outbreak information and were interested in improvements that would allow routine analysis of aggregate outbreak data.

CONCLUSIONS: Although the usefulness of this system is high, key areas identified for improvement include data completeness, timeliness, and analytic capacity. We recommend adding an “end of outbreak” date to the system to allow more accurate calculation of timeliness. Future development of an analytic data platform (e.g., Epi Info) would improve data quality and increase capacity for analysis of aggregate outbreak data.