Developing and Usability Testing an Internal Data Request System: Improving Epidemiology and Informatics Infrastructure in a Health Department

Wednesday, June 17, 2015: 11:00 AM
Back Bay A, Sheraton Hotel
Kelly A Gerard, Denver Public Health, Denver, CO
Arthur Davidson, Denver Public Health, Denver, CO
Heather Weir, Denver Public Health, Denver, CO

BACKGROUND: Health departments (HDs) face a plethora of data sources (e.g., surveys, vital statistics, reporting systems, census data, registries, and most recently electronic health record [EHR] data sharing). Expectations for analyzing these data are expanding. Epidemiologists and informaticians (E&I) receive and work to complete data requests. Inefficiency and waste were observed in managing requests: duplication of effort, lost time, lost knowledge, and ambiguous standard processes. This quality improvement (QI) effort addressed those inefficiencies.

METHODS: A local HD developed a QI project using Lean methodology to examine data management and processing problems. Areas of waste and opportunities for improvement were identified, including through a root cause analysis. Once a standard process was defined, a data request system, hosted in SharePoint 2013, was developed and usability tested. Six participants playing the role of requestor tested three scenarios, scoring each problem encountered as serious or minor: 1) looking up information in the data sources list; 2) completing the request form; 3) tracking the status of, and making changes to, an item in the request list. E&I did not test using scenarios; instead, to test the data request system, E&I claimed a data source in the data sources list and updated the status and comments for one request in the request list.

RESULTS: Eight users reported 11 serious problems (frustrated and gave up) and 12 minor problems (annoyed but able to complete the task). Scenario 1: users testing the data sources list identified 6 serious and 5 minor problems; all 6 users provided the data source contact name. Scenario 2: 6 users tested the request form and identified 2 minor problems; only 4 users completed the form with all of the available information provided. Scenario 3: users identified 5 serious and 5 minor problems; 6 users (4 requestors, 2 analysts) were able to update an existing request.

CONCLUSIONS: Duplication of work occurs when two or more analysts respond to the same requestor. Scenario 1 findings demonstrate that this inefficiency can be reduced with a formal data request system. Incomplete request information delayed progress on requests. Scenario 2 revealed that users' performance and preferences do not always match; usability testing should therefore include both subjective and objective metrics. Scenario 3 outcome metrics highlighted the importance of usability testing: users identified 5 serious problems that could reduce usability. Scenario-based usability testing of a system for sharing knowledge online was effective in identifying problems and areas for improvement; this informatics QI approach is worthy of review by other HDs.