Predictive Modeling of Elevated Nitrate, Arsenic, and Uranium in Groundwater and Assessment of Potential Exposures Via Private Well Water

Monday, June 20, 2016: 5:00 PM
Tikahtnu E, Dena'ina Convention Center
Jim VanDerslice, University of Utah, Salt Lake City, UT
Johnni Daniel, CDC/National Center for Environmental Health, Chamblee, GA
Lorraine Backer, Centers for Disease Control and Prevention, Chamblee, GA
BACKGROUND:

Approximately 15% of the US population relies on private water wells that are not regulated under the Safe Drinking Water Act. However, few of these wells are tested except at the time of completion or at a change of ownership, and that testing is typically limited to bacteriological quality. This lack of data makes it difficult to target high-risk areas for well testing or for household water treatment programs. To meet this need, the Centers for Disease Control and Prevention and the University of Utah developed and evaluated predictive models to identify areas where private wells may have elevated levels of nitrate, arsenic, or uranium.

METHODS:

We compiled data on groundwater concentrations of nitrate (n=75,624), arsenic (n=70,769), and uranium (n=68,335) from wells in U.S. Geological Survey (USGS) and state databases. We developed a national model and region-specific predictive models using four methods: ordinary kriging, inverse-distance weighting, land use regression, and classification and regression tree (CART) analysis. For each model, we generated potential exposure profiles by spatially linking predicted groundwater concentrations with US Census data at the block level. We assessed model performance using split-sample validation for nine scenarios (three study sites for each of the three groundwater constituents). Cutoff values for model predictions indicating an elevated concentration were chosen to achieve approximately 85% sensitivity while maintaining specificity above 0.70.
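The abstract does not include the analysis code, but the split-sample validation and cutoff-selection step can be illustrated with a minimal Python sketch. The file name, column names (well_depth, soil_type_code, n_application_rate, nitrate_mg_l), and the use of a CART regressor as the example predictor are assumptions for illustration only, not the study's actual data or implementation:

    # Illustrative sketch only: hypothetical column names and a CART model
    # stand in for the study's data and four modeling approaches.
    import numpy as np
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor

    # Hypothetical well data: predictors plus measured nitrate concentration (mg/L).
    wells = pd.read_csv("wells_nitrate.csv")            # assumed file layout
    predictors = ["well_depth", "soil_type_code", "n_application_rate"]
    X, y = wells[predictors], wells["nitrate_mg_l"]

    # Split-sample validation: fit on one half, evaluate on the held-out half.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.5, random_state=0)
    cart = DecisionTreeRegressor(min_samples_leaf=50).fit(X_train, y_train)
    pred = cart.predict(X_test)

    # Choose the prediction cutoff for flagging an "elevated" well (>10 mg/L observed)
    # that gives roughly 85% sensitivity while keeping specificity above 0.70.
    elevated = (y_test > 10).to_numpy()
    best = None
    for cutoff in np.linspace(pred.min(), pred.max(), 200):
        flagged = pred >= cutoff
        sens = (flagged & elevated).sum() / elevated.sum()
        spec = (~flagged & ~elevated).sum() / (~elevated).sum()
        if sens >= 0.85 and spec > 0.70:
            best = (cutoff, sens, spec)
    print("selected cutoff, sensitivity, specificity:", best)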

RESULTS:

Overall, elevated concentrations of nitrate (>10 mg/L), arsenic (>10 µg/L), and uranium (>15 µg/L) occurred in 8.8%, 14.8%, and 12.1% of wells with available data, respectively. Predicted values for arsenic and uranium could be generated for approximately 80% of the areas where private well use was predicted. Groundwater chemistry, well depth, and rainfall minus potential evapotranspiration were the most important predictors of dissolved arsenic and uranium, while nitrate application, soil type, and well depth were the most important predictors of elevated nitrate. Sensitivities ranged from 0.78 to 0.88, and values of the area under the receiver operating characteristic curve (AUC) ranged from 0.75 to 0.90 across the four methods, indicating good predictive power. No single method was significantly better at predicting elevated levels.
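For reference, AUC values like those reported above can be computed from held-out predictions. The sketch below uses scikit-learn with placeholder labels and predicted concentrations, not the study's data; the method names and numbers are illustrative assumptions:

    # Minimal sketch of an AUC comparison across methods; values are placeholders.
    from sklearn.metrics import roc_auc_score

    # observed: 1 if the held-out well exceeded the elevated-concentration threshold.
    # predictions_by_method: continuous predicted concentrations from each approach.
    observed = [1, 0, 0, 1, 0, 1, 0, 0]                      # illustrative labels
    predictions_by_method = {
        "ordinary_kriging": [12.1, 3.2, 8.9, 14.0, 1.1, 9.8, 4.4, 6.0],
        "idw":              [11.5, 2.8, 9.5, 13.2, 0.9, 10.2, 5.1, 5.7],
    }
    for method, preds in predictions_by_method.items():
        print(method, "AUC =", round(roc_auc_score(observed, preds), 2))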

CONCLUSIONS:

All four methods produced models with good predictive power for large areas where private well use is expected. Predicted values and county maps with resolution down to the block level are currently available from the authors in digital format.