Data Quality and Public Trust

I recently published a series of Maptitude maps of earthquakes. A couple of these mapped recent Oklahoma earthquakes against saltwater disposal wells.

Much of the public commentary has blamed these earthquakes on hydraulic fracturing (aka “fracking”) by the oil industry. However, the earthquakes do not correlate with the fracking in either time or space. Instead, it has been shown (Walsh & Zoback 2015) that the earthquakes are probably due to the deep disposal of ‘produced’ water. This is water that has been separated out from a well’s produced oil. It is generally very saline (comparable to the Dead Sea) and of course contains some hydrocarbons. Despite being “natural”, it would be toxic if it were disposed of in a water course, and the volumes are too great for it to be driven in tankers to an effluent treatment plant. Therefore the method of disposal preferred by authorities such as the EPA is to pump it back into the ground. Ideally it should go into a porous, permeable geological formation at or below the original source field (sometimes it is pumped directly into the original field in order to boost production). In the case of Oklahoma, a lot has been pumped into the Arbuckle Formation in the Lower Palaeozoic, which has ideal permeability and porosity characteristics. This has been going on for decades. Walsh & Zoback have correlated the recent increase in earthquakes with a large increase in disposal volumes over the last ten years.

The drilling companies are required to report their disposal well volumes to the Oklahoma Corporation Commission (OCC) Oil and Gas Division, which also makes the data available to the general public for download. This is important because researchers such as the Oklahoma Geological Survey and Walsh & Zoback can use it to determine the causes of the earthquakes and to inform public policy. In this case, it has exonerated the practice of “fracking” and is beginning to inform regulatory limits on injection volumes.

To produce the maps, I downloaded the disposal well data for 2014 from the OCC website (see above). This data was aggregated and cleaned up using a Python script – a process I describe here.
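As a rough sketch of what that aggregation step might look like (my illustration, not the original script; the filename and the API, LAT, LONG and TotalVolume column names are assumptions about the downloaded file, not the OCC’s actual field names), a pandas groupby collapses the monthly report rows into one record per well:

```python
# Sketch of the aggregation step - assumed column names, not the OCC's actual schema.
import pandas as pd

# One row per well per month of reported disposal (hypothetical filename).
rows = pd.read_csv("occ_disposal_2014.csv")

# Collapse the monthly rows into one record per well, summing reported volumes.
wells = (
    rows.groupby("API", as_index=False)
        .agg(lat=("LAT", "first"),
             lon=("LONG", "first"),
             total_volume=("TotalVolume", "sum"),
             months_reported=("TotalVolume", "count"))
)

print(f"{len(rows)} monthly rows aggregated into {len(wells)} wells")
```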

There were a lot of problems with the coordinates in the data. The script found 101,402 rows of data (one row per well per month in which it disposed of water), representing 6046 wells. Of these wells, 4897 had valid (or salvageable) coordinates and 1148 had bad coordinates. Of the ‘valid’ coordinates, 26 had incorrect longitude or latitude signs, placing the wells in the wrong hemisphere; these were easy to detect and correct. Of the 1148 bad records, many simply had zero or null coordinates. Others clearly had transcription errors – for example, one group of wells plotted in a line across North Africa at the same latitude as Oklahoma (i.e. their longitudes were incorrect). Totaled up, over 18% of the wells had incorrect coordinates! For my maps I simply ignored them; but if this information is provably wrong, can we trust the rest of the data?
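For illustration, here is a minimal sketch of the kind of coordinate checks described above: flip a dropped longitude or latitude sign back into Oklahoma, and reject zero, null or otherwise out-of-range positions (such as the “North Africa” wells). The bounding box values are approximate and the logic is mine, not the original script’s.

```python
# Approximate bounding box for Oklahoma (illustrative values only).
LAT_MIN, LAT_MAX = 33.6, 37.1
LON_MIN, LON_MAX = -103.1, -94.4

def classify(lat, lon):
    """Return (status, lat, lon) where status is 'ok', 'fixed' or 'bad'."""
    if lat is None or lon is None or lat == 0 or lon == 0:
        return "bad", lat, lon                 # zero / null coordinates
    # Normalise the signs (Oklahoma is north latitude, west longitude);
    # if that lands inside the state, the original signs were simply wrong.
    fixed_lat, fixed_lon = abs(lat), -abs(lon)
    if LAT_MIN <= fixed_lat <= LAT_MAX and LON_MIN <= fixed_lon <= LON_MAX:
        if (fixed_lat, fixed_lon) != (lat, lon):
            return "fixed", fixed_lat, fixed_lon
        return "ok", lat, lon
    return "bad", lat, lon                     # e.g. the "North Africa" longitudes

# A dropped longitude sign is recoverable; a mis-transcribed one is not.
print(classify(35.5, 97.2))   # ('fixed', 35.5, -97.2)
print(classify(35.5, 9.7))    # ('bad', 35.5, 9.7)
```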

This is especially a problem here because of the widespread public misconceptions about the earthquakes and the disposal wells. There is a strong tendency not to believe anything local government or the oil companies say on these issues. It is good that the OCC makes this data public, because it can help to correct these misconceptions, but such obvious mistakes only make it easier for people to dismiss it as “erroneous”. Therefore the reporting process needs to be strengthened and quality control improved if the public are expected to trust this data.

 
