While most Van Tuyl lecturers arrive with some amount of praise for their work, when Mines professor Dr. Kamini Singha opened by stating that the upcoming presenter “single-handedly changed the way we think about characterizing watersheds with geophysical methods,” she was understating the developments that would be discussed over the following hour. The speaker in question was Dr. Burke Minsley of the USGS in Denver. Minsley has been instrumental in bridging the worlds of geophysics and hydrology, and the result has been improved accuracy and modeling in both disciplines.
Minsley began with what might have seemed a simple question, one that would guide the whole hour: “Why do we do airborne geophysics?” he asked. To drive home the basic answer, he put up geophysical results from a ground survey, then compared them to an airborne survey. Where the ground survey took all day and yielded about a kilometer of data, the airborne survey was far broader, covering around 100 kilometers over the course of an hour. Of ground work, Minsley stated, “[it is] great information, but it doesn’t show the whole picture.”
Minsley got into the meat of airborne geophysics by starting simple, with a brief outline of the two main methods: frequency domain and time domain. Between the two, a section extending down around 100 meters into the subsurface can be obtained, with the intent of discerning structures of differing resistivity. Once the surveys are flown, data processing yields sections generally portrayed as red and blue beds and structures. Beyond that, it is the job of the interpreter to take a stab at determining what the differing resistivities mean.
Though Minsley has been part of countless studies, this presentation focused on work done in Nebraska, Alaska, and the San Luis Valley in Colorado. In the Nebraska case study, the intent was to better characterize the known aquifers and groundwater paths. “We used it to better understand how the system was connected,” Minsley added. Where normally a lot of time would be spent associating resistivities with different rock layers, many boreholes had already been drilled in the target area, adding considerable control to the models. As the models evolved, the borehole data matched what appeared in the geophysical surveys. “For the most part the correlation works out pretty well,” said Minsley. The surveys enabled new, highly accurate models, as well as the identification of 3.7 million acre-feet of aquifer that had not been seen before.
Where the Nebraska study was relatively easy thanks to the abundance of pre-existing data, the second case study covered entirely uncharacterized terrain. “In Nebraska we had thousands of boreholes to confine the geophysics, in Alaska we had one,” said Minsley. The primary objective of the Alaska studies was to better understand how groundwater behaves in a system with variable amounts of permafrost. Permafrost can serve as a barrier to groundwater flow, and as the climate changes, the extent of continuous permafrost is changing, so understanding how groundwater interacts with it can answer questions about flow patterns. The main goal of Minsley’s team was to turn the roughly 500,000 data points into something geologists and hydrologists could understand, something that could mean more than a few red and blue lines. That meant the team had to establish an interpretive resistivity scale so that meaning could be drawn directly from the geophysical data. In cases where two units looked the same in their resistivity profiles, it took a bit of multidisciplinary knowledge to determine which unit was actually present.
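The idea of an interpretive resistivity scale can be sketched as a simple lookup from resistivity ranges to likely units. The cutoff values and unit names below are illustrative assumptions, not the actual scale Minsley's team built:

```python
# Hypothetical interpretive scale: resistivity ranges (ohm-m) mapped to
# likely units. Permafrost tends to be highly resistive, so it sits at
# the top of the scale; the specific cutoffs here are made up.
SCALE = [
    (0.0, 30.0, "unfrozen clay / brackish water"),
    (30.0, 200.0, "unfrozen sand and gravel"),
    (200.0, float("inf"), "permafrost"),
]

def interpret(resistivity_ohm_m):
    """Label a single sounding value using the assumed scale."""
    for low, high, unit in SCALE:
        if low <= resistivity_ohm_m < high:
            return unit
    raise ValueError("resistivity must be non-negative")

# Each of the survey's ~500,000 data points could be labeled this way,
# turning a resistivity section directly into a geologic one.
print(interpret(500.0))  # prints "permafrost"
```

Where two units overlap in resistivity, a lookup like this is exactly where it breaks down, which is why the team needed multidisciplinary judgment on top of the scale.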
Along with helping to establish a thermal profile of the subsurface, one of the more profound results Minsley described centered on the river in the region. The Yukon River is known to migrate, a fact not originally considered in the survey. When Minsley dug into the data, the thermal profile in the subsurface displayed a strange tilt that could not be explained by local geology. As it turned out, the tilt represented permafrost re-establishing itself in the subsurface after the river had moved. By checking the gradients along with the timescale of the river’s shift, the team was able to confirm the rate at which permafrost forms during a cold period. “From a single geophysical snapshot, you see a preserved permafrost legacy,” Minsley proudly stated. The data from this survey is expected to help establish how groundwater migrates through the frozen subsurface.
Another part of Minsley’s talk focused on a different area of Alaska, with results more important to the geophysics side of the presentation. With the help of equations established by Mines professor Andre Revil, Minsley began to look at how the constraints of a permafrost system could better define geophysical parameters. In the second Alaska field area, a lake presented the team with some convenient constraints. Usually, where water sits on the surface, permafrost does not exist in the subsurface below it. This allows isolated areas of permafrost to exist beneath islands in the middle of rivers or lakes, and those islands can in turn be used to constrain models. A member of Minsley’s team used one such island to model several different scenarios; by picking the model that best fits the actual field data, that model can serve as a standard for the area.
The talk of models in Alaska led to the finale, an overview of new approaches to modeling in the San Luis Valley case study. “I’ve spent this whole time talking about red and blue images and how great they are,” began Minsley, “but they may not be the right answers.” The correct answers, as Minsley sees them, involve useful information, such as rock types and flow patterns, rather than raw geophysical data. One of the first steps, then, was to make the models more reliable. To do this, Minsley began running thousands of models, and instead of picking the best one, he opted to use them all. “We shouldn’t get too bogged down about one of the models,” revealed Minsley. For his research he used over 100,000 models and overlaid them to create a histogram. From this histogram, features such as rock interfaces could be extrapolated with a higher degree of certainty. The method also lets the modeler see where the data begins to lose certainty, which is highly important in disciplines such as hydrology.
The final case study emphasized this modeling technique with an example close to home for some Mines faculty. The San Luis Valley in southern Colorado is currently a fairly dry valley, but at some point in the past it hosted a large lake. The ancient lake has been established through geological work, but much of its morphology is hidden under the sand. By combining several thousand models, the layering in the valley became fairly apparent; then, by tying a lithology probability to the resistivities, the clay of the ancient lake began to stand out in the stratigraphy.
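Tying a lithology probability to resistivity might look something like the sketch below, where a synthetic resistivity section is converted to a clay-probability section. The logistic form, the 30 ohm-m clay/sand transition, and the synthetic data are all assumptions for illustration, not the actual relationship used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic resistivity section (ohm-m): rows are depth, columns are
# distance along the flight line. Lognormal values mimic the skewed
# distribution typical of resistivity data.
section = rng.lognormal(mean=3.0, sigma=1.0, size=(50, 200))

def p_clay(resistivity_ohm_m):
    """Assumed mapping: clays are conductive, so clay probability falls
    off as resistivity rises past an illustrative 30 ohm-m transition."""
    x = (np.log(resistivity_ohm_m) - np.log(30.0)) / 0.3
    return 1.0 / (1.0 + np.exp(x))

clay_probability = p_clay(section)

# Cells with high clay probability trace out candidate lake-bed deposits.
lakebed_mask = clay_probability > 0.9
print(f"{lakebed_mask.mean():.0%} of the section flagged as probable clay")
```

Applied to the combined model stack rather than a single section, a mapping like this is what lets an ancient lake's clay "stand out" as a coherent layer instead of an arbitrary color band.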