Soil moisture data analysis is often straightforward, but it can leave you scratching your head with more questions than answers. There’s no substitute for a little experience when looking at surprising soil moisture behavior.
Understand what’s happening at your site
METER soil scientist, Dr. Colin Campbell has spent nearly 20 years looking at problematic and surprising soil moisture data. In this 30-minute webinar, he discusses what to expect in different soil, environmental, and site situations and how to interpret that data effectively. Learn about:
Telltale sensor behavior in different soil types (coarse vs. fine, clay vs. sand)
Possible causes of smaller-than-expected changes in water content
Factors that may cause unexpected jumps and drops in the data
What happens to dielectric sensors when soil freezes and other odd phenomena
Surprising situations and how to interpret them
Undiagnosed problems that affect plant-available water or water movement
Why sensors in the same field or same profile don’t agree
My name is Colin Campbell, and I’m a research scientist here at METER Group. Today we’re going to spend some time doing a data deep dive. We’ll be looking at data coming from my research site on the Wasatch Plateau, at 10,000 feet (3,000 meters) in the middle of the state of Utah.
Right now, I’m interested in looking at the weather up on the plateau. As you can see from these graphs, I’m looking at the wind speeds out in the middle of three different meadows that are part of our experiment. At 10,000 feet right now, things are not that great. This is a picture I collected today. If you look closely, there’s an ATMOS 41 all-in-one weather station, which includes a rain gauge, and down here is our ZENTRA ZL6 logger. It’s obviously been snowing and blowing pretty hard, because we’ve got rime ice extending out from this post, probably 30 to 40 cm. This stick tells us how deep the snow is up on top.
One of the things we run into when we analyze data is credibility. One day someone came to me, really excited, and said, “At my research site, the wind speed is over 30 meters per second.” Now, 30 meters per second is an extremely strong wind. If it were really blowing that hard, there would be issues. For those of you who like English units, that’s over 60 miles an hour. So when you look at this data, you might get confused and think: Wow, the wind speed is really high up there. And from this picture, you also see the wind speed is very high.
But the instrument making those measurements is the ATMOS 41. It’s a three-season weather station, so you can’t use it in snow. It’s essentially producing an error here at 30 meters per second. So I’ll have to chop out data like these anemometer readings at the summit, where the weather station is often encrusted with snow and ice. When snow builds up on the sonic anemometer’s reflection surface, it sometimes simply estimates the wrong wind speed. And that’s what you’re seeing here.
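Screening out those error values before analysis amounts to a simple plausibility filter. Here is a minimal sketch; the 25 m/s threshold and the sample readings are illustrative assumptions, not values from the actual site:

```python
# A minimal sketch of the quality-control step described above: mask
# anemometer readings that are physically implausible for the site
# (e.g., the erroneous 30 m/s values produced when the sensor is iced over).
# The threshold and readings are illustrative, not real site data.

MAX_PLAUSIBLE_WIND = 25.0  # m/s; assumed site-specific cutoff

def clean_wind_speeds(readings):
    """Replace implausible wind-speed readings with None so they are
    excluded from downstream analysis instead of skewing it."""
    return [w if w is not None and 0.0 <= w <= MAX_PLAUSIBLE_WIND else None
            for w in readings]

# Example: normal readings interrupted by iced-sensor error values.
raw = [3.2, 4.1, 30.0, 30.0, 2.8]
print(clean_wind_speeds(raw))  # -> [3.2, 4.1, None, None, 2.8]
```

In practice you would keep the raw values and a quality flag rather than discarding data outright, so the record of the sensor fault is preserved.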
This is why it’s nice to have ZENTRA Cloud. It consistently helps me see if there’s a problem with one of my sensors. In this case, it’s an issue with my wind speed sensors. One of the other things I love about ZENTRA Cloud is the update it gives me about what’s going on at my site. Clearly, battery status is important because if the batteries run low, I may need to make a site visit to replace them. However, one of the coolest things about the ZL6 data logger is that if the batteries run low, it’s not a problem: even though it stops sending data over the cellular network, it keeps saving data with the battery power it has left, and it can keep going for several months.
I have a mix of data loggers up here: some older EM60G data loggers, which have a different voltage range than these four ZL6 data loggers. Three of these ZL6s are located in tree islands. In all of the tree islands, we’ve collected enough snow that the systems are buried and we’re not getting much solar charging. The one at the summit collects the most snow, and since late December, there’s been a slow decline in battery voltage. This graph shows the actual voltage on the batteries; the battery percentage is around 75%. The data loggers in the two other islands are also losing charge, but not as quickly. There, the snow is just about to the solar charger, so there’s some charging during the day and then a decrease at night.
So I have the data right at my fingertips to figure out if I need to make a site visit. Are these data important enough to make sure the data loggers call in every day? If so, then I can decide whether to send someone in to change batteries or dig the weather stations out of the snow.
I also have the option to set up target ranges on this graph to alert me when the battery voltage falls below an acceptable level. If I turn these on, it will send me an email if there’s a problem. So these are a couple of things I love about ZENTRA Cloud that help me experiment better, and I thought I’d share them with you today. If you have questions or want to get in contact with me, my email is [email protected]. Happy ZENTRA clouding.
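The target-range idea boils down to a threshold check on each logger’s latest reading. A minimal sketch follows, with assumed logger names, voltages, and an assumed 11.5 V lower bound; ZENTRA Cloud handles the actual alerting and email delivery as a built-in feature:

```python
# A minimal sketch of the target-range alert described above: compare the
# latest battery voltage from each logger against an acceptable lower bound
# and produce an alert message for any logger below it. The logger names,
# voltages, and the 11.5 V threshold are hypothetical.

MIN_BATTERY_V = 11.5  # assumed acceptable lower bound, volts

def battery_alerts(latest_voltages):
    """Return alert messages for loggers whose voltage is below target."""
    alerts = []
    for logger_id, volts in latest_voltages.items():
        if volts < MIN_BATTERY_V:
            alerts.append(f"{logger_id}: battery at {volts:.1f} V, "
                          f"below target of {MIN_BATTERY_V} V")
    return alerts

readings = {"summit": 11.2, "island-1": 12.4, "island-2": 12.1}
for msg in battery_alerts(readings):
    print(msg)  # in a real system this would trigger an email, not a print
```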
Dr. Yossi Osroosh, Precision Ag Engineer in the Department of Biological Systems Engineering at Washington State University, discusses where and why IoT fits into irrigation water management. In addition, he explores possible price, range, power, and infrastructure roadblocks.
Wireless sensor networks collect detailed data on plants in areas of the field that behave differently.
Studies show there is a potential for water savings of over 50% with sensor-based irrigation scheduling methods. Informed irrigation decisions require real-time data from networks of soil and weather sensors at the desired resolution and a reasonable cost. Wireless sensor networks can collect detailed data on plants in areas of the field that behave differently. The need for wireless sensors and actuators has led to the development of IoT (Internet of Things) solutions referred to as Low-Power Wide-Area Networking (LPWAN). IoT simply means wireless communication and connection to a data management system for further analysis. LPWAN technologies are intended to connect low-cost, low-power sensors to cloud-based services. Today, a wide range of wireless and IoT connectivity solutions are available, raising the question: which LPWAN technology best suits the application?
IoT Irrigation Management Scenarios
The following are scenarios for implementing IoT:
buying a sensor that is going to connect to a wireless network that you own (i.e., customer supplied like Wi-Fi, Bluetooth),
buying the infrastructure or at least pieces of it to install onsite (i.e., vendor managed LPWAN such as LoRaWAN, Symphony Link), and
relying on the infrastructure from a network operator LPWAN (e.g., LTE Cat-M1, NB-IoT, Sigfox, Ingenu, LoRaWAN).
This last scenario is how cellular IoT from network operators works. LPWAN technology fits well into agricultural settings where sensors need to send small amounts of data over a wide area while relying on batteries for many years. This distinguishes LPWAN from Bluetooth, ZigBee, or traditional cellular networks, which have limited range and higher power requirements. However, like any emerging technology, certain limitations still exist with LPWAN.
Individual weather and soil moisture sensor subscription fees in cellular IoT may add up and make it very expensive where many sensors are needed.
IoT Strengths and Limitations
The average data rate in cellular IoT can be 20 times faster than LoRa or Symphony Link, making it ideal for applications that require higher data rates. LTE Cat-M1 (aka LTE-M), for example, is like a Ferrari in terms of speed compared to other IoT technologies. At the same time, sensor data usage is the most important cost driver in cellular IoT. Individual sensor subscription fees may add up and make it very expensive where many sensors are needed. One way to control costs is to complement cellular IoT with existing wireless technologies like traditional cellular or ZigBee. A one-to-many architecture is a common approach to wireless communication and can save the most money: existing technologies like Bluetooth LE, WiFi, or ZigBee collect the in-field data, which is then transmitted in and out of the field through existing communication infrastructure like a traditional cellular network (e.g., 3G, 4G) or LAN. Alternatively, private or public LPWAN solutions such as LoRaWAN gateways or cellular IoT can be used to push data to the cloud. Combining Bluetooth, radio, or WiFi with cellular IoT means you will have fewer bills to pay. It is anticipated that, with more integrations, the IoT market will mature and costs will drop further.
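The savings from a one-to-many architecture can be shown with a back-of-the-envelope calculation. The sensor count and monthly fee below are made-up placeholders, not real carrier rates:

```python
# A rough, illustrative comparison of the two cost structures discussed above:
# a cellular subscription for every sensor versus a one-to-many design where
# in-field sensors use short-range radio and share one cellular gateway.
# The device count and monthly fee are hypothetical placeholders.

def yearly_cost(n_cellular_devices, monthly_fee):
    """Yearly connectivity cost when n_cellular_devices each pay monthly_fee."""
    return n_cellular_devices * monthly_fee * 12

n_sensors = 40
per_sensor_plan = yearly_cost(n_sensors, monthly_fee=5.0)  # every sensor has its own subscription
one_to_many_plan = yearly_cost(1, monthly_fee=5.0)         # one shared gateway uplinks for all 40

print(f"Per-sensor subscriptions: ${per_sensor_plan:.0f}/year")
print(f"One-to-many gateway:      ${one_to_many_plan:.0f}/year")
```

With these placeholder numbers the gateway design cuts connectivity fees by a factor of 40, which is why combining short-range radio with a single cellular uplink "means you will have fewer bills to pay."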
Many LPWAN technologies currently have very limited network coverage in the U.S. LTE Cat-M1 by far has the largest coverage, while Ingenu (a legacy technology), Sigfox, and NB-IoT have very limited U.S. coverage. Some private companies are currently using subscription-free, crowd-funded LoRaWAN networks to provide service to U.S. growers, though with a very limited network footprint. Currently, cellular IoT does not perform well in rural areas without strong cellular data coverage.
In two weeks: Dr. Osroosh continues to discuss IoT strengths and limitations in part 2.
Globally, the number one reason data loggers fail is flooding. Yet scientists continue to look for ways to bury their data loggers to avoid constantly removing them for cultivation, spraying, and harvest. Chris Chambers, head of Sales and Support at METER, always advises against it. He warns, “Almost all natural systems, even arid ones, will saturate at least once or twice a year—and it only takes once.” Still, there are innovative scientists who have had some success.
A prototype buriable logger container, made from a paint can, PVC elbow, silicone, epoxy putty, and desiccant. Photo Credit: NDSU | Soil Sciences | Soil Physics
The Good
Radu Carcoana, research specialist and Dr. Aaron Daigh, assistant professor at North Dakota State University, use paint cans to completely seal their data loggers before burying. They drill ports for the sensor cables, seal them up, and when they need to collect data, they dig up the cans. Chambers comments, “So far it looks promising, but we had a long discussion about the consequences of getting any water in those cans. I don’t know what they were sealing the ports with, but they were pretty confident that they could even dunk their paint cans under water.” The North Dakota research team buried the paint cans last fall, and Chambers says he’s reserving judgment until spring. Radu comments, “The picture above is just the concept. The story will continue in April when we see the North Dakota winter toll.” (See update).
The Bad
Chambers has good reason for his skepticism. If a logger gets saturated even once, its life will be short. And even if it doesn’t get completely flooded, there is still risk: as water vapor gets into the enclosure that encases the logger, the resulting high humidity can damage the instrument. Chambers says, “If loggers that are mounted on a post get a small amount of condensation or water inside, they’ll be fine. But the buried ones have no escape route for water vapor. If they get wet or are exposed to water vapor even once, they are going to fail. We’ve seen horror stories time and time again. It’s just not a good environment for electronics.”
One group of scientists tried burying their loggers in five-gallon buckets.
The Ugly
Chambers likes to relate a cautionary tale about some scientists in Seattle, who buried their data loggers in five-gallon buckets with lids. They taped their loggers to the lid, but when they dug the buckets up, they were half full of water, and the loggers were dead. This is because as the buckets filled with water, the loggers were continuously exposed to water-condensing conditions. After the loggers were repaired, the scientists re-buried them. But, six weeks later, their buckets were again half full of water, and their loggers were dead.
One Success Story So Far
There is one innovative group at Washington State University, however, who can be considered successful. Postdoctoral research associate Caley Gasch decided she wanted to bury data loggers in the Cook Agricultural Farm, an actively managed field, so they weren’t constantly taking down loggers and causing large gaps in their data.
Next week: Find out how she was able to solve many of the problems that prevent successful deployment of data loggers underground.
Dr. Lauren Hallett, researcher at the University of California, Berkeley, recently conducted a study testing the importance of compensatory dynamics on forage stability in an experimental field setting where she manipulated rainfall availability and species interactions. She wanted to understand how climate variability affected patterns of species tradeoff in grasslands over time and how those tradeoffs affected the stability of things like forage production across changing rainfall conditions.
Species tradeoffs could help mitigate the negative effects of climate variability on overall forage production.
Species Tradeoff
A key mechanism that can lead to stability in forage production is compensatory dynamics, in which the responses of different species to climate fluctuations result in tradeoffs between functional groups over time. These tradeoffs could help mitigate the negative effects of climate variability on overall forage production. Dr. Hallett comments, “In California grasslands, there’s a pattern that is part of rangeland dogma, that in dry years you have more forbs, and in wet years you have more grasses. I wondered if you could manage the system so that both forbs and grasses are present in the seed bank, able to respond to climate. This would perhaps buffer community properties, like soil cover for erosion control and forage production in terms of biomass, from the effects of climate variability.”
In areas experiencing moderate grazing, there was a strong species tradeoff between grasses and forbs.
Manipulating Species Composition
Dr. Hallett capitalized on the pre-existing grazing manipulation that her lab had done over the previous four years. The grazing she replicated for this study was experimentally controlled, making it easier to ensure consistency. She built rainout shelters that collected the water, which she then applied to create wet versus dry plots. She also manipulated species composition, allowing only grasses, only forbs, or a mix of the two. These treatments allowed her to study changes in cover and biomass.
Hallett used soil moisture probes and data loggers to characterize the treatment effects of this experiment and to parameterize models that predict rangeland response to climate change. She says, “I wanted to verify that my rainfall treatments were getting a really strong soil moisture dynamic, and I found the shelters and the irrigation worked really well.” Along with above-ground vegetation, she collected soil cores and looked at nutrient differences in conjunction with soil moisture. Since her field site is located within the Sierra Foothills Research and Extension Center, Dr. Hallett was able to rely on precipitation data that was already measured on-site.
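Verifying that a rainfall treatment produced "a really strong soil moisture dynamic" comes down to comparing water content between plots. Below is a minimal sketch using hypothetical volumetric water content (VWC) readings, not Dr. Hallett's data:

```python
# A minimal sketch of the treatment check described above: compare mean
# volumetric water content (VWC, m3/m3) between wet and dry plots to confirm
# that the shelters and irrigation produced a strong soil moisture contrast.
# All readings here are hypothetical examples.

def mean_vwc(readings):
    """Average a list of soil moisture probe readings."""
    return sum(readings) / len(readings)

wet_plot = [0.31, 0.29, 0.33, 0.30]  # hypothetical probe readings, m3/m3
dry_plot = [0.12, 0.10, 0.11, 0.13]

contrast = mean_vwc(wet_plot) - mean_vwc(dry_plot)
print(f"Treatment contrast: {contrast:.2f} m3/m3")
```

A contrast of this size between treatments would confirm the irrigation manipulation worked; in a real analysis this comparison would be run over the full season of logged data rather than a handful of values.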
Results
Dr. Hallett found that in areas experiencing moderate grazing, there was a strong species tradeoff between grasses and forbs. She comments, “I had a seedbank that had both functional groups represented, and those tradeoffs did a lot to stabilize cover over time.”
When Dr. Hallett replicated the experiment in an area that had a history of low grazing, she found that the proportion of forbs wasn’t as high in the seedbank. As a consequence, there was a major loss of cover in the dry plots. She explains, “When the grass died, there weren’t many forbs to replace it, and you ended up with a lot of bare ground. The areas that were lightly grazed had more litter, so initially, the soil moisture was okay, but as the season progressed into a dry condition and the litter decomposed, there wasn’t enough new vegetation to stabilize the soil.” As a result, Dr. Hallett thinks in low-grazed areas it’s important to have an intermediate level of litter. She says, “You need enough litter to increase soil moisture, but not so much that it would suppress germination of the forbs because as the season progresses and gets really dry, if you don’t have forbs in the system, you lose a lot of ground cover.”
Surprises Lead to A New Study
Dr. Hallett was surprised that within her three treatments there seemed to be differences in when the functional groups were drying down the soil. This inspired new questions, leading her to use her dissertation data to generate a larger grant through the USDA. Her new study will perform extensive rainfall manipulations to measure the effects of early-season versus late-season dryout, and vary species within those parameters. She says, “One of the reasons you have grass years versus forb years is the timing of rainfall. For instance, if you have a really dry fall, you tend to have more forbs because their seedlings are more drought resistant. Conversely, if you have a wet fall, you tend to see more grasses because you have continual germination throughout the season. So, the timing of rainfall matters in terms of what species are in the system. We are going to look at the coupling between the species that gets selected for the fall versus what would be able to grow well in the spring, and we will be studying how that affects a whole range of things such as ground cover, above-ground production for forage, below-ground investment of different functional groups, and how these things might relate to nutrient cycling and carbon storage.”
You can read more about Dr. Hallett’s rangeland research and her current projects here.
We are entering an era of cheap data. Sensor technology has advanced to the point where it has become easy to collect large amounts of measurement data at high spatiotemporal resolution.
Hydroserver map screen: Using an off-the-shelf open source informatics system like Hydroserver kept us from reinventing what’s already out there, but allowed flexibility to program to our own needs.
We are now to the point where we have gigabytes’ worth of data on soil moisture, plant canopy processes, precipitation, wind speed, and temperature, but the amount of data is so overwhelming that we are having a difficult time dealing with it. The cost of measurement data is dropping so quickly that people are forced to change from the historical mindset of analyzing individual data points to one of turning gigabytes of data into knowledge.
Because Bioinformatics students are used to working with DNA data, they understand how to write computer programs that analyze large amounts of data in near real-time.
One approach suggested by my colleague Rick Gill, a BYU ecologist, is to collaborate with bioinformatics students. Because they are used to working with DNA data, these students understand how to write computer programs that analyze large amounts of data in near real-time. Rick came up with the idea to tap these students’ expertise in order to analyze the considerable information he anticipates collecting in our Desert FMP Project, an experiment that will use TEROS 21 and SRS sensors to determine the role of varying environmental and biological factors involved in rangeland fire recovery.
Rick and I are predicting that near real-time data analysis will give us several advantages. First, we need readily available information so we can tell that sensors and systems are working at the remote site. Large gaps in data are common for sites that aren’t visited often, and sensor failures are missed when data are collected but never analyzed. With our new approach, all data are stored in a database instantly, and the results are visualized as we go. Not only that, we’ll be able to control what’s being analyzed as we see what’s happening. We can tell the bioinformatics students what we need as we begin to see the results come in, and if we see important trends, we can assign them to analyze relevant new data right away.
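Catching those gaps as the data comes in is a simple check once records are flowing into a database. A minimal sketch, with illustrative timestamps and an assumed hourly logging interval:

```python
# A minimal sketch of the gap detection that near real-time analysis makes
# possible: scan a stream of record timestamps and flag any spacing longer
# than the expected logging interval, so a failed sensor at a remote site is
# noticed immediately instead of months later. Timestamps are illustrative.

from datetime import datetime, timedelta

def find_gaps(timestamps, expected_interval=timedelta(hours=1)):
    """Return (start, end) pairs where consecutive records are farther
    apart than the expected logging interval."""
    gaps = []
    for earlier, later in zip(timestamps, timestamps[1:]):
        if later - earlier > expected_interval:
            gaps.append((earlier, later))
    return gaps

# Example: hourly records with a three-hour outage between 02:00 and 06:00.
records = [datetime(2014, 6, 1, h) for h in (0, 1, 2, 6, 7)]
for start, end in find_gaps(records):
    print(f"Gap from {start} to {end}")
```

In a live pipeline this check would run each time new records arrive, with a flagged gap triggering a notification rather than a printout.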
These techniques have the potential to help scientists from all disciplines become more efficient at collection and analysis of large data streams. Although we’ve started the process, we have yet to determine its effectiveness. I will post more information as we see how well it is working and as new developments arise.