
Is benchmarking the best route to water efficiency in the UK’s irrigated agriculture?

Irrigation pump. Image credit Wikimedia Commons.
From August 2015 to January 2016, I was lucky enough to enjoy an ESRC-funded placement at the Environment Agency. Based in the Water Resources Team, I spent my time writing a number of independent reports on behalf of the agency. This blog is a short personal reflection on one of these reports, which you can find here. All views within this work are my own and do not represent any views, plans or policies of the Environment Agency.

Approximately 71% of UK land (17.4 million hectares) is used for agriculture, with 9.3 million hectares (70%) of land in England given over to such operations. The benefits of this land use are well known, providing close to 50% of the UK's food consumption. Irrigated agriculture forms an important fulcrum within this sector, as well as contributing extensively to the rural economy. In eastern England alone, it is estimated that 50,000 jobs depend upon irrigated agriculture, with the sector reported to contribute close to £3 billion annually to the region's economy.

It is estimated that only 1-2% of the water abstracted from rivers and groundwater in England is consumed by irrigation. Compared with figures from other nations, this use of water by agriculture is relatively low. In the USA, agricultural operations account for approximately 80-90% of national consumptive water use. In Australia, water use by irrigation over 2013/14 totalled 10,730 gigalitres (GL), 92% of the total agricultural water use in that period (11,561 GL).

However, the median of nine forecasts of future demand in the UK's agricultural sector projects a 101% increase in demand between today and 2050. In this country, irrigation's water use is often concentrated in the driest periods and in the catchments where resources are most constrained. Agriculture uses the most water in the regions where water stress is most acute, such as East Anglia. The result is that, in some dry summers, agricultural irrigation may become the largest abstractor of water in these vulnerable catchments.

With climate change creating uncertainty around future water availability across the country, it has become a necessity for policy and research to explore which routes can provide the greatest efficiency gains for agricultural resilience. A 2015 survey by the National Farmers Union found that many farmers lack confidence in securing long-term access to water for production, with only a third of those surveyed feeling confident about water availability in five years' time. In light of this decreasing availability, the need to reduce water demand within this sector has never been more apparent.

Evidence from research and agricultural practice across the globe provides us with a number of possible routes. Improved on-farm management practice, trickle irrigation, the use of treated wastewater for irrigation and the building of reservoirs all point to potential reductions in water use.

Yet, something stands in the way of the implementation of these schemes and policies that support them: People. The adoption of new practices tends to be determined by a number of social factors – depending on the farm and the farmer. As farmers are the agents within this change, it is important to understand the characteristics that often guide their decision-making process and actions in a socio-ecological context.

Let's remember: there is no such thing as your 'average farmer'. Homogeneity is not a word that British agriculture is particularly aware of. As a result, efforts to increase water-use efficiency need to understand how certain characteristics influence the potential for action. Wheeler et al. have identified a number of characteristics that can influence adaptation strategies. For example, a farmer with a stronger belief in the reality of climate change is more likely to adopt mitigating or adaptive measures. Importantly, this can also be linked to demographic factors. As Islam et al. have argued, risk scepticism can be the result of a number of factors (such as age, economic status, education, and environmental and economic values), and these can be linked to the birth-cohort effect.

This is not to say that all farmers of a certain age are climate sceptics, but it does point to the importance of demography as a factor in the adoption of innovative measures. Wheeler et al. went on to cite environmental values, commercial orientation, perceptions of risk and the presence of an identified farm successor as variables potentially directing change in practice. Research by Stephenson has shown that farmers who adopt new technologies tend to be younger and better educated, have higher incomes and larger farm operations, and be more engaged with primary sources of information.

Yet there is one social pressure that future policy must take into account: friendly, neighbourly competition. Keeping up with the Joneses. Not wanting Farmer Giles down the lane to know that you overuse water in an increasingly water-scarce future. This pressure can be harnessed within a system of benchmarking. Benchmarking involves the publication of individual farms' water use, irrigation characteristics, efficiency and farming practice. Although the data are supplied anonymously, individual farmers are able to see how they measure up against their neighbours, competitors and others elsewhere.

Benchmarking is already used in other agricultural sub-sectors. A 2010 survey found that 24% of farmers from different sectors used benchmarking in their management processes. This is particularly evident in the dairy sector, where both commercial and public organisations use the method as a way to understand individual farm performance; an important example is DairyCo's Milkbench+ initiative. In 2004, over 950,000 hectares of irrigated land in Australia, 385,000 hectares in China and 330,000 hectares in Mexico were subject to benchmarking processes as a means to gauge their environmental, operational and financial characteristics.

The result is that irrigators would have the means to compare how they are performing relative to other growers, allowing them to answer the important questions of 'How well am I doing?', 'How much better could I do?' and 'How do I do it?'. Furthermore, this route can limit the potential for 'free-riding' behaviour within a catchment, as well as emphasising the communal nature of these vulnerable resources. We've all seen 'keeping up with the Joneses' result in increased consumption; benchmarking provides us with an important route to use this socialised nudging for good.
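At its core, a benchmarking scheme of this kind rests on a simple calculation: each farm reports its seasonal water use per irrigated hectare, and each farmer is then shown where they sit within the anonymised pool. The sketch below is purely illustrative, not drawn from any real scheme; the figures are invented, and a real benchmark would also normalise for crop type, soil and rainfall.

```python
# Illustrative sketch of anonymous irrigation benchmarking.
# Each farm reports seasonal water use in cubic metres per irrigated hectare;
# a farmer then sees what share of the anonymised pool uses more than they do.
# All numbers here are invented for the example.

def share_using_more(my_use, pool):
    """Percentage of reported farms whose water use per hectare exceeds my_use."""
    return 100.0 * sum(1 for v in pool if v > my_use) / len(pool)

# Anonymised pool of reported use (m3 per hectare per season).
reported_use = [1400, 1750, 1600, 2100, 1300, 1900, 1550, 2400]

my_use = 1550
rank = share_using_more(my_use, reported_use)
print(f"{rank:.1f}% of farms in the pool use more water per hectare than this farm.")
```

A farmer reporting 1,550 m3/ha would learn that 62.5% of the pool uses more, without ever seeing which neighbour reported which figure; that anonymity is what makes the comparative nudge palatable in practice.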
--------------------------------------------------------------
This blog is written by Cabot Institute member Ed Atkins, a PhD student at the University of Bristol who studies water scarcity and environmental conflict.

