Managing water scarcity in European and Chinese cropping systems

Partner Publication (CVUT):

Thomas Weninger¹, Edith Kamptner¹, Tomas Dostal², Adelheid Spiegel³, and Peter Strauss¹

¹ Institute for Land and Water Management, Federal Agency for Water Management, Pollnbergstraße 1, 3252 Petzenkirchen, Austria
² Faculty of Civil Engineering, Department of Landscape Water Conservation, Czech Technical University Prague, Thákurova 7, 16629 Prague 6, Czech Republic
³ Institute for Soil Health and Plant Nutrition, Austrian Agency for Health and Food Safety, Spargelfeldstraße 191, 1220 Vienna, Austria
Received October 1, 2020; accepted November 13, 2020
Int. Agrophys., 2020, 34, 463-471

Abstract.

Reliable estimations of soil physical quality provide valuable information for the evaluation and advancement of agricultural soil management strategies. In the agriculturally highly productive Pannonian basin in Eastern Austria, little emphasis has been placed on the determination of soil physical quality and corresponding soil degradation risks. Nevertheless, ongoing climate change, especially prolonged drought periods and higher rainfall intensity, will raise the need for appropriate soil management strategies. Soil physical quality was therefore assessed in nine soil profiles in a long-term tillage experiment which has been in operation since 1988 in Eastern Austria. Soil samples from depths between 2 and 37 cm and under three different tillage systems (conventional, reduced and minimal tillage) were analysed for various indicators of soil physical quality. The resulting classifications of soil physical quality in the different profiles were compared qualitatively and quantitatively, together with an estimation of the representativeness of the soil physical quality indicators used. The outcomes showed severe soil compaction under all tillage treatments and slight improvements in soil physical quality marginally above the working depth for the different treatments. Additionally, conversion to conservation tillage led to less pronounced improvements in soil physical quality under Pannonian conditions than have been reported in more humid climates.

Keywords: tillage intensity, soil compaction, soil water balance, soil management

Xun Wu,  Jianchu Shi, Qiang Zuo, Mo Zhang, Xuzhang Xue, Lichun Wang, Ting Zhang & Alon Ben-Gal


Rational parameterization of the soil water stress reduction function in a root water uptake model is crucial for accurately describing root water uptake and simulating soil water dynamics in a soil–plant system. In this study, we propose three improvements to a popular transpiration-based approach for parameterizing the water stress reduction function in a widely used macroscopic root water uptake model. The improvements are based on the interdependent relationships between soil and plant water status and consider the effects of (1) the relative distribution of soil water to roots on transpiration; (2) differences in the growth of plants exposed to different levels of water stress on potential transpiration; and (3) hysteresis of water stress on parameter optimization, addressed by identifying and discarding data from recovery periods in which the discrepancy between soil and plant water availability is significant. Lysimetric experiments with winter wheat, planted either in greenhouse soil columns or in a field, were conducted to test the proposed improvements. By minimizing the residuals between the measured and estimated actual transpiration, the optimized parameterization was used to set up the root water uptake model. Thereupon, actual and relative transpiration were estimated and soil water content distributions were simulated. The estimated actual (RMSE ≤ 0.09 cm day⁻¹) and relative (RMSE = 0.06) transpiration agreed well with the measurements. The simulated soil water content distributions also matched the measured values well for both experiments (RMSE ≤ 0.023 cm³ cm⁻³). Omitting any of the three proposed improvements reduced the estimation accuracy of relative transpiration; the individual contribution of each improvement ranged between 21.2 and 51.2%.
The proposed improvements provide rational parameter estimates for the water stress reduction function, from which root water uptake models can be established to accurately evaluate plant transpiration and simulate soil water flow in a soil–plant system. The parameterization strategy not only benefits accurate evaluation of plant transpiration under drought conditions but also contributes to further study and description of the apparent hysteresis of root water uptake after re-watering.
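The calibration step described above, minimizing the residuals between measured and estimated actual transpiration, can be sketched as follows. This is a minimal illustration using a simple piecewise-linear (Feddes-type) reduction function and a grid search over one parameter with synthetic data; it is not the authors' full three-part parameterization, and all variable names and values are hypothetical.

```python
import numpy as np

def stress_reduction(theta, theta_wp, theta_c):
    """Piecewise-linear water stress reduction factor alpha(theta):
    0 at the wilting point theta_wp, rising linearly to 1 at the
    critical water content theta_c, and 1 (no stress) above it."""
    alpha = (theta - theta_wp) / (theta_c - theta_wp)
    return np.clip(alpha, 0.0, 1.0)

def fit_theta_c(theta, tp, ta_obs, theta_wp, candidates):
    """Grid-search the critical water content that minimizes the RMSE
    between observed and estimated actual transpiration Ta = alpha * Tp."""
    best_tc, best_rmse = None, np.inf
    for tc in candidates:
        ta_est = stress_reduction(theta, theta_wp, tc) * tp
        rmse = np.sqrt(np.mean((ta_est - ta_obs) ** 2))
        if rmse < best_rmse:
            best_tc, best_rmse = tc, rmse
    return best_tc, best_rmse

# Synthetic example: true critical water content of 0.25 cm3 cm-3
theta = np.linspace(0.10, 0.35, 26)       # root-zone water contents
tp = np.full_like(theta, 0.5)             # potential transpiration, cm/day
ta_obs = stress_reduction(theta, 0.08, 0.25) * tp
tc_hat, rmse = fit_theta_c(theta, tp, ta_obs, 0.08,
                           np.arange(0.15, 0.31, 0.01))
```

The grid search recovers the critical water content used to generate the synthetic observations; in practice a gradient-based optimizer would be used over all parameters of the reduction function simultaneously.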

M. Biddoccu ᵃ, G. Guzmán ᵇ, G. Capello ᵃ, T. Thielke ᶜ, P. Strauss ᶜ, S. Winter ᵈ, J. G. Zaller ᵉ, A. Nicolai ᶠ, D. Cluzeau ᶠ, D. Popescu ᵍ, C. Bunea ʰ, A. Hoble ʰ, E. Cavallo ᵃ, J. A. Gómez ᵇ

ᵃ Institute for Agricultural and Earthmoving Machines (IMAMOTER), National Research Council of Italy (CNR), Torino, Italy
ᵇ Institute for Sustainable Agriculture, CSIC, Cordoba, Spain
ᶜ Institute for Land and Water Management Research, Federal Agency for Water Management, Petzenkirchen, Austria
ᵈ Institute of Plant Protection and Institute of Integrative Nature Conservation Research, University of Natural Resources and Life Sciences, Vienna, Austria
ᵉ Institute of Zoology, University of Natural Resources and Life Sciences Vienna, Vienna, Austria
ᶠ Université Rennes 1, Station Biologique de Paimpont, UMR 6553 EcoBio, 35380 Paimpont, France
ᵍ SC JIDVEI SRL, Research Department, Jidvei, Romania
ʰ University of Agriculture Science and Veterinary Medicine, Cluj Napoca, Romania

Received 16 March 2020, Revised 3 July 2020, Accepted 7 July 2020, Available online 17 July 2020.

Department of Irrigation, Drainage and Landscape Engineering, Faculty of Civil Engineering, Czech Technical University in Prague, Thakurova 7, 16629 Prague, Czech Republic
*Author to whom correspondence should be addressed.
Water 2020, 12(6), 1787;
Received: 26 April 2020 / Revised: 11 June 2020 / Accepted: 20 June 2020 / Published: 23 June 2020
Accelerated soil erosion by water has many off-site impacts on municipal infrastructure. This paper discusses how to easily detect potential risk points around municipalities by simple spatial analysis using GIS. In the Czech Republic, the WaTEM/SEDEM model has been verified and used in large-scale studies to assess sediment transport. Instead of computing actual sediment transport in river systems, WaTEM/SEDEM has been innovatively used in high spatial detail to define indices of sediment flux from small contributing areas. This approach has allowed sediment fluxes to be modeled in the contributing areas of 127,484 risk points covering the entire territory of the Czech Republic. Risk points are defined as outlets of contributing areas larger than 1 ha from which surface runoff enters residential areas or vulnerable bodies of water. Sediment flux indices were calibrated by conducting terrain surveys in 4 large watersheds and splitting the risk points into 5 groups defined by the intensity of the sediment transport threat. The best sediment flux index resulted from the correlation between the modeled total sediment input in a 100 m buffer zone around the risk point and the field survey data (R² from 0.57 to 0.91 for the calibration watersheds). Correlation analysis and principal component analysis (PCA) of the modeled indices and their relation to 11 lumped characteristics of the contributing areas were computed (average K-factor; average R-factor; average slope; area of arable land; area of forest; area of grassland; total watershed area; average planar curvature; average profile curvature; specific width; stream power index). The comparison showed that, for risk definition, the most important factors are a combination of morphometric characteristics (specific width and stream power index), followed by watershed area, proportion of grassland, soil erodibility, and rain erosivity (described by PC2).
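The calibration idea above, correlating a modeled sediment flux index with field survey scores via R² and binning risk points into five threat classes, can be sketched as follows. This is a minimal illustration on synthetic data, not the WaTEM/SEDEM workflow; the lognormal index values and survey scores are hypothetical stand-ins.

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)

def classify_risk(index, n_classes=5):
    """Bin risk points into n_classes threat groups by quantiles of the
    sediment flux index (0 = lowest threat, n_classes-1 = highest)."""
    edges = np.quantile(index, np.linspace(0, 1, n_classes + 1)[1:-1])
    return np.digitize(index, edges)

# Synthetic stand-in data (hypothetical, for illustration only):
rng = np.random.default_rng(42)
index = rng.lognormal(mean=2.0, sigma=1.0, size=200)   # modeled sediment input
survey = 0.8 * index + rng.normal(0.0, 2.0, size=200)  # field survey score
r2 = r_squared(index, survey)
classes = classify_risk(index)
print(f"R2 = {r2:.2f}, class sizes = {np.bincount(classes)}")
```

Quantile binning yields equally populated classes; the paper instead defines the five groups from surveyed transport intensity, so class boundaries there come from field evidence rather than the index distribution.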

Francisco Pedrero ᵃ, S. R. Grattan ᵇ, Alon Ben-Gal ᶜ, Gaetano Alessandro Vivaldi ᵈ

ᵃ Department of Irrigation, CEBAS-CSIC, Campus Universitario de Espinardo, 30100 Murcia, Spain
ᵇ Department of Land, Air and Water Resources, University of California, Davis, 95616, USA
ᶜ Institute of Soil, Water and Environmental Sciences, Agricultural Research Organization, Gilat Research Center, M.P. Negev, 85280, Israel
ᵈ Dipartimento di Scienze Agro-Ambientali e Territoriali, Università degli Studi di Bari Aldo Moro, Via Amendola 165/A, 70126 Bari, Italy

Received 29 April 2020, Revised 9 June 2020, Accepted 11 June 2020, Available online 23 June 2020.


Olive trees are iconic to the Mediterranean landscape and have, in recent times, expanded to other regions across the globe that share similar climatic conditions. Olive oil production benefits from irrigation, but with a changing climate and uncertainty in precipitation patterns, wastewaters will likely play a larger role in supplementing irrigation water requirements. However, due to their relatively poor quality, wastewaters present challenges for sustained long-term use in olive production. Wastewaters include all effluents from municipalities, agricultural drainage, animal production facilities, agricultural processing and industrial processes. This review focuses on potential opportunities and limitations of sustaining olive oil production in the Mediterranean region using wastewater of various sources. The primary challenges for using such wastewaters include concerns related to salinity, sodicity, metals and trace elements, nutrients, organics, and pathogens. Organics and plant nutrients in the effluents are typically beneficial but depend on dosages.

Many studies have shown that saline wastewaters can be successfully used to irrigate olives in Greece, Israel, Italy, Jordan and Tunisia. Still, olive varieties and rootstocks differ in their tolerance to salinity and may respond differently, and oil quality may improve or be compromised. Salts and trace elements need to be monitored in plants and soil to ensure that accumulation does not continue from year to year and that soil physical conditions are not affected. Some food industries generate effluents with characteristics suitable for irrigation, but one must balance the benefits (e.g. addition of nutrients), detriments (e.g. addition of salts or other limiting chemicals) and costs when determining the feasibility and practicality of reuse. Long-term accumulation of trace elements and metals will likely limit the feasibility of using industrial-originating effluents without treatment processes that remove the toxic constituents prior to reuse. Therefore, untreated wastewaters from many industries have limited long-term potential for reuse at this time. Application of olive mill wastewater may be agronomically and economically beneficial, particularly as a local disposal solution, but there are concerns associated with high concentrations of polyphenols, which may be phytotoxic and toxic to soil microbial populations.

With regard to human safety, the risk of contamination of table olives and olive oil is very low because irrigation methods deliver water below the canopy, fruits are not picked from the ground, processing itself eliminates pathogens, and the irrigation season typically ends days or weeks before the harvest (depending on climatic conditions). Finally, considering the physiological, nutritional and intrinsic characteristics of this species, it is clear that olive trees are appropriate candidates for the reuse of recycled water as an irrigation source.

R. López-Urrea ᵃ, J. M. Sánchez ᵇ, A. Montoro ᵃ, F. Mañas ᵃ, D. S. Intrigliolo ᶜ

a Instituto Técnico Agronómico Provincial (ITAP), Parque Empresarial Campollano, 2ª Avda. Nº 61, 02007, Albacete, Spain
b Dept. of Applied Physics, Regional Development Institute (IDR), Univ. of Castilla-La Mancha, Av. España, s/n, 02071 Albacete, Spain
c Departamento de Riego, Centro de Edafología y Biología Aplicada del Segura (CEBAS-CSIC) Espinardo, Murcia, Spain

Received 18 December 2019, Revised 25 May 2020, Accepted 31 May 2020, Available online 16 June 2020.