Several Dates (several + date)


Selected Abstracts


Assessment of the nitrogen status of grassland

GRASS & FORAGE SCIENCE, Issue 2 2004
A. Farruggia
Abstract Two types of diagnostics are used for N management in grasslands: diagnostics based on the N concentration of shoots and diagnostics based on soil mineral N. The Nitrogen Nutrition Index (NNI) is an example of the first type. However, its evaluation requires the determination of shoot dry weight per unit area, which constitutes a practical limit to its use in the context of farm studies. In order to simplify its evaluation, a method based on the N concentration of the upper sward layer (Nup) has been proposed. The objectives of this study were to test the relationship between NNI and Nup in the context of permanent grassland and to examine the relationship between Nup and soil mineral N status. The study was conducted as two experiments, one on small cut-plots receiving contrasting rates of mineral N fertilization, and a second on plots of an existing field-scale lysimeter experiment. In each plot and at several dates, shoot biomass within quadrats was measured, N concentration was determined on the upper leaves and on the entire shoots, and mineral nitrogen was determined in the soil below the sampled vegetation. N concentration of the upper lamina layer of the canopy was linearly related to the NNI determined on the entire shoots. Therefore, determining N concentration in leaves at the top of the canopy appears to be an alternative means of evaluating NNI without having to measure shoot biomass. The absence of an overall significant correlation between soil mineral N content and sward N index, observed across the two studies, indicates that each of these two indicators has to be considered specifically in relation to the objective of the diagnostic procedure. As the sward N index may vary independently of soil mineral N content, it does not appear to be a suitable indicator for diagnosing environmental risks related to nitrate leaching. Conversely, soil mineral N content does not allow prediction of sward N status and thus is not a suitable indicator of sward growth rate. Although soil mineral N content is an important environmental indicator of nitrate-leaching risks during potential drainage periods, it has limited diagnostic value with respect to the herbage production function of grasslands. [source]
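For context only (the abstract does not reproduce the equations), the NNI is conventionally defined against a critical N dilution curve; a minimal sketch under the usual formulation, with coefficients often cited in the literature for temperate C3 grasses, is:

\[
\mathrm{NNI} = \frac{N_{\mathrm{measured}}}{N_{\mathrm{critical}}}, \qquad N_{\mathrm{critical}} = a\,W^{-b} \quad (\text{e.g. } a \approx 4.8,\ b \approx 0.32),
\]

where W is shoot dry weight (t DM ha⁻¹, for W above roughly 1 t ha⁻¹) and N is shoot N concentration (% of dry matter). Written this way, it is clear why NNI normally requires a biomass measurement, which is exactly the practical limitation that the Nup approach described above seeks to avoid.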


Effects of 15N Split-application on Soil and Fertiliser N Uptake of Barley, Oilseed Rape and Wheat in Different Cropping Systems

JOURNAL OF AGRONOMY AND CROP SCIENCE, Issue 1 2007
K. Sieling
Abstract In intensive farming systems, farmers split the N fertilization of winter cereals and oilseed rape (OSR) into applications at several dates to meet crop demand more precisely. Our objective was to determine how prior fertilizer N application as slurry and/or mineral N affects the contributions of fertilizer- and soil-derived N to N uptake of barley (1997), OSR (1998) and wheat (1999). In addition, residual fertilizer N effects were observed in the subsequent crop. Since autumn 1991, slurry (none, in autumn, in spring, or in autumn plus spring) and mineral N fertilizer (0, 12 and 24 g N m⁻²) were applied annually. Each year, the treatments were located on the same plots. In 1997–1999, the splitting rates of the mineral N fertilization were labelled with 15N. Non-fertilizer N uptake was estimated from the total N uptake and the fertilizer 15N uptake. All three crops utilized the splitting rates differently depending on the time of application. Uptake of N from the first rate, applied at the beginning of spring growth, was poorer than that from the second splitting rate applied at stem elongation (cereals) or the third splitting rate applied at ear emergence or bud formation (all three crops). In contrast, N applied later in the growing season was taken up more quickly, resulting in higher fertilizer N-use efficiency. Mineral N fertilization of 24 g N m⁻² significantly increased non-fertilizer N uptake of barley and OSR at most of the sampling dates during the growing season. In cereals, slurry changed the contribution of non-fertilizer N to the grain N content only if applied in spring, while OSR utilized more autumn slurry N. In OSR and wheat, only small residual effects occurred. The results indicate that 7 years of varying N fertilization did not change the contribution of soil N to crop N uptake. [source]
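The partitioning between fertilizer-derived and non-fertilizer (soil-derived) N mentioned above follows the usual 15N isotope-dilution accounting; the abstract does not spell out the equations, so the following is only a sketch of that standard form:

\[
N_{\mathrm{fert}} = N_{\mathrm{total}} \times \frac{{}^{15}\mathrm{N}\ \text{atom\% excess in plant}}{{}^{15}\mathrm{N}\ \text{atom\% excess in fertilizer}}, \qquad N_{\mathrm{non\text{-}fert}} = N_{\mathrm{total}} - N_{\mathrm{fert}},
\]

where atom% excess is measured relative to the natural 15N abundance (about 0.37 atom%). The fertilizer N-use efficiency of a given splitting rate is then the labelled N recovered in the crop divided by the labelled N applied at that date.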


Nutrient Uptake in a Large Urban River

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 3 2007
Catherine A. Gibson
Abstract: Small streams have been shown to be efficient in retaining nutrients and regulating downstream nutrient fluxes, but less is known about nutrient retention in larger rivers. We quantified nutrient uptake length and uptake velocity in a regulated urban river to determine the river's ability to retain nutrients associated with wastewater treatment plant (WWTP) effluent. We measured net uptake of soluble reactive phosphorus (SRP), dissolved organic phosphorus, ammonium (NH4), nitrate, and dissolved organic nitrogen in the Chattahoochee River, Atlanta, GA, by following the downstream decline of nutrients and fluoride from WWTP effluent on 10 dates under low-flow conditions. Uptake of all nutrients was sporadic. On many dates, there was no evidence of measurable nutrient uptake lengths within the reach; indeed, on several dates release of inorganic N and P within the sampled reach led to increased nutrient export downstream. When uptake occurred, SRP uptake length was negatively correlated with total suspended solids and temperature. Uptake velocities of SRP and NH4 in the Chattahoochee River were lower than velocities in less-modified systems, but they were similar to those measured in other WWTP-impacted systems. Lower uptake velocities indicate a diminished capacity for nutrient uptake. [source]
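For reference, uptake length and uptake velocity are not defined in the abstract; under the standard nutrient-spiraling formulation (with fluoride acting as the conservative tracer to correct for dilution), they are estimated roughly as follows:

\[
\ln\!\left(\frac{C_{\mathrm{nut}}(x)}{C_{\mathrm{F}}(x)}\right) = \ln\!\left(\frac{C_{\mathrm{nut}}(0)}{C_{\mathrm{F}}(0)}\right) - k_w\,x, \qquad S_w = \frac{1}{k_w}, \qquad v_f = \frac{Q}{w\,S_w},
\]

where C_nut and C_F are the nutrient and fluoride concentrations at distance x downstream of the WWTP, k_w is the longitudinal uptake rate per unit length, S_w is the uptake length, Q is discharge, and w is wetted channel width. A flat (non-significant) slope corresponds to the "no measurable uptake" dates reported above, and a positive slope to net release.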


Identification and significance of sources of spatial variation in grapevine water status

AUSTRALIAN JOURNAL OF GRAPE AND WINE RESEARCH, Issue 1 2010
J.A. TAYLOR
Abstract Background and Aims: Water stress in grapevines is directly linked to grape quality. Differential vine water management should therefore be strongly linked to the water stress in the vine. To achieve this, an understanding of the dominant drivers and indicators of vine water status is needed from a sub-block to whole-vineyard level. This understanding will help generate effective vine water status models for variable rate irrigation systems. Methods and Results: A vineyard in the south of France was sampled for pre-dawn leaf water potential (ΨPD) at several dates during the growing season for two consecutive years. Sampling was stratified by soil types and relative within-block vegetative expression. A recursive partitioning analysis identified that cultivar had a dominant effect at low water stress, while vegetative expression and then soil unit effects became dominant as water restriction increased. Variance in ΨPD was calculated at different scales (plant, site, block and vineyard) and Smith's heterogeneity law was used to evaluate the scalar nature of ΨPD variance. Spatial heterogeneity increased as the season progressed and water restriction increased. Conclusion: Variance in ΨPD changed temporally through a season and the dominant drivers/indicators also changed. The opportunity to spatially manage water stress (irrigation) increased as water restriction increased. Significance of the Study: Managing vine water stress helps optimise production and a ΨPD model would be a useful addition to a viticulture decision support system. This study identified how the variance in ΨPD evolved during a season and the best ancillary indicators of ΨPD for spatial and temporal modelling. [source]
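Smith's heterogeneity law, invoked above to describe how ΨPD variance scales, is usually written in the following empirical form (given here only as background; the abstract does not state the parameterization used):

\[
V(x) = \frac{V_1}{x^{\,b}},
\]

where V(x) is the variance among units of size x (in multiples of the basic sampling unit, here individual plants or sites), V_1 is the variance among the basic units, and b (0 ≤ b ≤ 1) is the heterogeneity coefficient: b near 1 indicates essentially uncorrelated, random variation, while b near 0 indicates strong spatial structure and hence greater scope for zone-based, differential irrigation.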