Residue Management


Selected Abstracts


Incidence of Cotton Seedling Diseases Caused by Rhizoctonia solani and Thielaviopsis basicola in Relation to Previous Crop, Residue Management and Nutrients Availability in Soils in SW Spain

JOURNAL OF PHYTOPATHOLOGY, Issue 11-12 2005
A. Delgado
Abstract Cotton seedling damping-off is considered a disease complex in which several pathogens can be involved. In SW Spain, postemergence damping-off seems to be mainly associated with Rhizoctonia solani and Thielaviopsis basicola, posing a serious limitation for the crop, especially in cold springs. Ninety-seven commercial plots, where postemergence damping-off of cotton seedlings had been observed during previous years, were selected in April 2001. In each plot, plants were randomly sampled between the cotyledon and three-true-leaf stages, and soil samples were taken beside the plants. Symptomatic plants were separated according to the main observable seedling disease symptom: black necrosis (black root rot), brown necrosis and other symptoms. Thielaviopsis basicola inoculum was estimated in the soil samples, which were also analysed for nutrient availability (N, P, K, Ca, Mg, Fe, Cu, Mn and Zn). All the sampled plants showed some seedling disease symptom. Macroscopic symptoms can provide a reasonable distinction between the two major pathogens involved in seedling disease in the studied area: the percentage of T. basicola isolates from plants with black necrosis symptoms (18%) was significantly higher than that of R. solani (4.1%), whereas in plants with brown necrosis symptoms the situation was reversed (10.7% vs. 12.8%, respectively). The percentage of plants with black necrosis symptoms was inversely related to the proportion of plants with brown necrosis in each plot. The mean incidence of black necrosis was significantly lower in plots with residue incorporation (sugar beet as the preceding crop) than in plots without residue incorporation. No significant effect of preceding crop or residue management on brown necrosis incidence was observed. Incidence of black necrosis was negatively correlated with available N measured as NO3-N when corn or sunflower was the preceding crop. The incidence of black necrosis was positively related to Fe availability in soil after cotton as the preceding crop, whereas brown necrosis was negatively related to the availability of this micronutrient. [source]
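
The comparison of mean black necrosis incidence between plots with and without residue incorporation is, in form, a two-sample significance test. The abstract does not say which test the authors applied; the sketch below runs Welch's t-test on invented per-plot incidence values, purely to illustrate the shape of such an analysis (all numbers are hypothetical, not the study's data).

```python
# Hypothetical illustration: Welch's two-sample t-test comparing black
# necrosis incidence between plots with and without residue
# incorporation. All incidence values are invented.
import numpy as np
from scipy import stats

# Per-plot incidence of black necrosis (%), invented for illustration.
with_residue = np.array([12.0, 8.5, 15.2, 9.8, 11.3, 7.6])
without_residue = np.array([24.1, 19.7, 28.3, 22.5, 17.9, 25.4])

# Welch's t-test does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(with_residue, without_residue,
                                  equal_var=False)
print(f"mean incidence with residue:    {with_residue.mean():.1f} %")
print(f"mean incidence without residue: {without_residue.mean():.1f} %")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```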


Developing Web-based Interdisciplinary Modules to Teach Solid Waste/Residue Management in the Food Chain

JOURNAL OF FOOD SCIENCE EDUCATION, Issue 3 2003
C.W. Shanklin
ABSTRACT: A Web-based interdisciplinary instructional resource was developed to provide information that will increase food science educators' knowledge of waste management in the food chain. The 4 modules are: legal implications for management of wastes/residues; identification, quantification, and characterization of wastes/residues; management of wastes/residues; and economic ramifications of wastes/residues. Instructional materials are available for faculty and GTAs for use in teaching the 4 modules. Food science educators can use this Web-based instructional tool as an educational resource in their undergraduate classes to enhance students' knowledge and ability to solve critical environmental problems in the food chain. See http://www.oznet.ksu.edu/swr/home/welcome.htm [source]


Effect of Straw on Yield Components of Rice (Oryza sativa L.) Under Rice-Rice Cropping System

JOURNAL OF AGRONOMY AND CROP SCIENCE, Issue 2 2006
K. Surekha
Abstract Field experiments were conducted at the Directorate of Rice Research experimental farm, ICRISAT campus, Patancheru, Hyderabad, during 1998–2000 for five consecutive seasons (three wet and two dry seasons) with five treatments [T1: 100 % straw incorporation; T2: 50 % straw incorporation; T3: 100 % straw + green manure (GM) incorporation; T4: 100 % straw burning; and T5: 100 % straw removal (control)] along with the recommended dose of fertilizers, to evaluate the effect of different crop residue management (CRM) practices on yield components and yield of rice in a rice–rice cropping sequence. The ammonium N measured at active tillering was higher in the 100 % straw-added plots than with 50 % straw addition or straw removal, with maximum values in the straw + GM-incorporated plots. Among the yield components, tillers, panicles and spikelets were influenced from the second season of residue incorporation, with a significant increase in the 100 % straw-added treatments. The increase in tiller and panicle number could be attributed to the increased NH4-N in these treatments, which is evident from the significant correlations between tiller number and NH4-N (r = 0.82**) and between panicle number and NH4-N (r = 0.87**). The influence of residue treatments on rice grain yield was observed from the third season onwards, where incorporation of straw alone or in combination with GM, and burning of straw, significantly increased grain and straw yields. Grain yield showed significant positive correlation with the number of tillers (r = 0.74*–0.81**) and panicles (r = 0.74*–0.84**) in the three treatments (T1, T3 and T4) where grain yields were significantly higher. The regression analysis showed that 57–66 % and 64–75 % of the variation in yield could be explained by tillers and panicles together in these three treatments during the wet and dry seasons respectively. Thus, CRM practices such as addition of 100 % straw, either alone or with GM, and straw burning influenced the yield components (tillers, panicles and spikelets) positively and thereby increased rice grain yields. [source]
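
As a reading aid for the statistics above: a correlation of r = 0.82 corresponds to r^2 ≈ 0.67, i.e. roughly 67 % of variance explained by a single predictor, while the 57–75 % figures come from regressing yield on tillers and panicles together. The sketch below shows such a two-predictor least-squares fit on invented plot-level data; it illustrates the method only and does not reconstruct the study's analysis.

```python
# Illustrative two-predictor regression of grain yield on tiller and
# panicle counts. All data are invented, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n = 20
tillers = rng.normal(350, 40, n)                  # tillers per m^2 (invented)
panicles = 0.9 * tillers + rng.normal(0, 15, n)   # panicles per m^2 (invented)
grain_yield = 0.008 * tillers + 0.006 * panicles + rng.normal(0, 0.3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), tillers, panicles])
beta, *_ = np.linalg.lstsq(X, grain_yield, rcond=None)

fitted = X @ beta
ss_res = np.sum((grain_yield - fitted) ** 2)
ss_tot = np.sum((grain_yield - grain_yield.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot   # share of yield variance explained
print(f"R^2 = {r_squared:.2f}")
```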


INTEGRATED MANAGEMENT OF IN-FIELD, EDGE-OF-FIELD, AND AFTER-FIELD BUFFERS

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 1 2006
Seth M. Dabney
ABSTRACT: This review summarizes how conservation benefits are maximized when in-field and edge-of-field buffers are integrated with each other and with other conservation practices such as residue management and grade control structures. Buffers improve both surface and subsurface water quality. Soils under permanent buffer vegetation generally have higher organic carbon concentrations, higher infiltration capacities, and more active microbial populations than similar soils under annual cropping. Sediment can be trapped with rather narrow buffers, but extensive buffers are better at transforming dissolved pollutants. Buffers improve surface runoff water quality most efficiently when flows through them are slow, shallow, and diffuse. Vegetative barriers (narrow strips of dense, erect grass) can slow and spread concentrated runoff. Subsurface processing is best on shallow soils that provide increased hydrologic contact between the ground water plume and buffer vegetation. Vegetated ditches and constructed wetlands can act as "after-field" conservation buffers, processing pollutants that escape from fields. For these buffers to function efficiently, it is critical that in-field and edge-of-field practices limit peak runoff rate and sediment yield in order to maximize contact time with buffer vegetation and minimize the need for cleanout excavation that destroys vegetation and its processing capacity. [source]


Erosion modelling approach to simulate the effect of land management options on soil loss by considering catenary soil development and farmers' perception

LAND DEGRADATION AND DEVELOPMENT, Issue 6 2008
A. C. Brunner
Abstract The prevention of soil erosion is one of the most essential requirements for sustainable agriculture in developing countries. In recent years it has become widely recognized that more site-specific approaches are needed to assess variations in erosion susceptibility, in order to select the most suitable land management methods for individual hillslope sections. This study quantifies the influence of different land management methods on soil erosion by modelling soil loss for individual soil-landscape units on a hillslope in Southern Uganda. The research combines a soil erosion modelling approach using the physically based Water Erosion Prediction Project (WEPP) model with catenary soil development along hillslopes. Additionally, farmers' perceptions of soil erosion and sedimentation are considered in a hillslope mapping approach. The detailed soil survey confirmed a well-developed catenary soil sequence along the hillslope, and the participatory hillslope mapping exercise proved that farmers can distinguish natural soil property changes using their local knowledge. WEPP model simulations show that differences in soil properties, related to the topography along the hillslope, have a significant impact on total soil loss. Shoulder and backslope positions with steeper slope gradients were most sensitive to changes in land management. Furthermore, soil conservation techniques such as residue management and contouring could reduce soil erosion by up to 70 percent on erosion-sensitive slope sections, compared with the tillage practices presently used at the study site. The calibrated model may be used as a tool to provide quantitative information to farmers regarding more site-specific land management options. Copyright © 2008 John Wiley & Sons, Ltd. [source]
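
WEPP is a physically based, process-oriented model and cannot be condensed into a few lines of code. As a rough illustration of how management choices scale a predicted hillslope soil loss, the sketch below instead uses the much simpler Universal Soil Loss Equation (USLE), A = R * K * LS * C * P, with invented factor values; it is an expository stand-in, not the model used in the study.

```python
# USLE stand-in (not WEPP): shows how cover-management (C) and
# support-practice (P) factors scale predicted soil loss.
# All factor values are invented.
R, K, LS = 300.0, 0.30, 1.8   # rainfall erosivity, soil erodibility, slope factor

scenarios = {
    # name: (C, P) cover-management and support-practice factors (invented)
    "current tillage":           (0.45, 1.00),
    "residue management":        (0.20, 1.00),
    "residue mgmt + contouring": (0.20, 0.60),
}

C0, P0 = scenarios["current tillage"]
baseline = R * K * LS * C0 * P0
for name, (C, P) in scenarios.items():
    soil_loss = R * K * LS * C * P          # nominal t/ha/yr
    reduction = 100.0 * (1.0 - soil_loss / baseline)
    print(f"{name:27s} A = {soil_loss:6.1f}  reduction: {reduction:4.0f} %")
```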


Soil organic matter decline and compositional change associated with cereal cropping in southern Tanzania

LAND DEGRADATION AND DEVELOPMENT, Issue 1 2001
J. F. McDonagh
Abstract The spatial analogue method and 13C analytical techniques were used to reveal medium- to long-term changes in soil organic matter (SOM) in farmers' fields under maize in southern Tanzania. Aerial photography and detailed farmer interviews were used to relate land-use history to declines in SOM concentration and changes in composition. The research attempted to measure the rate of SOM decline and the extent to which farmers' residue management practice was allowing cereal residues to contribute to SOM. The combination of research methods employed in this study proved to be highly complementary. Results indicate that native SOM decreased on average by 50 per cent after 25 years of cultivation. Under current residue management, with cereal residues mostly grazed and burnt, there is only a relatively modest contribution from cereal residues to SOM. When cereal residues are retained in the field they are likely to contribute significantly to SOM in the short term, but they are much less likely to build SOM in the medium to long term. The paper concludes that in many situations it is probably best for farmers to allow the majority of the residues to be eaten by cattle in these systems, rather than attempt to build SOM or risk nitrogen immobilization in cropped fields. The greater importance of inputs of high-quality (e.g. legume) residues for nutrient supply in the short term is highlighted, in contrast to inputs of poor-quality (e.g. cereal) residues in an attempt to build SOM in the longer term. Copyright © 2001 John Wiley & Sons, Ltd. [source]
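
The 13C technique mentioned above exploits the isotopic contrast between SOM formed under native C3 vegetation (delta-13C typically near -27 per mil) and inputs from maize, a C4 crop (near -12 per mil). A standard two-pool mixing equation, f = (d_sample - d_native) / (d_maize - d_native), then partitions SOM carbon between the old pool and the maize-derived pool. The sketch below applies that generic calculation with nominal end-member values; the numbers are illustrative, not the paper's.

```python
# Generic two-pool 13C mixing calculation (nominal values, not the
# paper's data): estimates the fraction of SOM carbon derived from
# maize (C4) residues in a soil that developed under C3 vegetation.
D_NATIVE = -27.0   # delta-13C of SOM under native C3 vegetation (per mil)
D_MAIZE = -12.0    # delta-13C of maize (C4) residues (per mil)

def maize_derived_fraction(d_sample: float) -> float:
    """Fraction of SOM carbon derived from maize residues."""
    return (d_sample - D_NATIVE) / (D_MAIZE - D_NATIVE)

for d in (-26.0, -24.0, -21.0):   # invented field values after cultivation
    frac = maize_derived_fraction(d)
    print(f"delta-13C = {d:6.1f} per mil -> maize-derived C: {100 * frac:4.1f} %")
```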