Scholarly Work - Research Centers

Recent Submissions

  • Item
    Postharvest Treatment Effects on ‘Somerset Seedless’ Cold-Hardy Table Grapes
    (Informa UK Limited, 2023-12) Wang, Zhuoyu; Svyantek, Andrej; Miller, Zachariah; Jarrett, Bridgid; Green, Stacy; Kapus, Ashley
    Limited information is available on postharvest storage of cold-hardy table grapes and on strategies to extend their storage life. ‘Somerset Seedless’ is a cold-hardy table grape with a potential market in the Upper Midwest and Northern Great Plains. Postharvest treatments were assessed as a possible route to increase cold-hardy table grape shelf-life. In this study, a 1.4% chitosan postharvest treatment was tested on ‘Somerset Seedless’ grapes through 7 weeks of storage at 1–4°C and ≥90% relative humidity. The effects were compared with two controls: deionized water (diH2O) and 1% acetic acid. The assessment included grape appearance traits (rachis condition, decay, mold, shattering, and splitting), physicochemical properties (total soluble solids [TSS], pH, and total acidity), antioxidant activity, and total phenolic and flavonoid content. In general, the 1.4% chitosan, 1% acetic acid, and diH2O treatments had a large impact on grape appearance during storage but limited impact on chemistry: throughout the 7 weeks of storage, no significant differences were observed in physicochemical or phytochemical properties. Chitosan performed best for ‘Somerset Seedless’ postharvest storage with regard to the rate of visible damage. Although acetic acid controlled mold as effectively as chitosan, it caused the highest split rate after 1 week and the highest shatter rate after 5 weeks of storage. Chitosan-treated ‘Somerset Seedless’ still met the USDA standard for table grapes after 5 weeks. This study suggests chitosan postharvest treatments may have applications for enhancing the shelf-life of cold-hardy table grapes.
  • Item
    Sentinel-2-based predictions of soil depth to inform water and nutrient retention strategies in dryland wheat
    (Elsevier BV, 2023-11) Fordyce, Simon I.; Carr, Patrick M.; Jones, Clain; Eberly, Jed O.; Sigler, W. Adam; Ewing, Stephanie; Powell, Scott L.
    The thickness or depth of fine-textured soil (zf) dominates water storage capacity and exerts a control on nutrient leaching in semi-arid agroecosystems. At small pixel sizes (< 1 m; ‘fine resolution’), the normalized difference vegetation index (NDVI) of cereal crops during senescence (Zadoks Growth Stages [ZGS] 90–93) offers a promising alternative to destructive sampling of zf using soil pits. However, it is unclear whether correlations between zf and NDVI exist (a) at larger pixel sizes (1–10 m; ‘intermediate resolution’) and (b) across field boundaries. The relationship of zf to NDVI of wheat (Triticum aestivum L.) was tested using images from a combination of multispectral sensors and fields in central Montana. NDVI was derived for one field using sensors of fine and intermediate spatial resolution and for three fields using intermediate resolution sensors only. Among images acquired during crop senescence, zf was correlated with NDVI (p < 0.05) independent of sensor (p = 0.22) and field (p = 0.94). The zf relationship to NDVI was highly dependent on acquisition day (p < 0.05), but only when pre-senescence (ZGS ≤ 89) images were included in the analysis. Results indicate that cereal crop NDVI of intermediate resolution can be used to characterize zf across field boundaries if image acquisition occurs during crop senescence. Based on these findings, an empirical index was derived from multi-temporal Sentinel-2 imagery to estimate zf on fields in and beyond the study area.
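    The abstract above relies on NDVI computed from red and near-infrared reflectance. As a minimal illustrative sketch (not the authors' processing pipeline), the snippet below computes NDVI from two arrays; the band names B08 (NIR) and B04 (red) follow Sentinel-2 conventions, and the reflectance values are hypothetical.

    ```python
    import numpy as np

    def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
        """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
        nir = nir.astype(float)
        red = red.astype(float)
        denom = nir + red
        with np.errstate(divide="ignore", invalid="ignore"):
            # Mask pixels where both bands are zero (no valid reflectance).
            return np.where(denom > 0, (nir - red) / denom, np.nan)

    # Hypothetical Sentinel-2 surface reflectance values (B08 = NIR, B04 = red).
    b08 = np.array([[0.42, 0.38], [0.30, 0.25]])
    b04 = np.array([[0.08, 0.10], [0.12, 0.15]])
    print(ndvi(b08, b04))
    ```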
  • Item
    Rumen microbiome response to sustained release mineral bolus supplement with low- and high-quality forages
    (Frontiers Media SA, 2023-06) Eberly, Jed O.; Wyffels, Samuel A.; Carlisle, Tanner J.; DelCurto, Timothy
    Introduction: Limited forage quantity and quality are challenges faced in livestock production systems in the semi-arid rangelands of the western United States, particularly when livestock face stressors such as cold weather or have increased nutritional requirements during pregnancy and lactation. To meet livestock nutrient requirements, producers frequently provide supplemental nutrition; however, there is limited knowledge regarding the effects of these practices on the rumen microbiome in these environments. Methods: A study was conducted to evaluate changes in the rumen microbiome in response to high- and low-quality forage with sustained-release mineral boluses. The study consisted of 16 ruminally cannulated 2- to 3-year-old Black Angus cows fed high-quality grass-alfalfa hay or low-quality grass hay with a 90- or 180-day sustained-release mineral bolus. Rumen samples were collected pre-feeding and 8 h post-feeding, and bacterial 16S rRNA gene amplicons were sequenced from the rumen fluid. Results: Alpha diversity as measured by Shannon’s diversity index decreased significantly over time (p < 0.01), averaging 5.6 pre-feeding and 5.4 post-feeding, and was not significantly different between high- and low-quality forages or between mineral bolus types (p > 0.05). Principal coordinates analysis (PCoA) of the Bray-Curtis dissimilarity matrix showed distinct grouping by feed quality and time but not by mineral bolus type. Bacteroidetes and Firmicutes were the dominant phyla in all treatments, and significant increases (p < 0.05) in the relative abundance of the family Lachnospiraceae and the genus Prevotella were observed in high-quality forage diets. Rumen VFA and NH3-N concentrations were also strongly associated with the high-quality forage diet. Predictive functional profiling indicated that functions associated with methanogenesis were negatively correlated with feed quality. Discussion: The results of this study suggest that mineral bolus type is unlikely to affect rumen bacterial community structure or function, while forage quality can significantly alter community structure and predicted functions associated with methanogenesis and VFA production.
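    The alpha-diversity metric reported above, Shannon's diversity index, is simple arithmetic over taxon relative abundances. A minimal sketch, using hypothetical amplicon counts rather than the study's data:

    ```python
    import numpy as np

    def shannon_index(counts):
        """Shannon's diversity index H' = -sum(p_i * ln(p_i)) over nonzero taxa."""
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        p = p[p > 0]  # zero-count taxa contribute nothing to H'
        return float(-(p * np.log(p)).sum())

    # Hypothetical taxon counts for a single rumen sample.
    print(round(shannon_index([120, 80, 40, 10, 5]), 2))
    ```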
  • Item
    Intercropping chickpea–flax for yield and disease management
    (Wiley, 2023-03) Zhou, Yi; Chen, Chengci; Franck, William L.; Khan, Qasim; Franck, Sooyoung; Crutcher, Frankie K.; McVay, Kent; McPhee, Kevin
    Ascochyta blight (caused by Ascochyta rabiei) is a primary concern of chickpea production worldwide. Intercropping chickpea with a non-host crop has the potential to suppress this disease and improve resource use efficiency for enhanced crop yield. This study aimed to evaluate the effects of seeding rate and row configuration of chickpea (Cicer arietinum L.)–flax (Linum usitatissimum L.) intercropping on (1) yield and seed quality, (2) disease incidence and severity of Ascochyta blight of chickpea, and (3) land productivity of this intercropping system. Field trials were conducted at the Eastern Agricultural Research Center, Sidney, MT, and the Southern Agricultural Research Center, Huntley, MT, in 2020 and 2021. Chickpea was planted with flax in four intercropping configurations (70% chickpea–30% flax in mixed rows, 50% chickpea–50% flax in alternate rows, 50% chickpea–50% flax in mixed rows, and 30% chickpea–70% flax in mixed rows). Chickpea yield decreased with increased flax proportion in the mixed-row intercrops. Flax displayed higher competitiveness than chickpea, resulting in decreased yield and protein concentration in chickpea but increased yield and protein content in flax. The land equivalent ratio of intercropping was greater than one, showing improved land productivity (2%–23% greater than monocropping). Intercropping reduced Ascochyta blight disease incidence and severity; the 50% chickpea–50% flax and 30% chickpea–70% flax configurations reduced disease severity to 50% (in Huntley) and 67% (in Sidney) of that in the monocrop. These results indicate that the seed ratio and planting configuration of chickpea–flax intercropping may be manipulated to increase land use efficiency and reduce Ascochyta blight in chickpea. The cultivar ‘CDC Leader’ yielded more than ‘Royal’ in the higher-disease-pressure environment at Huntley, indicating that selection of disease-resistant cultivars is important for managing Ascochyta blight in chickpea.
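    The land equivalent ratio referenced above compares intercrop yields against sole-crop yields; values above one indicate a land-use advantage. A minimal sketch with hypothetical yields (the trial's actual yields are not reproduced here):

    ```python
    def land_equivalent_ratio(y1_inter, y1_mono, y2_inter, y2_mono):
        """LER = Y1_intercrop/Y1_monocrop + Y2_intercrop/Y2_monocrop."""
        return y1_inter / y1_mono + y2_inter / y2_mono

    # Hypothetical chickpea and flax yields (kg/ha) under intercropping vs. monocropping.
    ler = land_equivalent_ratio(1200, 1800, 700, 1100)
    print(round(ler, 2))  # > 1 implies the intercrop uses land more efficiently
    ```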
  • Item
    Soil bacterial community response to cover crop introduction in a wheat-based dryland cropping system
    (Frontiers Media SA, 2022-11) Eberly, Jed O.; Bourgault, Maryse; Dafo, Julia M.; Yeoman, Carl J.; Wyffels, Samuel A.; Lamb, Peggy F.; Boss, Darrin L.
    The incorporation of cover crops into cropping systems is important for enhancing soil health in agricultural systems. Soil microbes contribute to soil health by supplying key nutrients and providing protection against plant pests, diseases, and abiotic stress. While research has demonstrated the connection between cover crops and soil microbiology, less is known regarding the impact of cover crops on the soil microbial community in semi-arid regions of the Northern Great Plains. Our objectives were to evaluate changes in soil bacterial community composition and community networks in wheat grown after multi-species cover crops. Cover crops were compared to continuous cropping and crop/fallow systems, and the effects of cover crop termination methods were also evaluated. Cover crops consisted of a cool-season multispecies mix, a mid-season multispecies mix, and a warm-season multispecies mix, which were grown in rotation with winter wheat. A continuous cropping (wheat/barley) and a wheat/fallow system were also included, along with cover crop termination by grazing, herbicide application, and haying. Cover crop treatments and termination methods had no significant impact on microbial community alpha diversity, and termination methods likewise had no significant impact on beta diversity. Families belonging to the phyla Actinobacteria, Bacteroidota, and Proteobacteria were more abundant in the cool-season cover crop treatment than in the warm-season cover crop treatment. Co-occurrence network analysis indicated that incorporation of cool-season cover crops or mid-season mixes in a wheat-based cropping system led to greater complexity and connectivity within these microbial networks compared to the other treatments, which suggests these communities may be more resilient to environmental disturbances.
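    Co-occurrence networks of the kind described above are commonly built by thresholding pairwise correlations between taxa and then summarizing network complexity and connectivity. The sketch below is a generic illustration with hypothetical abundance data and arbitrary cutoffs, not the authors' workflow:

    ```python
    import numpy as np
    import networkx as nx
    from scipy.stats import spearmanr

    # Hypothetical relative-abundance matrix: rows = samples, columns = taxa.
    rng = np.random.default_rng(0)
    abundances = rng.random((20, 6))
    taxa = [f"taxon_{i}" for i in range(abundances.shape[1])]

    rho, pval = spearmanr(abundances)  # pairwise Spearman correlations between columns

    G = nx.Graph()
    G.add_nodes_from(taxa)
    for i in range(len(taxa)):
        for j in range(i + 1, len(taxa)):
            # Keep only strong, nominally significant correlations as network edges.
            if abs(rho[i, j]) >= 0.6 and pval[i, j] < 0.05:
                G.add_edge(taxa[i], taxa[j], weight=rho[i, j])

    # Simple connectivity/complexity summaries for comparing treatments.
    print(G.number_of_edges(), nx.density(G))
    ```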
  • Item
    Effects of Maternal Protein Supplementation at Mid-Gestation of Cows on Intake, Digestibility, and Feeding Behavior of the Offspring
    (MDPI AG, 2022-10) Nascimento, Karolina Batista; Galvão, Matheus Castilho; Moreno Meneses, Javier Andrés; Miranda Moreira, Gabriel; Ramírez-Zamudio, German Darío; Priscilla de Souza, Stefania; Dias Prezotto, Ligia; Haddad Lima Chalfun, Luthesco; de Souza Duarte, Marcio; Rume Casagrande, Daniel; Pies Gionbelli, Mateus
    This study aimed to assess the effects of maternal protein supplementation and offspring sex (OS) on the intake parameters of the offspring. Forty-three Tabapuã cows were randomly allocated to the following treatments: protein supplementation (PS) during days 100–200 of gestation (RES, 5.5% total crude protein (CP), n = 24, or CON, 10% total CP, n = 19) and OS (females, n = 20; males, n = 23). The offspring were evaluated during the cow–calf (0–210 days), backgrounding (255–320 days), growing 1 (321–381 days), and growing 2 (382–445 days) phases. The CON offspring tended to present higher dry matter intake (DMI) at weaning (p = 0.06). The CON males presented lower digestibility of major diet components in the growing 2 phase (p ≤ 0.02). The CON offspring spent 52% more time per day eating supplements at 100 days and 17% less time in idleness at 210 days. The CON males spent 15 min more per day ruminating than RES males in the feedlot phase (p = 0.01). We concluded that protein supplementation over gestation alters the offspring feed intake pattern as a whole, while protein restriction promotes compensatory responses in nutrient digestibility in males.
  • Item
    Intercropping chickpea-flax for yield and disease management
    (Wiley, 2022-12) Zhou, Yi; Chen, Chengci; Franck, William L.; Khan, Qasim; Franck, Sooyoung; Crutcher, Frankie K.; McVay, Kent; McPhee, Kevin
    Ascochyta blight (caused by Ascochyta rabiei) is a primary concern of chickpea production worldwide. Intercropping chickpea with a non-host crop has the potential to suppress this disease and improve resource use efficiency for enhanced crop yield. This study aimed to evaluate the effects of seeding rate and row configuration of chickpea (Cicer arietinum L.)-flax (Linum usitatissimum L.) intercropping on (1) yield and seed quality, (2) disease incidence and severity of Ascochyta blight of chickpea, and (3) land productivity of this intercropping system. Field trials were conducted at the Eastern Agricultural Research Center (EARC), Sidney, MT, and the Southern Agricultural Research Center (SARC), Huntley, MT, in 2020 and 2021. Chickpea was planted with flax in four intercropping configurations (70% chickpea – 30% flax in mixed rows, 50% chickpea – 50% flax in alternate rows, 50% chickpea – 50% flax in mixed rows, and 30% chickpea – 70% flax in mixed rows). Chickpea yield decreased with increased flax proportion in the mixed-row intercrops. Flax displayed higher competitiveness than chickpea, resulting in decreased yield and protein concentration in chickpea but increased yield and protein content in flax. The land equivalent ratio (LER) of intercropping was greater than 1, showing improved land productivity (2%–23% greater than monocropping). Intercropping reduced Ascochyta blight disease incidence and severity; the 50% chickpea – 50% flax and 30% chickpea – 70% flax intercropping configurations could reduce disease severity to 50% (in Huntley) and 67% (in Sidney) of that in the monocropping. These results indicated that seed ratio and planting configurations of chickpea-flax intercropping may be manipulated to increase land use efficiency and reduce Ascochyta blight in chickpea. ‘CDC Leader’ yielded greater than ‘Royal’ in the higher disease pressure environment in Huntley, indicating that selection of disease-resistant cultivars is important for managing Ascochyta blight in chickpea.
  • Item
    Introducing cover crops as a fallow replacement in the Northern Great Plains: I. Evaluation of cover crop mixes as a forage source for grazing cattle
    (Cambridge University Press, 2021-09) Wyffels, Samuel A.; Bourgault, Maryse; Dafoe, Julia M.; Lamb, Peggy F.; Boss, Darrin L.
    Research on crop-livestock integration has demonstrated that cover crops can be terminated by livestock grazing with minimal negative impacts on soil health; however, it provides little information on system-level approaches that mutually benefit soil health and both crop and livestock production. Therefore, the objective of this research was to examine the effects of cover crop mixtures on biomass production, forage quality, and the potential for nitrate toxicity in a dryland wheat-cover crop rotation. This research was conducted at the Montana State University-Northern Agricultural Research Center near Havre, MT (48°29′N, 109°48′W) from 2012 to 2019. The experiment was conducted as a randomized complete block design, in which 29 individual species were utilized in 15 different cover crop mixtures in a wheat-cover crop rotation. Cover crop mixtures were classified into four treatment groups: (1) cool-season species, (2) warm-season species dominant, (3) a cool- and warm-season species mixture (mid-season), and (4) a barley (Hordeum vulgare) control. All cover crop mixtures were terminated at anthesis of the cool-season cereal species to avoid volunteer cereal grains in the following wheat crop. At the time of cover crop termination, dry matter forage production was estimated and analyzed for crude protein, total digestible nutrients, and nitrates as indicators of forage quality. All mixtures containing oats (Avena sativa) had greater (P ⩽ 0.03) biomass production than other mixtures within their respective treatment groups (cool- and mid-season). Forage biomass was influenced by cover crop treatment group, with the barley control producing the greatest (P < 0.01) amount of forage biomass compared to the cool-, mid- and warm-season cover crop treatments. Total digestible nutrients were greater (P < 0.01) in the barley control compared to the cool- and mid-season treatment groups. Crude protein was greatest in the warm-season treatment group (P < 0.01) compared to the barley control and the cool- and mid-season treatment groups. The barley control produced fewer nitrates (P ⩽ 0.05) than the cool-, mid- and warm-season treatment groups; however, all cover crop mixtures produced nitrates at levels unsafe for livestock consumption in at least one year of the study. The relatively high and variable nitrate levels of all cover crop mixtures across years suggest that forage should be tested for nitrates before grazing. In conclusion, our research suggests that in a dryland wheat-cover crop rotation requiring early-July termination, cool-season cover crop mixtures are the most suitable forage source for livestock grazing in most years.
  • Item
    Distinctive germination attributes of feather fingergrass (Chloris virgata) biotypes in response to different thermal conditions
    (Cambridge University Press, 2022-06) Desai, Het Samir; Chauhan, Bhagirath Singh
    An in-depth understanding of the germination response of troublesome weed species, such as feather fingergrass (Chloris virgata Sw.), to environmental factors (temperature, soil moisture, etc.) could play an essential role in the development of sustainable site-specific weed control programs. A laboratory experiment was conducted to understand the germination response of 10 different biotypes of C. virgata to five temperature regimes (ranging from 15/5 to 35/25 C) under a 12/12-h (light/dark) photoperiod. No consistent germination behavior was observed between biotypes, as some biotypes demonstrated high final cumulative germination (FCG) at low alternating temperature regimes (15/5 and 20/10 C) and some biotypes exhibited high FCG at a high alternating temperature regime (30/20 C). All biotypes showed late germination initiation (T10, time taken to reach 10% germination) at the lowest temperature regime (15/5 C), ranging from 171 to 173 h, whereas the time required to reach 90% germination (T90) ranged from 202 to 756 h. At the higher alternating temperature regimes (30/20 and 35/25 C), all biotypes initiated germination (T10) within 40 h, and a wide range of times was required to reach 90% germination (T90), from 284 to 1,445 h. Differences in FCG among the biotypes at all temperature regimes demonstrated the differential germination behavior among biotypes of C. virgata. Cool temperatures delayed germination initiation compared with warmer temperatures, even though FCGs were similar across a wide range of thermal conditions, indicating that this species will be problematic throughout the calendar year in different agronomic environments. The data from this study have direct implications for scheduling herbicide protocols, tillage timing, and planting time. Therefore, data generated from this study can aid in the development of area- and species-specific weed control protocols to achieve satisfactory control of this weed species.
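    The T10 and T90 values reported above are the times needed to reach 10% and 90% germination; one common way to estimate them is linear interpolation along the observed cumulative germination curve. A minimal sketch with hypothetical counts, computing the thresholds relative to final cumulative germination (an assumption, since studies differ on whether the percentage refers to total seeds or final germination):

    ```python
    import numpy as np

    def time_to_fraction(hours, cumulative_pct, fraction):
        """Interpolate the time at which germination reaches `fraction` of its final value."""
        target = fraction * cumulative_pct[-1]
        return float(np.interp(target, cumulative_pct, hours))

    # Hypothetical cumulative germination (%) recorded at each observation time (h).
    hours = np.array([0, 24, 48, 96, 168, 240, 336])
    germ = np.array([0, 2, 10, 35, 60, 72, 75])

    t10 = time_to_fraction(hours, germ, 0.10)  # time to 10% of final germination
    t90 = time_to_fraction(hours, germ, 0.90)  # time to 90% of final germination
    print(round(t10, 1), round(t90, 1))
    ```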
  • Item
    Crop rotation influences yield more than soil quality at a semiarid location
    (Wiley, 2022-07) McVay, Kent; Khan, Qasim A.
    Winter wheat (Triticum aestivum L.) is often rotated with fallow in semiarid regions to conserve soil moisture and minimize crop failures. We hypothesized that direct-seed systems coupled with intensified and more diverse crop rotations would produce an equivalent or greater annualized grain yield than a traditional winter wheat–fallow (WF) system, and that continuous cropping would accumulate greater soil carbon than the traditional WF. A 6-yr study was conducted to evaluate crop rotations that included pea (Pisum sativum L.), lentil (Lens culinaris Medik.), lentil as a cover crop, spring wheat, or camelina [Camelina sativa (L.) Crantz] in rotation with winter wheat in 2- and 3-yr rotations. The results of this study, averaged over 6 yr, showed that increased cropping intensity produced an annualized yield equal to that of WF, provided that either a fallow or a cover crop rather than another grain crop preceded winter wheat. Among soil quality indices, particulate organic matter (POM) increased in rotations with greater cropping intensity (1.00 vs. 0.67), although the POM of these rotations was not different from that of WF.
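    Comparing a wheat–fallow rotation with more intensive rotations, as above, is typically done on an annualized basis: total grain produced over a full rotation cycle divided by the cycle length, with fallow years counting as zero. A small arithmetic sketch with hypothetical yields:

    ```python
    def annualized_yield(yields_by_year):
        """Mean grain yield per year over one full rotation cycle (fallow years count as zero)."""
        return sum(yields_by_year) / len(yields_by_year)

    # Hypothetical kg/ha: winter wheat-fallow vs. a pea-winter wheat rotation.
    wheat_fallow = [3400, 0]   # wheat year, fallow year
    pea_wheat = [1500, 2600]   # pea year, wheat year
    print(annualized_yield(wheat_fallow), annualized_yield(pea_wheat))
    ```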
  • Item
    Haskap maturity stages and their influence on postharvest berry quality
    (Canadian Science Publishing, 2022-06) Leisso, Rachel S.; Jarrett, Bridgid; Richter, Rebecca; Miller, Zachariah J.
    Limited information is available regarding haskap berry maturity and the corresponding postharvest characteristics. We assessed detached berry quality, respiration rate, and ethylene production at five stages of maturity and compared the influence of postharvest storage on berries harvested at the half-blue and softening stages. The increase in ethylene production at successive stages suggests its involvement in berry maturation, but the concomitant respiration pattern does not support classifying haskap ripening as climacteric. Results indicate that harvesting at the less mature half-blue stage is not recommended, as those berries had lower fresh weight and inferior quality relative to berries harvested at the softening stage, both at harvest and following 14 d of storage.
  • Item
    Does Elevated [CO2] Only Increase Root Growth in the Topsoil? A FACE Study with Lentil in a Semi-Arid Environment
    (MDPI, 2021-03) Bourgault, Maryse; Tausz-Posch, Sabine; Greenwood, Mark; Löw, Markus; Henty, Samuel; Armstrong, Roger D.; O’Leary, Garry L.; Fitzgerald, Glenn J.; Tausz, Michael
    Atmospheric carbon dioxide concentrations [CO2] are increasing steadily. Some reports have shown that elevated [CO2] (e[CO2]) stimulates root growth in grain crops mostly in the topsoil rather than evenly throughout the soil profile, which is not optimal for crops grown in semi-arid environments with a strong reliance on stored water. An experiment was conducted during the 2014 and 2015 growing seasons with two lentil (Lens culinaris) genotypes grown under Free Air CO2 Enrichment (FACE), in which root growth was observed non-destructively with mini-rhizotrons approximately every 2–3 weeks. Root growth was not always statistically increased by e[CO2], and the response was not consistent across depths and genotypes. In 2014, root growth in the top 15 cm of the soil profile (topsoil) was indeed increased by e[CO2], but increases at lower depths (30–45 cm) later in the season were greater than in the topsoil. In 2015, e[CO2] increased root length in the topsoil for only one genotype, potentially reflecting the lack of plant-available soil water at 30–60 cm until recharge by irrigation during grain filling. Our limited data for comparing responses to e[CO2] showed that root length increases in the topsoil were correlated with a lower yield response to e[CO2], whereas greater yield responses were correlated with increases in root growth below 30 cm depth.
  • Item
    Genotypic variability in root length in pea (Pisum sativum L.) and lentil (Lens culinaris Medik.) cultivars in a semi-arid environment based on mini-rhizotron image capture
    (Wiley, 2022-01) Bourgault, Maryse; Lamb, Peggy F.; McPhee, Kevin; McGee, Rebecca J.; Vandenberg, Albert; Warkentin, Tom
    Physiological breeding is an approach that complements conventional breeding by providing characterizations of traits present in breeding populations. This allows breeders to choose crosses based on desirable and adaptive traits, an approach that may be more reliable than selection on yield alone. In this study, we determined how much genotypic variability was present in selected lines of modern field pea (Pisum sativum L.) and lentil (Lens culinaris Medik.) cultivars from Montana, North Dakota, Washington, and Saskatchewan, Canada, and whether root growth, particularly at depth, improves the fitness of lines for semi-arid environments. We conducted experiments at the Northern Agricultural Research Center of Montana State University from 2017 to 2019 inclusive to investigate root growth with mini-rhizotrons in 29 field pea lines and 25 lentil lines. Results suggest there is large genotypic variability in root length across the soil profile and in the proportion of root length found below 30 cm in both crops, and these root traits appear independent of each other. In field pea, the highest-yielding cultivars were intermediate in both total root length and the proportion of root length below 30 cm, suggesting large root systems and/or deeper root profiles are not necessarily beneficial in this environment. By contrast, in lentil, total root length and root length below 30 cm were well correlated with biomass and yield. For breeders interested in improved adaptation to semi-arid environments, it may be too early to optimize root systems, and above-ground traits may still yield a better return on investment.
  • Item
    Introducing cover crops as fallow replacement in the Northern Great Plains: II. Impact on following wheat crops
    (Cambridge University Press, 2021-12) Bourgault, Maryse; Wyffels, Samuel A.; Dafoe, Julia M.; Lamb, Peggy F.; Boss, Darrin L.
    Research on crop-livestock integration has demonstrated that cover crops can be terminated by livestock grazing with minimal negative impacts on soil health; however, it provides little information on system-level approaches that mutually benefit soil health and both crop and livestock production. Therefore, the objective of this research was to examine the effects of cover crop mixtures on biomass production, forage quality, and the potential for nitrate toxicity in a dryland wheat-cover crop rotation. This research was conducted at the Montana State University-Northern Agricultural Research Center near Havre, MT (48°29′N, 109°48′W) from 2012 to 2019. The experiment was conducted as a randomized complete block design, in which 29 individual species were utilized in 15 different cover crop mixtures in a wheat-cover crop rotation. Cover crop mixtures were classified into four treatment groups: (1) cool-season species, (2) warm-season species dominant, (3) a cool- and warm-season species mixture (mid-season), and (4) a barley (Hordeum vulgare) control. All cover crop mixtures were terminated at anthesis of the cool-season cereal species to avoid volunteer cereal grains in the following wheat crop. At the time of cover crop termination, dry matter forage production was estimated and analyzed for crude protein, total digestible nutrients, and nitrates as indicators of forage quality. All mixtures containing oats (Avena sativa) had greater (P ⩽ 0.03) biomass production than other mixtures within their respective treatment groups (cool- and mid-season). Forage biomass was influenced by cover crop treatment group, with the barley control producing the greatest (P < 0.01) amount of forage biomass compared to the cool-, mid- and warm-season cover crop treatments. Total digestible nutrients were greater (P < 0.01) in the barley control compared to the cool- and mid-season treatment groups. Crude protein was greatest in the warm-season treatment group (P < 0.01) compared to the barley control and the cool- and mid-season treatment groups. The barley control produced fewer nitrates (P ⩽ 0.05) than the cool-, mid- and warm-season treatment groups; however, all cover crop mixtures produced nitrates at levels unsafe for livestock consumption in at least one year of the study. The relatively high and variable nitrate levels of all cover crop mixtures across years suggest that forage should be tested for nitrates before grazing. In conclusion, our research suggests that in a dryland wheat-cover crop rotation requiring early-July termination, cool-season cover crop mixtures are the most suitable forage source for livestock grazing in most years.
  • Item
    Influence of residual feed intake and cow age on dry matter intake postweaning and peak lactation
    (Oxford University Press, 2021-11) Parsons, Cory T; Dafoe, Julia M; Wyffels, Samuel A; DelCurto, Timothy; Boss, Darrin L
    Supplemental nutrition for cattle is the greatest operating cost for cow-calf producers, accounting for 65% of annual expenses (Meyer et al., 2008). Therefore, selection pressure for efficient animals that have lower feed intake but maintain production, or average intake with higher production, could have positive impacts on cow–calf profitability (Meyer et al., 2008). Thus, improving feed efficiency through genetic selection holds significant opportunity for the beef industry. Residual feed intake (RFI) is currently being used as a selection tool for purchasing and retaining heifers and for selecting bulls and semen. Most studies have used steers and terminal heifers when evaluating the impact of RFI on various aspects of beef cattle production (Kelly et al., 2010). Additionally, the majority of RFI studies have included energy-dense diets and rations focusing on feedlot performance (Lawrence et al., 2011). However, the use and relevance of RFI as a selection tool for the commercial cow–calf industry need further research (Manafiazar et al., 2015). Research pertaining to RFI of cattle offered forage-based diets is limited (Arthur et al., 2005), with even fewer data available related to forage-based beef cattle production systems (Meyer et al., 2008). As a result, more research is needed to evaluate the utility of RFI estimates for beef production in extensive forage-based systems (Kenny et al., 2018). Therefore, the objectives of this study were to evaluate the influence of heifer postweaning RFI and cow age on dry matter intake (DMI), intake behavior, and milk production of dry-lotted Black Angus beef cows. We hypothesized that heifers identified as low RFI would eat less and that the influence of RFI would interact with cow age.
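    Residual feed intake, as discussed above, is typically calculated as the residual from a regression of dry matter intake on metabolic body weight and average daily gain; animals with negative residuals eat less than predicted for their size and gain. A minimal sketch with hypothetical heifer records (not the study's model or data):

    ```python
    import numpy as np

    # Hypothetical per-animal records: dry matter intake (kg/d), body weight (kg), average daily gain (kg/d).
    dmi = np.array([9.8, 10.5, 8.9, 11.2, 10.1])
    bw = np.array([420.0, 455.0, 410.0, 470.0, 440.0])
    adg = np.array([0.95, 1.10, 0.90, 1.05, 1.00])

    # Design matrix: intercept, metabolic body weight (BW^0.75), and ADG.
    X = np.column_stack([np.ones_like(dmi), bw ** 0.75, adg])
    coef, *_ = np.linalg.lstsq(X, dmi, rcond=None)

    rfi = dmi - X @ coef  # residual feed intake: observed minus predicted intake
    print(np.round(rfi, 2))  # negative values = animals eating less than predicted
    ```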
  • Item
    Durum wheat yield and protein influenced by nitrogen management and cropping rotation
    (Informa UK Limited, 2022-04) Chen, Chengci; Zhou, Shuang; Afshar, Reza Keshavarz; Franck, William; Zhou, Yi
    Nitrogen (N) is the major input for cereal grain production. N management in durum wheat (Triticum turgidum subsp. durum) is critical for optimizing grain yield, protein concentration, and nitrogen fertilizer use efficiency. A two-year study was conducted in the semi-arid region of the US Northern Great Plains (NGP) to investigate nitrogen input levels and application methods under fallow-durum and pea-durum systems. Durum wheat (cv. Joppa) was planted following fallow or field pea with N input levels of 65 and 135 kg ha−1 and four application methods for each N input level. Results showed that water was the major limiting factor determining grain yield and protein concentration. Grain yield was greater following fallow (1958 kg ha−1) than following field pea (1754 kg ha−1), with similar protein concentrations (16.7% and 16.4%, respectively). Increasing N input from 65 kg ha−1 to 135 kg ha−1 decreased grain yield from 1933 to 1779 kg ha−1 but improved protein concentration from 16.3 to 16.8%, which resulted in a negative nitrogen use efficiency (NUE). N application method did not significantly affect yield and protein, but there was a trend toward a yield increase with split application of N at the lower rate in a wetter year. The drought in 2017 resulted in lower test weight and harvest index (HI). The HI was lower at the 135 kg ha−1 N rate than at the 65 kg ha−1 N rate, especially in the drier year. Excessive N inputs in a water-limited environment may result in ‘haying-off’.
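    The negative nitrogen use efficiency noted above follows from the reported yields: when the higher N rate produces less grain, the incremental yield per added kilogram of N is below zero. The sketch below works that arithmetic using one common definition (agronomic efficiency of the added N); the definition is an assumption, not necessarily the one used by the authors:

    ```python
    def incremental_nue(yield_high, yield_low, n_high, n_low):
        """Extra grain produced per extra kg of N applied (kg grain per kg N)."""
        return (yield_high - yield_low) / (n_high - n_low)

    # Yields (kg/ha) reported in the abstract for the 65 and 135 kg N/ha rates.
    print(round(incremental_nue(1779, 1933, 135, 65), 2))  # negative: yield fell as N increased
    ```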
  • Item
    Influence of planting date and herbicide program on Amaranthus palmeri control in dry bean
    (Cambridge University Press, 2021-11) Beiermann, Clint W.; Creech, Cody F.; Knezevic, Stevan Z.; Jhala, Amit J.; Harveson, Robert; Lawrence, Nevin C.
    Late-emerging summer annual weeds are difficult to control in dry bean production fields. Dry bean is a poor competitor with weeds due to its slow rate of growth and delayed canopy formation. Palmer amaranth is particularly difficult to control due to season-long emergence and resistance to acetolactate synthase (ALS)-inhibiting herbicides. Dry bean growers rely on PPI and preemergence residual herbicides as the foundation of their weed control programs; however, postemergence herbicides are often needed for season-long weed control. The objective of this experiment was to evaluate the effect of planting date and herbicide program on late-season weed control in dry bean in western Nebraska. Field experiments were conducted in 2017 and 2018 near Scottsbluff, NE. The experiment was arranged in a split-plot design, with planting date and herbicide program as main-plot and subplot factors, respectively. Delayed planting was represented by a delay of 15 d after the standard planting time. The treatments EPTC + ethalfluralin, EPTC + ethalfluralin followed by (fb) imazamox + bentazon, and pendimethalin + dimethenamid-P fb imazamox + bentazon resulted in the lowest Palmer amaranth density at 3 wk after treatment and the highest dry bean yield. The imazamox + bentazon treatment provided poor Palmer amaranth control and did not consistently reduce Palmer amaranth density and biomass compared with the nontreated control. In 2018, the delayed planting treatment reduced Palmer amaranth biomass with the pendimethalin + dimethenamid-P treatment compared with standard planting. Delaying planting did not reduce dry bean yield but had limited benefit in improving weed control in dry bean.
  • Item
    Haskap Preharvest Fruit Drop and Stop-drop Treatment Testing
    (American Society for Horticultural Science, 2021-12) Leisso, Rachel S.; Jarrett, Bridgid; Miller, Zachariah J.
    Haskap (Lonicera caerulea), also known as honeyberry, is a relatively new fruit crop in North America. To date, most academic activity and research in North America involving haskap has focused on cultivar development and health benefits, with relatively few field experiments providing information to guide field planning and harvest management for the recently released cultivars. In 2020, we documented preharvest fruit drop (PHFD) rates for 15 haskap cultivars planted in a randomized block design at our research center in western Montana, with the aim of preliminarily determining whether certain cultivars may be prone to this phenomenon. Additionally, we evaluated two plant growth regulators (PGRs) for reducing PHFD in two cultivars previously observed to have high rates of PHFD. Results suggest cultivar-specific variation in PHFD near berry maturation. Because haskap harvest indices are not well-defined and may be cultivar-specific, we share our 1-year study results as preliminary information and as a call for further research. Cultivars Aurora, Boreal Blizzard, Borealis, Indigo Gem, Kapu, and Tana all had PHFD rates less than 12% of yield, where yield is the weight of berries lost to PHFD plus marketable yield, and marketable yield is fruit remaining on the shrub at harvest. Cultivars Chito, Kawai, and Taka had the highest rates of PHFD, although marketable yields were still relatively high, especially for Kawai. We note that ease of fruit detachment is an important consideration in mechanical harvest, and this characteristic could be advantageous if managed appropriately. The PGRs evaluated (1-naphthaleneacetic acid and aminoethoxyvinylglycine) did not influence PHFD rates; however, our study was limited by the sample size and by the lack of information regarding haskap abscission physiology. In summary, the haskap cultivars evaluated exhibited variable PHFD rates in the year of the study, and further research is needed to understand haskap fruit maturation, harvest indices, and abscission.
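    The PHFD rates above use the definition stated in the abstract: total yield is the weight of dropped berries plus marketable yield, and the drop rate is the dropped fraction of that total. A minimal arithmetic sketch with hypothetical per-shrub weights:

    ```python
    def phfd_rate(dropped_kg, marketable_kg):
        """Preharvest fruit drop as a fraction of total yield (dropped + marketable)."""
        total = dropped_kg + marketable_kg
        return dropped_kg / total

    # Hypothetical per-shrub weights (kg).
    print(round(100 * phfd_rate(0.35, 2.8), 1), "% of yield lost to PHFD")
    ```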
  • Item
    Cover crops to improve soil health in the North American Great Plains
    (Wiley, 2021-09) Obour, Augustine K.; Simon, Logan M.; Holman, Johnathon D.; Carr, Patrick M.; Schipanski, Meagan; Fonte, Steven; Ghimire, Rajan; Nleya, Thandiwe; Blanco‐Canqui, Humberto
    Rotating cereal crops (e.g., wheat [Triticum aestivum L.]) with a 10- to 21-mo summer fallow period (fallow) is a common farming practice in dryland (rainfed) agricultural regions. Fallow is associated with several challenges, including low precipitation storage efficiency, depletion of soil organic carbon (SOC), loss of soil fertility, little crop residue retention, soil erosion, and few control options for herbicide-resistant (HR) weeds. The inability to effectively control HR weeds poses a major challenge to maintaining soil and water conservation practices such as no-tillage, as some producers are considering tillage to control weeds. Cover crop (CC) integration into wheat-based production systems to replace portions of the fallow period provides an opportunity to increase SOC, improve soil fertility, suppress weeds, and increase the profitability of dryland crop production, especially when CCs are used as forage. This forum paper used the North American Great Plains as a model region to review information on (a) challenges of dryland agriculture; (b) integrating CCs in dryland agriculture; (c) benefits, challenges, and limitations of CCs in dryland crop production; (d) management options for CC integration in dryland grain systems; and (e) recommendations for future research efforts.