Heifers with an expected increased risk of bovine respiratory disease (BRD; n = 360; initial BW = 241.3 ± 16.6 kg) were assembled at a Kentucky order-buyer facility and delivered to Stillwater, OK, in September 2007 to determine the effects of clinical BRD observed during preconditioning on subsequent feedlot performance, carcass characteristics, and meat attributes. During a 63-d preconditioning period, morbidity and mortality attributed to BRD were 57.6 and 8.6%, respectively. Immediately after preconditioning, heifers were grouped according to health outcome category and allotted to finishing pens (5 to 7 heifers/pen). Heifers were never treated for BRD (0X; n = 9 pens), treated 1 time (1X; n = 9 pens), 2 times (2X; n = 6 pens), 3 times (3X; n = 6 pens), or designated as chronically ill (CI; n = 2 pens). Arrival BW did not differ (P = 0.21) among treatment categories. However, disease incidence during preconditioning decreased (P < 0.001) growth, resulting in BW of 318, 305, 294, 273, and 243 kg for 0X, 1X, 2X, 3X, and CI, respectively, at the start of the finishing phase. Ultrasound estimates of the LM, taken on d 65 and 122, were combined with BW and visual appraisal to target a common average endpoint within category and block. On average, heifers were slaughtered on d 163 for 0X, 1X, and 2X, d 182 for 3X, and d 189 for CI (P < 0.01). Final BW was similar (P ≥ 0.18) for heifers treated 0, 1, 2, or 3 times, but heifers deemed CI weighed less (P = 0.01) than 3X heifers. Considering the finishing phase only, ADG increased linearly (P < 0.001) with the number of BRD treatments, but over the full period from arrival to slaughter, ADG decreased linearly (P = 0.003) as BRD treatments increased. Therefore, G:F was greater (P = 0.007) for CI than 3X and increased linearly (P = 0.002) from 0X to 3X. Similar to BW, HCW was less (P = 0.03) for CI than 3X. Marbling score tended (P = 0.06) to decrease linearly as the number of treatments increased, but no other differences (P ≥ 0.24) in carcass traits were detected. No differences were observed in beef tenderness (P = 0.65), and no consistent trends were noted in retail display or palatability data. Fewer than 20 additional days on feed were required for heifers treated 3 times to reach BW and carcass characteristics similar to those of heifers never treated for BRD. Segregating animals with multiple BRD treatments and feeding them to an acceptable carcass endpoint may be a viable strategy for increasing the value of animals treated for BRD.
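The performance measures reported in the abstract above (ADG and G:F) follow directly from pen-level weight and feed records. The sketch below, written with hypothetical input values, shows how these two metrics are conventionally derived; the function names and numbers are illustrative assumptions, not data from the study.

```python
# Sketch of standard feedlot performance calculations (hypothetical inputs).

def average_daily_gain(initial_bw_kg: float, final_bw_kg: float, days_on_feed: int) -> float:
    """ADG (kg/d): total BW gain divided by days on feed."""
    return (final_bw_kg - initial_bw_kg) / days_on_feed

def gain_to_feed(adg_kg: float, dmi_kg: float) -> float:
    """G:F: ADG divided by daily dry matter intake (kg gain per kg DM consumed)."""
    return adg_kg / dmi_kg

# Example with made-up values roughly in the range of a finishing phase.
adg = average_daily_gain(initial_bw_kg=305.0, final_bw_kg=545.0, days_on_feed=163)
print(f"ADG = {adg:.2f} kg/d, G:F = {gain_to_feed(adg, dmi_kg=9.5):.3f}")
```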
The objective was to determine the effects of an intratracheal Mannheimia haemolytica challenge after a 72-h exposure to calves persistently infected (PI) with bovine viral diarrhea virus type 1b (BVDV1b) on serum antibody production, white blood cell count (WBC), cytokine concentrations, and blood gases in feedlot steers. Twenty-four steers (initial BW = 314 ± 31 kg) were randomly allocated to 1 of 4 treatments (6 steers/treatment) arranged as a 2 × 2 factorial. Treatments were 1) steers neither exposed to steers PI with BVDV nor challenged with M. haemolytica (control; CON); 2) steers exposed to 2 steers PI with BVDV for 72 h (BVD); 3) steers intratracheally challenged with M. haemolytica (MH); and 4) steers exposed to 2 steers PI with BVDV for 72 h and challenged with M. haemolytica (BVD+MH). There were 12 h between exposure to PI steers and the M. haemolytica challenge. Rectal temperature was increased (P < 0.001) for MH and BVD+MH during the initial 24 h after the M. haemolytica challenge. For MH and BVD+MH, total WBC count was increased (P < 0.01) at 36 h post M. haemolytica challenge compared with CON, whereas in BVD steers, WBC count was decreased (P < 0.01). Total lymphocyte count was increased (P = 0.004) during the initial 72 h post BVDV exposure for the BVD and BVD+MH groups compared with MH and CON, and this difference remained at 96 h post M. haemolytica challenge. An increased (P < 0.001) total neutrophil count was observed during the initial 36 h for the MH group and at 72 h for the BVD+MH challenge group. Interleukin 1β, IL-6, and tumor necrosis factor α (TNFα) concentrations were greater (P
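The 2 × 2 factorial arrangement described above crosses two factors (BVDV exposure and M. haemolytica challenge), so each response can be tested for both main effects and their interaction. A minimal sketch of that analysis at a single sampling time is shown below using statsmodels on a hypothetical data frame; the column names and all values are assumptions for illustration, not data from the study.

```python
# Sketch: main effects and interaction for a 2 x 2 factorial (hypothetical data).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical WBC counts (10^3 cells/uL) for 6 steers per treatment at one time point.
df = pd.DataFrame({
    "bvdv": ["no"] * 12 + ["yes"] * 12,            # exposed to PI steers or not
    "mh":   (["no"] * 6 + ["yes"] * 6) * 2,        # intratracheal M. haemolytica or not
    "wbc":  [8.1, 7.9, 8.4, 8.0, 7.7, 8.2,         # CON (made-up values)
             11.5, 12.0, 11.2, 12.4, 11.8, 11.9,   # MH
             6.9, 7.1, 6.6, 7.0, 6.8, 7.2,         # BVD
             11.0, 11.6, 12.1, 11.4, 11.9, 11.3],  # BVD+MH
})

model = smf.ols("wbc ~ C(bvdv) * C(mh)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-tests for BVDV, MH, and BVDV x MH
```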
Remote rumen temperature monitoring is a potential method for early disease detection in beef cattle. This experiment was conducted to determine if remotely monitored rumen temperature boluses could detect a temperature change in steers exposed to bovine viral diarrhea virus (BVDV) and challenged with a common bovine respiratory disease pathogen, Mannheimia haemolytica (MH). Twenty-four Angus crossbred steers (BW = 313 ± 31 kg) were allotted to 1 of 4 treatments: 1) no challenge (control); 2) challenge by a 72-h exposure to 2 steers persistently infected with BVDV; 3) bacterial challenge with MH; and 4) viral challenge by a 72-h exposure to 2 steers persistently infected with BVDV followed by bacterial challenge with MH (BVDV + MH). Remotely monitored rumen temperature boluses programmed to transmit temperature every minute were placed in the rumen before the time of exposure to steers persistently infected with BVDV. Rectal temperatures were taken before MH challenge (0) and at 2, 4, 6, 12, 18, 24, 36, 48, 72, and 96 h after MH challenge. Rumen temperatures were recorded 3 d before (-72 h; period of BVDV exposure) through 14 d after (336 h) MH challenge. Rumen temperatures were analyzed as a randomized complete block design with a 2 × 2 factorial arrangement of treatments and a first-order autoregressive covariance structure for repeated measures. A treatment × day interaction was observed for average daily rumen temperature (P < 0.01). A treatment difference (P < 0.01) was observed on d 0, when MH-challenged steers had greater rumen temperatures than steers not challenged with MH. There was no BVDV × day interaction (P > 0.01). Rumen temperatures averaged every 2 h resulted in a BVDV × hour interaction (P < 0.01) and an MH × hour interaction (P < 0.01). The BVDV × hour differences occurred at h -18 to -14, 40 to 46, 110, 122, and 144 to 146 (P < 0.01). The MH × hour difference occurred at h 4 to 24 (P < 0.01). Maximum rumen temperature was increased (P < 0.01) for BVDV (0.8 °C), MH (1.2 °C), and BVDV + MH (1.3 °C) compared with the control. On average, rumen temperatures measured by the boluses at the same time points as the rectal temperatures were 0.13 °C less than rectal temperatures, and the 2 body temperatures were highly correlated (r = 0.89). Rumen temperature boluses appear to have potential as a tool for detecting temperature changes associated with adverse health events such as exposure to bovine respiratory disease and BVDV.
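The agreement reported above between bolus and rectal readings (a mean offset of 0.13 °C and r = 0.89) reduces to a mean paired difference and a Pearson correlation. The short sketch below shows that comparison on hypothetical paired readings; the temperature values are illustrative only and not taken from the study.

```python
# Sketch: comparing paired rumen-bolus and rectal temperatures (hypothetical readings, deg C).
import numpy as np
from scipy.stats import pearsonr

rectal = np.array([39.1, 39.4, 40.2, 40.6, 40.1, 39.8, 39.5, 39.3])
rumen  = np.array([39.0, 39.2, 40.1, 40.5, 40.0, 39.6, 39.4, 39.2])

mean_offset = np.mean(rectal - rumen)   # how much cooler the bolus reads on average
r, p_value = pearsonr(rumen, rectal)    # linear association between the two methods
print(f"mean offset = {mean_offset:.2f} C, r = {r:.2f} (P = {p_value:.3f})")
```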
The objective was to evaluate the effects of an extended withdrawal period after feeding the β-adrenergic agonist zilpaterol hydrochloride (ZH) for 20 d at the end of the feeding period. Three hundred eighty-four crossbred beef steers were blocked by BW and randomly allocated into 64 pens (6 steers/pen). Pens were assigned to treatments in a 2 × 4 factorial arrangement in a randomized complete block design. Main effects were the addition of 0 (control) or 8.3 mg/kg of ZH (DM basis) to the finishing diet for 20 d before the estimated average slaughter date and paired withdrawal periods of 3, 10, 17, or 24 d before slaughter. Individual BW were measured initially, 1 d before ZH feeding, and 1 d before slaughter. The ZH feeding period was initiated so that control cattle in the 3-d withdrawal group would be expected to average 65% USDA Choice quality grade and have 1.27 cm of 12th-rib fat based on visual appraisal. Carcass data were collected at slaughter. For the 3-d withdrawal steers, 2 steers from each pen were selected to determine visceral organ and total offal mass at slaughter. The ZH × withdrawal day interaction was not significant (P > 0.10) for the majority of variables. There was no difference (P ≥ 0.12) due to ZH feeding for final BW, carcass-adjusted final BW, or ADG. However, DMI was decreased (P = 0.02) and G:F increased (P = 0.01) in steers fed ZH vs. control steers. As days after withdrawal of ZH increased, there was a linear increase (P < 0.001) in final BW and carcass-adjusted final BW, but a linear decrease (P < 0.001) in ADG over the finishing period and over the ZH plus withdrawal period. Overall, HCW was 380 and 369 kg (P < 0.001) for ZH and control steers, respectively. However, the difference between ZH and control was 14, 17, 5, and 6 kg with 3, 10, 17, and 24 d withdrawal, respectively (ZH × withdrawal day, P = 0.09). Feeding ZH increased dressing percentage (65.8 vs. 64.6%; P < 0.001) and LM area (94.8 vs. 89.7 cm²; P < 0.001), and decreased calculated yield grade (2.69 vs. 2.91; P = 0.03) and the percentage of cattle grading USDA Choice (31.1 vs. 42.3%; P = 0.03) compared with controls. Small intestinal mass (g/kg of empty BW) was greater (P = 0.03) for steers fed ZH compared with controls. There were no other differences (P ≥ 0.11) in mass of body components, expressed in kilograms or as a fraction of empty BW. In this experiment, improvements in animal performance and HCW due to feeding ZH were generally maintained when withdrawal was extended through 10 d.
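Dressing percentage, reported above as 65.8 vs. 64.6%, is simply hot carcass weight expressed as a percentage of final live BW. The sketch below shows the calculation; the final live weights used here are back-calculated assumptions chosen only to reproduce the reported percentages from the reported HCW means, not values given in the study.

```python
# Sketch: dressing percentage = hot carcass weight / final live BW * 100.

def dressing_percentage(hcw_kg: float, final_bw_kg: float) -> float:
    return 100.0 * hcw_kg / final_bw_kg

# Reported HCW means paired with assumed (back-calculated) final live weights.
print(f"ZH:      {dressing_percentage(380.0, 577.5):.1f}%")  # ~65.8%
print(f"Control: {dressing_percentage(369.0, 571.2):.1f}%")  # ~64.6%
```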
Although breath analysis was successfully implemented in a research feedlot, arrival rumen temperature, eN₂O, eCO, and haptoglobin concentration did not accurately predict the occurrence of BRD during a preconditioning program. However, these biomarkers might support the diagnosis of BRD.