The subjective nature of self-reported dietary intake assessment methods presents numerous challenges to obtaining accurate estimates of dietary intake and nutritional status. This limitation can be overcome by the use of dietary biomarkers, which objectively assess dietary consumption (or exposure) without the bias of self-reported intake errors. The need for dietary biomarkers was addressed by the Institute of Medicine, which recognized the lack of nutritional biomarkers as a knowledge gap requiring future research. The purpose of this article is to review the existing literature on currently available dietary biomarkers, including novel biomarkers of specific foods and dietary components, and to assess the validity, reliability and sensitivity of the markers. This review revealed several biomarkers in need of additional validation research; research is also needed to produce sensitive, specific, cost-effective and noninvasive dietary biomarkers. The emerging field of metabolomics may help to advance the development of food/nutrient biomarkers, yet advances in food metabolome databases are needed. The availability of biomarkers that estimate intake of specific foods and dietary components could greatly enhance nutritional research targeting compliance with national recommendations as well as direct associations with disease outcomes. More research is necessary to refine existing biomarkers by accounting for confounding factors, to establish new indicators of specific food intake, and to develop techniques that are cost-effective, noninvasive, rapid and accurate measures of nutritional status.
Group housing and computerized feeding of preweaned dairy calves are gaining popularity among dairy producers worldwide, yet disease incidence and detection remain a challenge in these systems. The aim of this prospective observational cohort study was to describe the relationship between morbidity and feeding behavior around the period of illness detection. Calves were enrolled upon entrance to the group pen on 10 farms in Minnesota (n = 4) and Virginia (n = 6) utilizing group housing and computerized feeding from February until October 2014. Morbidity and mortality events were recorded by the calf caregiver. Farms were visited either every week (Minnesota) or every other week (Virginia) to collect calf enrollment data, feeding behavior data, and health records. Daily average feeding behaviors (drinking speed, mL/min; daily consumption, L/d; rewarded visits to the feeder; and unrewarded visits to the feeder) were described both overall and for sick and healthy calf days. Multivariable mixed models were built to assess the differences in daily average feeding behaviors (drinking speed, daily consumption, rewarded visits, unrewarded visits) between matched sick and healthy calves around the time of an illness event (-10 to 10 d). Final models controlled for calf age, region (Minnesota and Virginia), group size, disease diagnosis, the random effect of farm, and repeated measurements on calf. A stratified analysis was performed by both day from treatment event and disease diagnosis. We enrolled 1,052 calves representing 43,607 calf days over 9 mo. Of these, 176 sick calves had a matched control and were carried forward to the matched-pair analysis. Fifty-five percent of sick calves (97/176) were treated for diarrhea, 30% (53/176) for pneumonia, and 15% (26/176) for ill thrift. Sick calves drank 183 ± 27 mL/min (mean ± standard error) more slowly, drank 1.2 ± 0.6 L/d less, and had 3.1 ± 0.7 fewer unrewarded visits than control calves on the first day of treatment. These differences began up to 4 d before the calf was detected as sick and persisted for 7 to 10 d after treatment. However, changes in feeding behaviors varied by disease diagnosed. Rewarded visits were not associated with morbidity status. The results of this study indicate that sick calves change their feeding behavior before and during an illness event, suggesting that feeding behavior may be a useful tool to detect disease onset.
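As a rough illustration of the mixed-model approach described above, the sketch below fits one such model in Python with statsmodels: a fixed effect for morbidity status and the listed covariates, a random intercept for farm, and a calf-level variance component for the repeated measurements. This is a minimal sketch, not the authors' analysis; the file and column names (feeding_behavior.csv, drinking_speed, sick, day_rel, age, region, group_size, diagnosis, farm, calf) are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per calf per day relative to the
# treatment event (-10 to 10 d); column names are illustrative, not from the paper.
df = pd.read_csv("feeding_behavior.csv")

# Mixed model for one behavior (drinking speed, mL/min): fixed effects for
# morbidity status by day relative to treatment plus covariates; random
# intercept for farm; calf-level variance component for repeated measures.
model = smf.mixedlm(
    "drinking_speed ~ sick * C(day_rel) + age + C(region) + group_size + C(diagnosis)",
    data=df,
    groups="farm",                       # random effect of farm
    vc_formula={"calf": "0 + C(calf)"},  # repeated measurements on calf within farm
)
result = model.fit()
print(result.summary())
```

The sick-by-day interaction mirrors the stratified analysis by day from the treatment event: the coefficient for a given day estimates the sick-versus-healthy difference on that day.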
The influence of several environmental factors (e.g., light intensity, temperature, nitrogen, and phosphorus) on the population density and odor-compound production of two chrysophytes, Synura petersenii and Dinobryon cylindricum, and two cyanobacteria, Anabaena laxa and Phormidium calcicola, was investigated. The odors associated with each alga were evaluated by flavor profile analysis (FPA) at several intervals during their initial culturing in defined media. Algal cell and media extracts were analyzed individually by capillary gas chromatography-mass spectrometry (GC-MS). Both cyanobacteria produced geosmin (“earthy” and “corn” odors); however, P. calcicola also produced relatively large amounts of 2-methylisoborneol (MIB; “musty-earthy” odors). Both chrysophyte cultures contained 2t,4c,7c-decatrienal (“fishy” odor); in addition, 2t,6c-nonadienal (“cucumber” odor) was isolated from S. petersenii. Young cultures of Anabaena laxa (<20 days) retained most of the geosmin produced. Throughout its population growth, more than 80 percent of the MIB and geosmin produced by P. calcicola was detected in the media rather than in the cells. Synura petersenii produced more 2t,4c,7c-decatrienal than 2t,6c-nonadienal and retained nearly 90 percent of both compounds throughout the algal population growth. Dinobryon cylindricum produced 2t,4c,7c-decatrienal and, like S. petersenii, retained most of the compound. Greater production of the compounds by the two chrysophytes was apparently associated with log-phase growth rather than specific environmental conditions; extended log-phase growth (and prolonged production of the compounds) was observed in the S. petersenii culture during the low-temperature treatment.
Group housing and computerized feeding of preweaned dairy calves are gaining in popularity among dairy producers, yet disease detection remains a challenge for this management system. The aim of this study was to investigate the application of statistical process control charting techniques to daily average feeding behavior to predict and detect illness and to describe the diagnostic test characteristics of using this technique to find a sick calf compared with detection by calf personnel. This prospective cross-sectional study was conducted on 10 farms in Minnesota (n = 4) and Virginia (n = 6) utilizing group housing and computerized feeding from February until October 2014. Calves were enrolled upon entrance to the group pen. Calf personnel recorded morbidity and mortality events. Farms were visited either every week (MN) or every other week (VA) to collect calf enrollment data, computer-derived feeding behavior data, and calf personnel-recorded calf morbidity and mortality. Standardized self-starting cumulative sum (CUSUM) charts were generated for each calf for each daily average feeding behavior, including drinking speed (mL/min), milk consumption (L/d), and visits to the feeder without a milk meal (no.). A testing subset of 352 calves (176 treated, 176 healthy) was first used to find CUSUM chart parameters that provided the highest diagnostic test sensitivity and best signal timing, which were then applied to all calves (n = 1,052). Generalized estimating equations were used to estimate the diagnostic test characteristics of a single negative mean CUSUM chart signal to detect a sick calf for a single feeding behavior. Combinations of feeding behavior signals were also explored. Single signals and combinations of signals that included drinking speed provided the most sensitive and timely signal, finding a sick calf an average (±SE) of 3.1 ± 8.8 d before calf personnel. However, there was no clear advantage to using CUSUM charting over calf observation for any one feeding behavior or combination of feeding behaviors when predictive values were considered. The results of this study suggest that, for the feeding behaviors monitored, the use of CUSUM control charts does not provide sufficient sensitivity or predictive values to detect a sick calf in a timely manner compared with calf personnel. This approach to examining daily average feeding behaviors cannot take the place of careful daily observation.
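For readers unfamiliar with the technique, the sketch below shows one plausible lower-sided, self-starting standardized CUSUM on a single calf's daily feeding behavior, where a signal flags a sustained downward shift (e.g., slower drinking). It is a minimal sketch assuming a tabular CUSUM with reference value k and decision interval h; these parameters, the warmup length, and the example values are illustrative assumptions, not the chart settings selected in the study.

```python
import numpy as np

def lower_cusum_signals(values, k=0.5, h=4.0, warmup=5):
    """Lower-sided self-starting CUSUM for one calf's daily behavior.

    Each day's value is standardized against the running mean and standard
    deviation of that calf's own earlier days, then accumulated in a
    one-sided tabular CUSUM; the chart signals when the statistic exceeds h.
    Returns the day indices at which the chart signals.
    """
    s_lo = 0.0
    signals = []
    for t, x in enumerate(values):
        if t < warmup:
            continue  # need a few days to estimate the calf's own baseline
        mu = np.mean(values[:t])
        sigma = np.std(values[:t], ddof=1)
        if sigma == 0:
            continue
        z = (x - mu) / sigma
        s_lo = max(0.0, s_lo - z - k)  # accumulate evidence of a downward shift
        if s_lo > h:
            signals.append(t)
            s_lo = 0.0  # reset after a signal
    return signals

# Example: drinking speed (mL/min) drops around day 7
speed = [820, 790, 840, 810, 800, 830, 805, 650, 600, 580, 610]
print(lower_cusum_signals(speed))  # -> [7, 9]
```

In practice, one chart per behavior per calf would run in parallel, and combinations of signals (e.g., drinking speed plus visits without a milk meal) could be required before flagging a calf.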
Six algal metabolites, at concentrations of 20-225 μg/L, were oxidized with potassium permanganate, chlorine, or chlorine dioxide at doses of 0.25-3 mg/L. Flavor profile analysis (FPA) was used to determine the odors of the solutions before and after oxidation. Linoleic and palmitic acids, which are odorless compounds, were oxidized to odorous products by all three oxidants. The odor intensity of β-cyclocitral (grape, sweet tobacco) and phenethyl alcohol (rose, floral) was only slightly decreased by any of the oxidants. Oxidation by permanganate or chlorine either eliminated or greatly reduced the odors associated with linolenic acid (watermelon) and 2t,6c-nonadienal (cucumber); chlorine dioxide was ineffective at reducing the cucumber odor of 2t,6c-nonadienal. Oxidation, at doses typically applied in drinking water treatment, can result in the destruction of certain algae-related odors but also in the formation of other odors.