The Seveso Directive of the European Union requires companies and authorities to inform the public about the facts, risks, and appropriate behaviors associated with hazardous facilities, in particular chemical facilities. On behalf of the Commission of the European Communities, a survey on the credibility of various information sources was conducted in five European countries. This article describes the results of the German study. A total of 430 persons were interviewed with a 50-item questionnaire covering, in particular, their perceptions and evaluations of technical risks, the credibility of sources of information about chemical risks, their preferences for receiving risk information from these sources, and their interest in receiving information. The major findings are large differences in credibility, differentiated information preferences, and strong interest in information. Surprisingly, credibility played only a minor role in the respondents' information preferences and interests.
Objectives To explore agreement among healthcare professionals assessing eligibility for work disability benefits.
Design Systematic review and narrative synthesis of reproducibility studies.
Data sources Medline, Embase, and PsycINFO searched up to 16 March 2016, without language restrictions, and review of bibliographies of included studies.
Eligibility criteria Observational studies investigating reproducibility among healthcare professionals performing disability evaluations using a global rating of working capacity and reporting inter-rater reliability by a statistical measure or descriptively. Studies could be conducted in insurance settings, where decisions on ability to work include normative judgments based on legal considerations, or in research settings, where decisions on ability to work disregard normative considerations. Teams of paired reviewers identified eligible studies, appraised their methodological quality and generalisability, and abstracted results with pretested forms. As heterogeneity of research designs and findings impeded a quantitative analysis, a descriptive synthesis stratified by setting (insurance or research) was performed.
Results From 4562 references, 101 full text articles were reviewed. Of these, 16 studies conducted in an insurance setting and seven in a research setting, performed in 12 countries, met the inclusion criteria. Studies in the insurance setting involved medical experts assessing real disability claimants, claimants played by actors, hypothetical cases, or short written scenarios. Conditions were mental (n=6, 38%), musculoskeletal (n=4, 25%), or mixed (n=6, 38%). Applicability of findings from studies conducted in an insurance setting to real life evaluations ranged from generalisable (n=7, 44%) and probably generalisable (n=3, 19%) to probably not generalisable (n=6, 37%). Median inter-rater reliability among experts was 0.45 (range: intraclass correlation coefficient 0.86 to κ −0.10). Inter-rater reliability was poor in six studies (37%) and excellent in only two (13%). This contrasts with studies conducted in the research setting, where the median inter-rater reliability was 0.76 (range 0.91 to 0.53) and 71% (5/7) of studies achieved excellent inter-rater reliability. Reliability between assessing professionals was higher when the evaluation was guided by a standardised instrument (23 studies, P=0.006). No such association was detected for subjective or chronic health conditions or the studies' generalisability to real world evaluation of disability (P=0.46, 0.45, and 0.65, respectively).
Conclusions Despite their common use and far reaching consequences for workers claiming disabling injury or illness, research on the reliability of medical evaluations of disability for work is limited and indicates high variation in judgments among assessing professionals. Standardising the evaluation process could improve reliability. Development and testing of instruments and structured approaches to improve reliability in the evaluation of disability are urgently needed.
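As a point of reference for the agreement statistics reported above, the following minimal Python sketch shows how an inter-rater agreement measure such as Cohen's kappa is computed from paired judgments. The ratings are hypothetical "fit"/"unfit" decisions by two assessors and are not data from the review; kappa of 1.0 indicates perfect agreement and 0 indicates chance-level agreement.

# Illustrative sketch (hypothetical data): Cohen's kappa for two raters.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same cases (nominal categories)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of cases on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal category proportions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical disability judgments for ten claimants.
rater_1 = ["fit", "unfit", "unfit", "fit", "fit", "unfit", "fit", "unfit", "fit", "fit"]
rater_2 = ["fit", "unfit", "fit", "fit", "unfit", "unfit", "fit", "unfit", "fit", "fit"]
print(f"kappa = {cohen_kappa(rater_1, rater_2):.2f}")  # 0.58 for this example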
This work investigates the nature of two distinct response patterns in a probabilistic truth table evaluation task, in which people estimate the probability of a conditional on the basis of frequencies of the truth table cases. The conditional-probability pattern reflects an interpretation of conditionals as expressing a conditional probability. The conjunctive pattern suggests that some people treat conditionals as conjunctions, in line with a prediction of the mental-model theory. Experiments 1 and 2 rule out two alternative explanations of the conjunctive pattern. It does not arise from people believing that at least one case matching the conjunction of antecedent and consequent must exist for a conditional to be true, and it does not arise from people adding the converse to the given conditional. Experiment 3 establishes that people's response patterns in the probabilistic truth table task are very consistent across different conditionals, and that the two response patterns generalize to conditionals with negated antecedents and consequents. Individual differences in rating the probability of a conditional were loosely correlated with corresponding response patterns in a classical truth table evaluation task, but there was little association with people's evaluation of deductive inferences from conditionals as premises. A theoretical framework is proposed that integrates elements from the conditional-probability view with the theory of mental models.
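To make the two response patterns concrete, the short Python sketch below computes the ratings each pattern predicts from hypothetical truth-table frequencies for a conditional "if p then q" (the frequencies and function name are illustrative, not taken from the experiments): the conditional-probability pattern rates the conditional as P(q|p), using only the p-cases, whereas the conjunctive pattern rates it as P(p and q), using all cases.

# Illustrative sketch (hypothetical frequencies): predicted ratings for "if p then q".
def predicted_ratings(f_pq, f_p_not_q, f_not_p_q, f_not_p_not_q):
    total = f_pq + f_p_not_q + f_not_p_q + f_not_p_not_q
    # Conditional-probability pattern: P(q | p), i.e. only cases where p holds matter.
    conditional_probability = f_pq / (f_pq + f_p_not_q)
    # Conjunctive pattern: P(p and q), i.e. pq-cases relative to all cases.
    conjunction = f_pq / total
    return conditional_probability, conjunction

# Example: 12 pq cases, 4 p-not-q cases, 10 not-p-q cases, 14 not-p-not-q cases.
cp, conj = predicted_ratings(12, 4, 10, 14)
print(f"conditional-probability pattern: {cp:.2f}")   # 0.75
print(f"conjunctive pattern:             {conj:.2f}")  # 0.30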
Printed on acid-free and chlorine-free bleached paper. Planning: Marion Krämer. The rendering of some formulas and structural elements was incorrect in some electronic editions; this has now been corrected. We apologize for any inconvenience this caused and thank our readers for their feedback. Preface to the 4th edition.