This book develops and defends a new externalist, reliabilist theory of knowledge and evidence, and a new view about scientific realism. Knowledge is analyzed by a tracking theory formulated in terms of conditional probability rather than counterfactuals, with closure under known implication imposed as a further condition. The tracking theory of evidence, it is argued, is best formulated and defended as a confirmation theory based on the Likelihood Ratio. Together, these tracking theories of knowledge and evidence provide a deep explanation of why having better evidence makes one more likely to know. The new tracking theory of knowledge is argued to be superior to all currently known externalist rivals: it provides a distinctive explanation of why knowledge is more valuable than mere true belief, and explains why knowledge is power in the Baconian sense. Finally, the book argues that confirmation theory is relevant to debates about scientific realism, and defends a position intermediate between realism and anti-realism based on a view about what having evidence requires.
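The Likelihood Ratio measure of confirmation mentioned above has a standard formulation in the confirmation-theory literature. As a sketch in standard notation (not a quotation from the book): evidence e confirms hypothesis h just in case

```latex
\mathrm{LR}(h, e) \;=\; \frac{P(e \mid h)}{P(e \mid \lnot h)} \;>\; 1
```

and the larger the ratio, the more strongly e discriminates h from its negation; LR = 1 leaves h unconfirmed, and LR < 1 disconfirms it.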
I argue that pessimistic inductions over the history of science have not made the case that the failures of our predecessors give us reason to dial down our confidence in our scientific theories. An effective pessimistic induction must do more than meets the eye: it must show (1) that the supposed unreliability of our predecessors is relevant to what we should think about our own reliability, and (2) that if we believe we lack reliability (a second-order property), then we are rationally obligated to withdraw confidence in our first-order beliefs. I assist the pessimist by providing an account of fallibility that explains why the second condition holds. However, I show that the fact that we use different methods than our predecessors did breaks the pessimistic induction from their unreliability to the conclusion that we are unreliable. To threaten our rational confidence in our particular theories, the pessimist needs much more than he has offered.
I develop a general framework with a rationality constraint that shows how to coherently represent and deal with second-order information about one's own judgmental reliability. The framework rejects and generalizes away from the typical Bayesian requirements of unconditional judgmental self-respect and perfect knowledge of one's own beliefs, and is defended by appeal to the Principal Principle. It yields consequences about maintaining unity of the self, about symmetries and asymmetries between the first and third person, and a principled way of knowing when to stop second-guessing oneself. Peer disagreement is treated as a special case in which one doubts oneself because of the news that an intellectual equal disagrees. This framework, and variants of it, imply that the typically stated belief that an equally reliable peer disagrees is incoherent, and thus that pure rationality constraints, without further substantive information, cannot answer the question of what to do. The framework also shows that treating both ourselves and others as thermometers in the disagreement situation does not imply the Equal Weight view.
There is a widespread view that in order to be rational we must mostly know what we believe. In the probabilistic tradition this is defended by arguments that a person who failed to have this knowledge would be vulnerable to sure loss, or probabilistically incoherent. I argue that even gross failure to know one's own beliefs need not expose one to sure loss, and does not if we follow a generalization of the standard bridge principle between first-order and second-order beliefs. This makes it possible for a subject to use probabilistic decision theory to manage in a rational way cases of potential failure of this self-knowledge, as we find in implicit bias. Through such cases I argue that it is possible for uncertainty about what our beliefs are to be not only rationally permissible but advantageous.

Must we have more or less accurate beliefs about our beliefs? Many otherwise diverse thinkers have taken this to be a requirement for rational beings. In the tradition that defines rationality by means of the axioms of probability, the reason for this view is arguments to the effect that a subject who either failed to be certain that he had a degree of belief he did have, or failed to have a degree of belief he was certain he had, would be vulnerable to sure loss. That is, there is a set of bets that such a subject would accept as fair and that would give him a loss no matter how the events he bet on turned out. This kind of vulnerability is what the word "incoherence" will refer to here, and it is, according to this tradition, what rationality protects us from. (Usage of the word varies between vulnerability to sure loss and violation of the axioms, two concepts that are largely extensionally equivalent but that can come apart depending on how sure loss is defined. One can violate the axioms and, arguably, not have the relevant vulnerability (Hacking 1967); here the issue will be whether, having not violated the axioms, one can be vulnerable to sure loss by not having one's higher- and lower-order beliefs in sync.) I will argue that, contrary to two entrenched arguments for this view, sure loss does not follow from failure to have accurate beliefs about our own beliefs. Mistaken belief about one's own belief is a failure, but it is a lack of knowledge and not a failure of rationality in the sense expressed by the probability axioms. If a rational being needs reasonably good knowledge of her own beliefs, we will need more than the constraint of probabilistic coherence to explain why.
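The "sure loss" at issue can be made concrete with the classic Dutch book construction. The sketch below is illustrative only (it is the textbook betting argument, not the paper's own argument): an agent whose credences in A and not-A sum to more than 1 will accept as fair a pair of bets that together lose money however A turns out.

```python
# Toy Dutch book: an agent with probabilistically incoherent credences
# (here P(A) = 0.6 and P(not-A) = 0.6, summing to 1.2) accepts two bets,
# each priced as "fair" by her own credences, and loses either way.

def bet_payoff(stake: float, credence: float, wins: bool) -> float:
    """Net payoff of a bet bought at its 'fair' price (credence * stake):
    the bettor pays the price up front and collects the stake if she wins."""
    price = credence * stake
    return (stake - price) if wins else -price

def dutch_book_outcomes(credence_a: float, credence_not_a: float,
                        stake: float = 1.0) -> tuple[float, float]:
    """Total payoff of betting on both A and not-A, for each way A could go."""
    if_a_true = (bet_payoff(stake, credence_a, True)
                 + bet_payoff(stake, credence_not_a, False))
    if_a_false = (bet_payoff(stake, credence_a, False)
                  + bet_payoff(stake, credence_not_a, True))
    return if_a_true, if_a_false

# Incoherent credences: a guaranteed loss of 0.2 whichever way A turns out.
print(dutch_book_outcomes(0.6, 0.6))
# Coherent credences (0.6 and 0.4): the same pair of bets breaks exactly even.
print(dutch_book_outcomes(0.6, 0.4))
```

The paper's question is whether an analogous book can be made against an agent whose first-order credences are coherent but who is mistaken about what those credences are; the abstract argues it cannot.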