Suppose an author exercises due diligence in the researching and writing of a nonfiction book. He has good reason to believe that all of the statements he makes in the book are true. But he is also well aware of human fallibility and that he is no exception to the rule. And so, aware of his fallibility, he has good reason to believe that it is not the case that all of the statements he makes in the book are true. He makes mention of this in the book's preface. Hence 'paradox of the preface.' Thus:
1. It is rational for the author to believe that each statement in his book is true. (Because he has exercised due diligence.)
2. It is rational for the author to believe that some statement in his book is not true. (Because to err is human.)
Therefore
3. It is rational for the author to believe that (each statement in his book is true & some statement in his book is not true).
Therefore
4. There are cases in which it is rational for a person to believe statements of the form (p & ~p).
"What the paradox shows is that we need to give up the claim that it is always irrational to believe statements that are mutually inconsistent." (Michael Clark, Paradoxes From A to Z, Routledge 2002, p. 144)
Is that what the paradox shows? I doubt it. The paradox cannot arise unless the following schema is valid:
a. It is rational for S to believe that p.
b. It is rational for S to believe that ~p.
Ergo
c. It is rational for S to believe that (p & ~p).
It is not clear that the schema is valid. Rational believability, unlike truth, is a relative property. What it is rational to believe is relative to background knowledge among other things. Relative to the author's knowledge that he exercised due diligence in the researching and writing of his book, it is rational for him to believe that every statement in the book is true. But relative to considerations of human fallibility, it is rational for him to believe that it is not the case that every statement in his book is true. So what (a) and (b) above really amount to is the following where 'BK' abbreviates 'background knowledge':
a*. It is rational for S to believe relative to BK1 that p.
b*. It is rational for S to believe relative to BK2 that ~p.
From these two premises one cannot arrive at the desired conclusion. So my solution to the paradox is to reject the inference from (1) and (2) to (3).
"But doesn't the author's background knowledge (BK) include both the truth that he exercised due diligence and the truth that human beings are fallible?" Well suppose it does. Then how could it be rational for him to believe that every statement in the book is true? It is rational for him to believe that every statement is true only if he leaves out of consideration that people are fallible. Relative to his total background knowledge, it is not rational for him to believe that every statement in his book is true.
In this way I avoid Clark's draconian conclusion that it is sometimes rational to believe statements that are mutually inconsistent.
There are two separate questions here. The first is whether this example shows that there exists a proposition p such that the following are both true:
a. It is rational for S to believe that p.
b. It is rational for S to believe that ~p.
In what follows, Rp denotes *the writer is rationally justified in believing that p*. The second question is whether the following is true:
(*) Rp & Rq ⇒ R(p&q)
which, given a proposition p as in (a)-(b), would imply that it is rational to believe some necessarily false proposition.
I think Clark fails to provide a proposition p as in (a)-(b), and his argument rests on a confusion concerning quantifiers. The following are true:
1. For every proposition p expressed in the book, the writer is rationally justified in believing that p.
2. The writer is rationally justified in believing that there exists a proposition p expressed in the book such that p is false.
This pair is not inconsistent. To form a pair of the type (a)-(b), (1) would have to be replaced with
1'. The writer is rationally justified in believing that every proposition p expressed in the book is true.
(1'), however, is simply not true. The argument given for it is in fact an argument for (1), with which (1') is being confused. The difference between (1) and (1') is clearly seen in symbolic notation:
1. ∀p R(p)
2. R(∃p ¬p)
and
1'. R(∀p p)
To put the point in a somewhat pompous way: the operator R does not commute with the universal quantifier. The example does, however, constitute a counterexample to (*). (1) is equivalent to
1*. Rp & ... & Rq (p, ..., q being the propositions from the book).
By (*) (and induction) we have
2*. R(p & ... & q)
(2*) applies R to the negation of the very proposition to which (2) applies it, so (*) hands us a pair of exactly the type (a)-(b). Therefore, an aporetic polyad cannot be formed unless (*) is given strong independent support, which I don't think can be done. This may be to the chagrin of the proprietor of the blog, who as we know loves them aporetic polyads.
In Bayesian terms, (*) says that an intersection of big sets is itself big (whatever 'big' exactly means). This need not be true under most reasonable understandings of 'big' (and on not especially pathological underlying spaces).
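To make that concrete, here is a minimal numerical sketch under one Lockean reading of R, on which 'Rp' is taken to mean 'the probability of p is at least some threshold t'. The threshold, the number of propositions, the individual probabilities, and the independence assumption are all illustrative choices of mine, not anything in Clark or in the post:

    # Toy Lockean model (illustrative assumptions throughout):
    # "Rp" is read as "the probability of p is at least the threshold t".
    t = 0.95        # belief threshold (assumed)
    n = 300         # number of propositions in the book (assumed)
    p_each = 0.99   # probability of each individual proposition (assumed)

    r_each_conjunct = p_each >= t    # Rp holds of every single proposition
    p_conjunction = p_each ** n      # probability of the conjunction, assuming independence
    r_conjunction = p_conjunction >= t

    print(r_each_conjunct)           # True
    print(round(p_conjunction, 4))   # about 0.049
    print(r_conjunction)             # False: R does not carry over to the conjunction

Each conjunct is 'big' in the threshold sense while their intersection is not, which is all a counterexample to (*) needs.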
Posted by: Jan | Thursday, January 19, 2012 at 10:34 AM
I'm not a philosopher, I'm an economist, so you may have to excuse any abuse of nomenclature or other mistakes, but it seems to me that this example is easily resolved by introducing a very basic form of uncertainty. That is, I believe that every proposition is true with near-certainty. Say a proposition q has truth value 1 if true and 0 if false, and I believe that each q = 1 with probability p = .99. So the "expected truth value" of any given proposition q is 1 * .99 = .99. Thus I believe each proposition is true with something very close to certainty, but if I amass a collection of hundreds of such propositions, then it is also nearly certain that at least one of them is false (e.g., a collection of 500 such statements contains at least one false statement with probability around 99.3%).
In this case, then, applying the same concept to both statements, it is rational for me to believe with near-certainty that each statement, considered individually, is true, and that at least one statement in the book is false.
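For what it's worth, the 99.3% figure checks out under the natural (if unstated) assumption that the statements are false independently with probability .01 each:

    # Sanity check of the 99.3% figure, assuming independent errors
    # with probability 0.01 per statement (independence is an added assumption).
    p_true = 0.99
    n = 500
    p_at_least_one_false = 1 - p_true ** n
    print(round(p_at_least_one_false, 4))   # about 0.9934, i.e. roughly 99.3%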
Posted by: Matthew | Monday, January 23, 2012 at 11:29 AM