I was very interested in Jon’s post on “The Truth Connection” and all the comments that it inspired. I’ll try to post something on that soon. But I noticed that in his comments, Jim was making a lot of use of the idea of “degrees of justification”. So I want to ask a question about this idea of “degrees of justification”.
I take it that it’s obvious that we have more justification for some beliefs than we have for others. E.g. I have more justification for believing that St Andrews is in Scotland than I have for believing that Dushanbe is in Tajikistan. It also seems pretty plausible that the more justification I have for believing a proposition, the more confidently I should believe that proposition. (I do believe that Dushanbe is in Tajikistan, but rather less confidently than that St Andrews is in Scotland.)
So propositions can be ranked in terms of how much justification a thinker has for believing them at a particular time. But what are the properties of this ranking? E.g. is it a weak ordering or merely a partial ordering? (Is there always a definite answer to the question of whether or not I have more justification for believing p than for believing q?)
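To make that contrast precise, here is one standard formalization (my own gloss, with ⪰ as a hypothetical symbol for “has at least as much justification for”):

```latex
% Write p \succeq q for: the thinker has at least as much
% justification for believing p as for believing q.
\text{Transitivity (both orderings): } (p \succeq q) \wedge (q \succeq r) \to (p \succeq r)
\text{Totality (weak ordering only): } \forall p, q:\; (p \succeq q) \vee (q \succeq p)
\text{A mere partial ordering permits incomparable pairs: } \neg(p \succeq q) \wedge \neg(q \succeq p)
```

On a weak ordering, the question “do I have more justification for p than for q?” always has a definite answer; on a mere partial ordering, some pairs of propositions may simply be incomparable.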
It might seem tempting to say that it must be possible to represent this ranking by means of a probability function. (Obviously, there usually won’t be a unique probability function that can represent this ranking; there will usually be a very large class of such probability functions.) But besides ensuring that the justification ranking is at least a weak ordering and not a mere partial ordering, this requirement also leads to an analogue of the idea of “logical omniscience”. If the justification ranking is representable by a probability function, then if both p and q are logical truths, it will be impossible for any thinker to have more justification for p than for q; in general, all logical truths will be maximally justified — no proposition can be more justified than a logical truth is.
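The last step can be spelled out from the standard probability axioms. This is my own reconstruction of the argument, where Pr is any probability function representing the ranking, so that Pr(p) ≥ Pr(q) just in case p is at least as justified as q:

```latex
\text{1. If } \vDash p \text{ (p is a logical truth), then } \Pr(p) = 1 \quad \text{(normalization axiom).}
\text{2. For every proposition } q,\; 0 \le \Pr(q) \le 1.
\text{3. So for any logical truths } p, p' \text{ and any proposition } q:\; \Pr(p) = \Pr(p') = 1 \ge \Pr(q).
```

Hence, on any representing probability function, no proposition outranks a logical truth, and no logical truth outranks another — which is exactly the analogue of logical omniscience at issue.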
Intuitively, I’m inclined to think that it must be possible to have more justification for some logical truths than for others. But I don’t know whether I have good reasons for thinking this. The idea that all logical truths are maximally justified doesn’t conflict with internalism. Nor does it obviously conflict with the reasons that incline me to accept internalism in the first place — viz. the idea that the facts about justification must be capable of “directly” guiding us in forming and revising our beliefs. I don’t want to understand this “direct guidance” condition in a strongly person-relative way. (Even if you are wired up so that you have a sudden attack of narcolepsy whenever you start to consider a certain proposition, you could still have justification for believing that proposition, I think.)
So perhaps I should accept that the justification ranking can be represented by a probability function, and stop worrying about the fact that it would make all logical truths maximally justified?