Just got back from a very fun Central Meeting at the fantastic Palmer House. In the session on evidence, Tom talked about the following fact (apparently well-confirmed empirically):
You and I disagree about some claim (his example was about whether capital punishment is a deterrent), but neither of us is strongly certain of our beliefs. We are then presented with mixed evidence: studies that are balanced between those that suggest I’m right and those that suggest you’re right. The result is that our beliefs become polarized: I become more confident that I’m right and you that you are right.
Tom defended a view on this phenomenon, but rather than post what his view is, it might be fun to think about the case itself first. Armchair psychological theorizing about why this would happen is of course welcome, but the real question for epistemologists is whether such polarization is, or can be, rational (or justified).