A series of recent experimental studies by James Beebe, Wesley Buckwalter and others have shown that people’s moral judgments can actually impact their intuitions about knowledge. In a striking new paper, Buckwalter reports three experiments showing that this effect can even arise for Gettier cases.
For example, in one of his studies, all participants were told about a mayor who decides to sign a contract with a local corporation. But participants were randomly assigned to one of two conditions. In one, participants were told that the mayor believes (on good evidence) that the contract will create jobs and thereby help the community; in the other, they were told that he believes the contract will cut jobs and thereby harm the community. Either way, they then received some surprising information about what happened to the contract:
The corporation, however, didn’t take any chances. They secretly switched the contract with a totally different one right before the mayor signed it. By changing all the fine print, in some cases the opposite of what the mayor thought he was signing, the corporation could be sure it got what it wanted. Sure enough, shortly after the mayor signed the contract, a number of members of the community [got/lost] jobs, and the mayor received a huge donation to his reelection campaign.
Participants were then asked whether they agreed or disagreed with the sentence:
The mayor knew that by signing the contract he would [create/cut] jobs.
Strikingly, the responses differed radically between conditions. When participants were told that the mayor's action created jobs, they showed the expected reluctance to say that he had knowledge… but when they were told that the action cut jobs, they were perfectly happy to say that he knew exactly what he was doing!
I find these results extremely surprising and interesting, and I would love to hear any thoughts people might have about how they might be explained.