Had an interesting exchange with Eliezer Yudkowsky earlier that led me to this paper. I still haven't finished reading it, but I just had to post this passage (which, by the way, summarizes a mathematical result rather than just stating an opinion):
In the vast majority of disputed topics, the available evidence does not pin down with absolute certainty what we should believe. A consequence of this is that if there are no constraints on which priors are rational, there are almost no constraints on which beliefs are rational. People who think that some beliefs are irrational are thus forced to impose constraints on what priors are rational.
This ought to be required reading for any atheist who looks down their nose at a religious person.
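To see how little work the evidence alone can do here, consider a minimal sketch (my own numbers, not the paper's): two Bayesians apply the exact same update rule to the exact same evidence, and their posteriors stay far apart simply because their priors differ.

```python
# Same evidence, same update rule, different priors -> different
# rational posteriors. All numbers below are illustrative assumptions.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H | E)."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# Both agents agree the evidence is three times likelier
# if the hypothesis is true than if it is false.
likelihood_true, likelihood_false = 0.6, 0.2

for prior in (0.01, 0.5, 0.99):
    p = posterior(prior, likelihood_true, likelihood_false)
    print(f"prior={prior:.2f} -> posterior={p:.3f}")
# prior=0.01 -> posterior=0.029
# prior=0.50 -> posterior=0.750
# prior=0.99 -> posterior=0.997
```

Everyone in this little example is a perfectly calibrated Bayesian; unless you can say which prior was irrational to start with, you can't say which posterior is irrational either.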
I also have the feeling that it might be possible to construct a formal mathematical defense of withholding information in an argument, even if both parties to the dispute are rational. The intuition goes something like this: part of a rational agent's worldview is a Bayesian prior about how much weight to lend to an opinion espoused by someone else. When one party in a dispute makes a statement, the other party updates not only their estimates regarding the subject matter of the statement, but also their estimates of the speaker's reliability and rationality. For example, if someone says to me, "Men never walked on the moon. It was all a conspiracy," I am less likely to be persuaded by other things they say. A rational person can fold this phenomenon into their own reasoning and conclude that they are more likely to persuade someone that A is true if they do NOT say that B is true, even though they believe that B is true (and even if B *is* in fact true).
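Here's a toy version of that argument (my own construction; the model and all the numbers are assumptions, not anything from the paper). The listener is unsure whether the speaker is reliable, and updates jointly on the truth of A and on the speaker's type. Asserting B, which the listener considers very improbable, drags down the estimated reliability of the speaker, and with it the persuasive force of the assertion of A:

```python
# Toy model: the listener updates jointly on the truth of A and on
# whether the speaker is "reliable" (mostly asserts truths) or
# "unreliable" (asserts things at random). All numbers are assumptions.

P_RELIABLE = 0.7   # listener's prior that the speaker is reliable
P_A = 0.30         # listener's prior that A is true
P_B = 0.02         # listener's prior that B (e.g. "the moon landing was faked") is true

def p_assert(reliable, prop_true):
    """P(speaker asserts X | speaker type, truth of X) -- assumed values."""
    if reliable:
        return 0.9 if prop_true else 0.1   # reliable speakers track the truth
    return 0.5                             # unreliable speakers assert at random

def p_a_given_assert_a():
    """P(A | speaker asserts A), marginalizing over speaker type."""
    num = den = 0.0
    for reliable, p_t in ((True, P_RELIABLE), (False, 1 - P_RELIABLE)):
        for a_true, p_a in ((True, P_A), (False, 1 - P_A)):
            w = p_t * p_a * p_assert(reliable, a_true)
            den += w
            if a_true:
                num += w
    return num / den

def p_a_given_assert_a_and_b():
    """P(A | speaker asserts both A and B), marginalizing over type and B."""
    num = den = 0.0
    for reliable, p_t in ((True, P_RELIABLE), (False, 1 - P_RELIABLE)):
        for a_true, p_a in ((True, P_A), (False, 1 - P_A)):
            for b_true, p_b in ((True, P_B), (False, 1 - P_B)):
                w = (p_t * p_a * p_b
                     * p_assert(reliable, a_true)
                     * p_assert(reliable, b_true))
                den += w
                if a_true:
                    num += w
    return num / den

print(f"P(A | asserts A)       = {p_a_given_assert_a():.3f}")        # ~0.603
print(f"P(A | asserts A and B) = {p_a_given_assert_a_and_b():.3f}")  # ~0.433
```

In this toy setup, asserting A alone moves the listener to about 60% confidence in A, while also asserting B knocks that back down to about 43%, even though the speaker asserted only things they sincerely believe. So a rational speaker who anticipates this update has a reason to keep B to themselves.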
UPDATE: another gem from the paper:
These common criticisms suggest that most people implicitly uphold rationality standards that disapprove of self-favoring priors, such as priors that violate indexical independence. These criticisms also suggest that people in fact tend to form beliefs as if they had such priors. That is, people do seem to think they can reason substantially better than others, in the absence of much evidence favoring this conclusion. People thus seem to violate the rationality standards they uphold. And as we have mentioned, such tendencies seem capable of explaining a great deal of human disagreement.