Had an interesting exchange with Eliezer Yudkowsky earlier that led me to this paper. I still haven't finished reading it, but I just had to post this passage (which, by the way, is summarizing a mathematical result, not just stating an opinion):
In the vast majority of disputed topics, the available evidence does not pin down with absolute certainty what we should believe. A consequence of this is that if there are no constraints on which priors are rational, there are almost no constraints on which beliefs are rational. People who think that some beliefs are irrational are thus forced to impose constraints on what priors are rational.
This ought to be required reading for any atheist who looks down their nose at a religious person.
I also have the feeling that it might be possible to construct a formal mathematical defense of withholding information in an argument even if both parties to the dispute are rational. The intuition goes something like this: part of a rational agent's worldview is a Bayesian prior about how much weight one should lend to an opinion espoused by someone else. When one party in a dispute makes a statement, the other party updates not only their estimates regarding the subject matter of the statement, but also their estimates of the reliability and rationality of the speaker. For example, if someone says to me, "Men never walked on the moon. It was all a conspiracy," I am less likely to be persuaded by other things they say. A rational person can map this phenomenon into their own reasoning and conclude that they are more likely to persuade someone that A is true if they do NOT say that B is true, even though they believe that B is in fact true (and even if B *is* in fact true).
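The intuition can be made concrete with a toy model (all the numbers and the two-type speaker model are my own illustrative assumptions, not anything from the paper): suppose the listener thinks a speaker is either "reliable" (asserts truths 90% of the time) or "unreliable" (50%), and updates both their trust in the speaker and their belief in a claim by Bayes' rule.

```python
# Toy model: asserting an implausible claim B first lowers the
# listener's trust in the speaker, which blunts the persuasive
# force of a later assertion of A. All parameters are made up
# for illustration.

def update_speaker_trust(p_reliable, p_claim_true,
                         acc_reliable=0.9, acc_unreliable=0.5):
    """Posterior P(speaker reliable) after the speaker asserts a claim
    to which the listener assigns probability p_claim_true."""
    # Likelihood of the assertion under each speaker type: a speaker
    # with accuracy a asserts the claim with probability
    # a * P(claim true) + (1 - a) * P(claim false).
    like_rel = acc_reliable * p_claim_true + (1 - acc_reliable) * (1 - p_claim_true)
    like_unrel = acc_unreliable * p_claim_true + (1 - acc_unreliable) * (1 - p_claim_true)
    num = like_rel * p_reliable
    return num / (num + like_unrel * (1 - p_reliable))

def update_belief(p_a, p_reliable, acc_reliable=0.9, acc_unreliable=0.5):
    """Posterior P(A) after the speaker asserts A, weighting the
    speaker's accuracy by the listener's current trust."""
    acc = p_reliable * acc_reliable + (1 - p_reliable) * acc_unreliable
    num = acc * p_a
    return num / (num + (1 - acc) * (1 - p_a))

trust = 0.8                                  # prior trust in the speaker
# Speaker asserts B, which the listener thinks is almost surely false
# (e.g. "the moon landing was faked"): trust drops.
trust_after_b = update_speaker_trust(trust, p_claim_true=0.05)

# Now the speaker asserts A, on which the listener is agnostic (0.5).
p_a_direct = update_belief(0.5, trust)        # A asserted alone
p_a_after_b = update_belief(0.5, trust_after_b)  # A asserted after B
```

Running this gives `p_a_direct` ≈ 0.82 but `p_a_after_b` ≈ 0.71: asserting B first costs the speaker persuasive power on A, even if B happens to be true, because the listener can only judge B by their own prior.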
UPDATE: another gem from the paper:
These common criticisms suggest that most people implicitly uphold rationality standards that disapprove of self-favoring priors, such as priors that violate indexical independence. These criticisms also suggest that people in fact tend to form beliefs as if they had such priors. That is, people do seem to think they can reason substantially better than others, in the absence of much evidence favoring this conclusion. People thus seem to violate the rationality standards they uphold. And as we have mentioned, such tendencies seem capable of explaining a great deal of human disagreement.
Ron wrote: they are more likely to persuade someone that A is true if they do NOT say that B is true even though they believe that B is in fact true (and even if B *is* in fact true).
Yes, yes, fine. I agree that total honesty is not the most direct route to effective persuasion.
But it's a slippery slope. Surely you've heard of the infamous smear campaign that Bush waged against McCain during the 2000 primaries. Using "push polling", they asked voters if they would be "more or less likely to vote for McCain if they knew he had fathered an illegitimate child who was black."
They never made a direct assertion. McCain did not, in fact, have any such child. But McCain did have a (dark-skinned) adopted daughter from India.
This was a smear that the Bush campaign did not themselves believe. Nor did they directly lie. But they were very careful to craft language that left a strong (and deliberate) negative impression of McCain. And it was extremely effective at persuading those voters at that time to vote for Bush instead of McCain.
If you are willing to abandon truth (even indirect) for effectiveness, then do you see anything wrong with Bush's actions in 2000 against McCain?
I, for one, don't think the end justifies the means.
> If you are willing to abandon truth (even indirect) for effectiveness, do you see anything wrong with Bush's actions in 2000 against McCain?
I lament the fact that it was effective more than I condemn the Bush campaign for engaging in the tactic. If people want to fall for a cheap trick who am I to deny them the opportunity?
But I think there's a significant difference between abandoning the truth and postponing it. In the passage with which you take issue I did not say anything that was untrue. I merely left out some truths that would in my judgement have undermined the point I was trying to make. I'm quite comfortable with that from an ethical point of view.
> I, for one, don't think the end justifies the means.
Imagine you are an astronomer. One day you discover a large asteroid on a collision course with earth. It will definitely hit the earth in one week. The amount of energy released will melt the earth's crust, so there is no hope of any meaningful preparations. The world is doomed. You are the only one with access to this information. If this information becomes widely known, the most likely outcome is world-wide panic and all the associated suffering that would entail. Might you hesitate a moment before telling someone what you have found? If so, how are your actions morally different from mine?
I'll agree that the distinction is not clear cut. But it seems to me that in one case you are simply "taking the fifth", and choosing not to communicate some information you have. "No comment", as a response to a question.
In the original case, you are actually bringing up a topic, and misleading the reader to believe that the two of you have shared values and beliefs. The point of your deception is to say, "all this other stuff might seem confusing, but on the fundamental issue that you feel most strongly, there is no conflict." Except, you only imply that, and it turns out that there is in fact a conflict after all.
When the misled people find out the truth, do they feel betrayed? I'd argue that your (hypothetical) readers would. Whereas, for the astronomer, they might be angry -- but they wouldn't feel betrayed. That would require the astronomer to proactively say something that (falsely) implied he had searched for an asteroid, but found nothing. In your case, you could have simply not mentioned the question of the origin of life. Or at the least, merely said that "evolution doesn't address the question of the origin of life", in a completely neutral tone.
I suppose it's the difference between a sin of omission and a sin of commission. It feels like there is more moral culpability in the latter.