It’s true that the question of God’s existence is epistemologically fairly trivial and doesn’t require its own category of justifications, and it’s also true that even many atheists don’t seem to notice this. But even with that in mind, it almost never actually helps in convincing people to become atheists (most theists won’t respond to a crash course in Bayesian epistemology and algorithmic information theory, but they sometimes respond to careful refutation of the real reasons they believe in God), which is probably why this point is often forgotten by people who spend a lot of time arguing for atheism.
It’s really epistemologically difficult to find out what people mean by God in the first place; how then can it be epistemologically trivial to judge the merits of such a hypothesis?
I strongly suspect that there is a lot of coherence among many different spiritualists’ and theologians’ conceptions of God, and I strongly suspect that most atheists have no idea what kind of God the more enlightened spiritualists are talking about, and are instead constructing a straw God made up of secondhand, half-remembered Bible passages. In general I think LW is embarrassingly bad at steel-manning.
Coherence isn’t a necessary factor for a good theory. In artificial intelligence it’s sometimes preferable to allow incoherence in order to gain robustness.
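One way that claim sometimes gets cashed out in practice (this is my own illustration with made-up models, not something spelled out in the comment): an ensemble of classifiers that systematically disagree with one another, and so cannot all be right, can still be more accurate under majority vote than a typical single member, because their errors don’t line up.

```python
# Sketch with hypothetical models: three mutually inconsistent classifiers,
# combined by majority vote, are typically more accurate than a typical single
# member on noisy inputs.

import random

random.seed(0)

def make_model(bias):
    """Hypothetical noisy threshold classifier with a systematic bias."""
    def predict(x):
        return 1 if x + bias + random.gauss(0.0, 0.5) > 0 else 0
    return predict

# Mutually inconsistent models: on many inputs they give conflicting answers.
models = [make_model(b) for b in (-0.4, 0.0, 0.4)]

def vote(x):
    """Majority vote over the disagreeing models."""
    return 1 if sum(m(x) for m in models) >= 2 else 0

inputs = [random.uniform(-1.0, 1.0) for _ in range(2000)]
truth = [1 if x > 0 else 0 for x in inputs]

def accuracy(predict):
    return sum(predict(x) == t for x, t in zip(inputs, truth)) / len(inputs)

single = [accuracy(m) for m in models]
print("single models:", [round(a, 2) for a in single])
print("average single:", round(sum(single) / len(single), 2))
print("majority vote:", round(accuracy(vote), 2))
```

The robustness here comes precisely from the members not being mutually consistent: a single “coherent” model carries its own bias everywhere, while the vote washes some of it out.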
Choosing good priors, though, isn’t epistemologically trivial.
Using the majority opinion of the human race as a prior is a general strategy that you can defend rationally.
Use it as a prior all you want; but then you have to update on the (rest of the) evidence.
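To make that two-step picture concrete (the numbers below are made up purely for illustration, not taken from anyone’s argument): start from a prior set by majority opinion, then apply Bayes’ rule once per piece of independent evidence. Even a fairly strong prior can be dragged well below 0.5 by a few observations that are each more expected if the hypothesis is false.

```python
# Sketch with made-up numbers: majority-opinion prior, then Bayesian updates.

def bayes_update(prior, likelihood_ratio):
    """One Bayes update in odds form.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | not hypothesis)
    """
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical starting point: most people endorse the hypothesis, so 0.8.
p = 0.8

# Hypothetical likelihood ratios, each below 1 (the evidence is more expected
# if the hypothesis is false), applied one piece at a time.
for lr in (0.5, 0.3, 0.2):
    p = bayes_update(p, lr)
    print(f"after evidence with likelihood ratio {lr}: posterior = {p:.3f}")
```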
Difficult to pin down, yes, but only within a range of positions that are each trivial to judge.
With, possibly, vanishingly rare exceptions.
If a given hypothesis is incoherent even to its strongest proponents, then it’s not very meritorious. It’s in “not even wrong” territory.
Could you expand on how allowing incoherence gives higher robustness in AI?