As an atheist, your prior was low that the Christianity book would convince you, but as a Zoroastrian, your prior is now high that the Christianity book would convince you? I’m saying that you seem to have changed your opinion about the books.
I can add this to the initial conditions: the AI must pick a valid argument for any proposition it argues.
(Can’t find SMBC Delphic comic. Looking.)
in general minimize the number of logical errors / tricks used.
It’s arguing for false propositions. You can specify a “Sudden volcanic eruption sufficient to destroy every island in Indonesia that minimizes harm to humans”, but don’t be surprised if a few people are inconvenienced by it, considering what the minimum requirements to meet the first condition are.
I see now. No, my P(any book on X will convince me of X) is high, for all X. P(religion X is true) is low for all X, except X I actually believe in.
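The distinction here can be put in Bayesian terms. A minimal sketch with made-up numbers: if the book would convince you almost regardless of whether X is true, then being convinced barely moves P(X is true) off its low prior.

```python
# Illustrative numbers only -- all three probabilities are hypothetical.
p_true = 0.01                 # prior: P(religion X is true)
p_conv_given_true = 0.95      # P(book convinces me | X is true)
p_conv_given_false = 0.90     # P(book convinces me | X is false) -- almost as high

# Bayes' rule: P(X is true | the book convinced me)
posterior = (p_true * p_conv_given_true) / (
    p_true * p_conv_given_true + (1 - p_true) * p_conv_given_false
)
print(posterior)  # barely above the 0.01 prior
```

Because the book is persuasive either way, conviction carries almost no evidence about truth.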
in general minimize the number of logical errors / tricks used.
For a true proposition, it should be possible to bring the number of errors to zero. For all else, use as few as possible (even if that means thousands). It’s probably a good policy anyway, as I originally claimed.
There are a few hundred people in deep caves on the Anatolian plateau who thank you for minimizing the force of the Indonesian caldera, sparing them and allowing them to attempt to continue the human race.
The magnitude of the wrongness isn’t really an issue. The point was that with the rule that “real arguments have to be used when available”, he can think that the book he just read convinced him with real arguments.
I was wrong about the importance of this factor.