It looks like false logic to me too, but I’m very aware that that is how many Christians “prove” their religion to be true. ‘The Bible says this historical/Godly event happened and this archeological evidence supports the account in the Bible, therefore the Bible must be true about everything so God exists and I’m going to Heaven.’ Which sounds very similar to ‘This is a part of what you say about your religion and it may be proved false one day, so your religion might be too.’
Is it okay to slip into the streams of thought that the other considers logic in order to beat them at it and potentially shake their beliefs?
Basically, the question here is whether you can use the Dark Arts with purely Light intentions. In the ideal case, I have to say "of course you can": if you know a method which you believe is more likely to cause your partner to gain true beliefs than false ones, you can use that method even if it involves techniques that are frowned upon in rationalist circles. In the real world, however, doing so is incredibly dangerous. First, you have to consider the knock-on effects of being seen to use such lines of reasoning: it could damage your reputation, or that of rationalists in general, in the eyes of those who hear you; it could also cause people to become more firmly attached to a false epistemology, making them more likely to simply adopt another false belief. You also have to consider that you run on hostile hardware; you could damage your own rationality if you aren't very careful about handling the cognitive dissonance. There are a lot of failure modes you open yourself up to when you engage in that sort of anti-reasoning, and while it's certainly possible to navigate through it unscathed (I suspect Eliezer has done so in his AI box experiments), I don't think it's a good idea to expose yourself to the risk without a good reason.
A separate but relevant point: everything is permissible, but not all things are good. Asking "is it okay to..." is the wrong question, and is likely to expose you to some of the failure modes of Traditional Rationality. You don't automatically fail by phrasing it like that, but once again it's an issue of unnecessarily risking mental contamination. The better question is "is it a good idea to..." or "what are the dangers of..." or something similar that voices what you really want answered, which should probably not be "will LWers look down on me for doing..." (After all, if something is a good idea but we look down on it, then we want to be told so, so that we can stop doing silly things like that.)
The framing of the first sentence gives me a desperately unfair expectation for the discussion inside HPMOR. I'm excited.