I didn’t downvote this post, but I can’t say I endorse seeing more posts like it. The concept of this post is one of the least interesting in a huge conceptspace of decision theory problems, especially decision theory problems in an ensemble universe. To focus on ‘having faith’ and ‘rationality’ in particular might seem clever, but it fails to illuminate in the way that, e.g., Nesov’s counterfactual mugging does. When you start thinking about the various things simulators might do, you’re probably wrong about how much measure any given set of simulations will take up. This is especially so once you consider that a superintelligence is extremely likely to arrive before brain emulations, and that a superintelligence is almost assuredly not going to run simulations of the kind you specify.
Instead of thinking “What kind of scenario involving simulations could I post to Less Wrong and still be relevant?”, as it seems to me you did, it would be much better to ask the more purely curious question “What is the relative power of optimization processes that would cause universes that include agents in my observer moment reference class to find themselves in a universe that looks like this one instead of some other universe?” Asking this question has led me to some interesting insights, and I imagine it would interest others as well.
Actually, I didn’t try to be relevant or interesting to the LW community. I’m just currently genuinely very interested in the kinds of questions this post was about, and selfishly thought I’d get very useful criticism and comments if I posted here like this (as indeed happened).
Getting downvoted so much is something I, for some reason, enjoy very much :) It probably has to do with my thinking that while there are valid points on which my post and my decision to post it can be criticized, the downvoters, instead of seeing those valid reasons to dislike what I did as grounds to downvote, probably just had a knee-jerk reaction to religion as a topic (probably even suspecting that I hold religious views that I don’t actually have). If this suspicion is true, I will have demonstrated a form of irrationality that is somewhat widespread within the LW readership.
Then again, maybe the typical voting LW member is smarter than I thought, and they actually did notice that I was largely just looking for criticism and comments useful to me, rather than formulating my thoughts further on my own and perhaps later posting something more polished on LW, with a somewhat different focus.
“I’m just currently genuinely very interested in the kinds of questions this post was about, and selfishly thought I’d get very useful criticism and comments if I posted here like this (as indeed happened).”
I’ve found it hard to avoid doing this kind of thing. Luckily I have people at the Singularity Institute to discuss this kind of hypothesis with. If you post it to the Open Thread, it will as likely as not be ignored, and if you make a top-level post about it, it will as likely as not be downvoted (though at least you’ll get feedback). Perhaps it would be best if a lot of Less Wrongers made blogs and advertised them in a top-level post, using some sort of endorsement method? I’ve thought about writing my own blog before, but it’d be annoying to have to ask a lot of people to check it out or subscribe. If a lot of LWers did it, though, it wouldn’t be nearly as annoying, and posts like the one you wrote could still get feedback without taking up space in the minds of the many Less Wrong readers who don’t care about the decision theory of simulations.
“Getting downvoted so much is something I, for some reason, enjoy very much :) It probably has to do with my thinking that while there are valid points on which my post and my decision to post it can be criticized, the downvoters, instead of seeing those valid reasons to dislike what I did as grounds to downvote, probably just had a knee-jerk reaction to religion as a topic (probably even suspecting that I hold religious views that I don’t actually have).”
This totally does not work unless you have a way of discovering evidence that distinguishes between the two hypotheses, and I don’t think you have such a method. Commenters are more likely to have sophisticated reasons for disagreeing than the average LW lurker, who sees the word ‘faith’ in anything but a totally negative light and immediately downvotes, so posting this gave you little evidence. The downvotes are most likely to come from both the least and the most sophisticated of LWers: the least because they’re allergic to anything to do with religion, the most because they’re allergic to hypotheses that fail to carve reality at its joints. If I were going to downvote the post, it’d be because of the latter, but I still don’t know what the median reason for a downvote would be.
“the downvoters probably just had a knee-jerk reaction to religion as a topic (probably even suspecting that I hold religious views that I don’t actually have)”
With some exceptions when the poster is well known here, my impression is that posts and comments on the topic of religion do get treated this way on a regular basis.
In my experience, bringing up religion to make a point is often a bad call (or so community norms suggest): politics is the mind-killer, and alienating people for no good reason is generally frowned upon. And bringing up religion as a topic of discussion in its own right is often done by those who want to bash religion for reasons that aren’t sophisticated enough.