We mentioned both. My hope is that this letter is just an early draft, so my preference is to put in material to get comments. I personally think our situation is already extremely dangerous, so I’m more willing to mention things deemed dangerous. Finally, while it might have been a good idea initially to treat Roko’s basilisk as an information hazard to be ignored, that is no longer possible, so the marginal cost of mentioning it seems tiny.
>We mentioned both.
Did you, though? Besides Roko’s basilisk, the references to acausal trade seem vague, but to me they sound like the kind of thing that could easily make things worse. In particular, you don’t explicitly discuss superrationality, right?
>Finally, while it might have been a good idea initially to treat Roko’s basilisk as an information hazard to be ignored, that is no longer possible, so the marginal cost of mentioning it seems tiny.
I agree that, given how widespread the idea of Roko’s basilisk is, it matters relatively little overall whether it is mentioned, but I think this applies similarly in both directions.
No mention of superrationality, although we do make references to how decision theory might work out, which I think implies we are going beyond a simple model of game-theoretic rationality. Acausal trade was a hard one to write about because I wanted the letter to be understandable to lots of people, and acausal trade isn’t something lots of people understand, compared to, say, the idea that this might all be a simulation.
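For anyone unfamiliar with the distinction being drawn here, a minimal sketch of how a superrational analysis of a one-shot Prisoner’s Dilemma departs from the standard dominant-strategy analysis (this is only an illustration; the payoff numbers are made up and nothing like it appears in the letter):

```python
# Illustrative sketch: a one-shot Prisoner's Dilemma with made-up payoffs,
# comparing the standard game-theoretic recommendation with the superrational one.

# Payoff to "me" for (my_move, their_move); higher is better.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect
    ("D", "C"): 5,  # I defect, they cooperate
    ("D", "D"): 1,  # mutual defection
}

def dominant_strategy_choice() -> str:
    """Standard analysis: defecting is the best reply whatever the other player does."""
    best_replies = {
        their_move: max(("C", "D"), key=lambda m: PAYOFF[(m, their_move)])
        for their_move in ("C", "D")
    }
    assert set(best_replies.values()) == {"D"}  # "D" dominates
    return "D"

def superrational_choice() -> str:
    """Superrational analysis: assume the other player reasons exactly as I do,
    so we end up making the same move; pick the best symmetric outcome."""
    return max(("C", "D"), key=lambda m: PAYOFF[(m, m)])

if __name__ == "__main__":
    print("Dominant-strategy choice:", dominant_strategy_choice())  # D
    print("Superrational choice:    ", superrational_choice())      # C
```

The only substantive difference is that the superrational agent treats the other player’s decision as correlated with its own, which is the sort of step beyond a simple game-theoretic model that the letter’s decision-theory remarks gesture at.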