The phrasing of the question was quite specific: “Which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100?”
If I estimate a very small probability of either FAI or UFAI before 2100, then I’m not likely to choose UFAI as the disaster “most likely to wipe out greater than 90% of humanity before the year 2100” if I think there’s a solid chance of something else doing so.
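To make that selection logic concrete, here is a minimal sketch with made-up numbers (the probabilities below are purely illustrative, not my actual estimates): answering the survey question amounts to taking the argmax over one’s disaster probability estimates, so even a believer in AI risk will pick something else if any competing risk has a higher probability.

```python
# Hypothetical probability estimates of each disaster wiping out
# >90% of humanity before 2100 (illustrative numbers only).
disaster_probabilities = {
    "unfriendly AI": 0.02,  # small estimated chance of UFAI before 2100
    "pandemic": 0.10,
    "nuclear war": 0.07,
}

# The question asks for the single *most likely* disaster,
# i.e. the argmax over one's probability estimates.
most_likely = max(disaster_probabilities, key=disaster_probabilities.get)
print(most_likely)  # -> "pandemic", even if UFAI dominates among AI risks
```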
Consider that I interpreted the singularity question to mean “if you think there is any real chance of a singularity, then, in the case that the singularity happens, give the year by which you think it has 50% probability,” and answered 2350, while thinking that the singularity had less than a 50% probability of happening at all.
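In symbols, my reading was the following (the notation is mine, not the survey’s):

```latex
% Conditional median year, under my reading of the question:
P(\text{singularity by } 2350 \mid \text{singularity happens}) = 0.5,
\quad\text{while}\quad
P(\text{singularity happens at all}) < 0.5
```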
Yes, Yvain did say to leave it blank if you don’t think there will be a singularity. But given the huge uncertainty in anyone’s prediction about the singularity or any related question, I took “don’t believe it will happen” to mean that my estimated chance was low enough not to be worth reasoning about the case where it does happen, rather than that my estimate was below 50%.