“most attempts will fail, but if we are allowed 3^^^3 attempts at a significant technological gain, we will expect to advance.” I think you need some sort of prior distribution over the number of attempts to make it analogous to the SIA doomsday argument.
And you need a similar prior distribution for Katja_Grace’s SIA argument, which makes it underdetermined as well. (My “no anthropic reasoning” paradigm is really starting to pan out!)
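A minimal numerical sketch of this point, assuming hypothetical values throughout (the per-attempt success probability, the two candidate attempt counts, and the priors below are all made up for illustration): with an even prior over the number of attempts, observing a success favors “many attempts,” but a prior that strongly favors “few attempts” reverses the conclusion. That is the sense in which the argument is underdetermined without a prior.

```python
import numpy as np

# Hypothetical numbers for illustration only.
p_success = 1e-6                      # assumed per-attempt chance of the gain
attempts = np.array([10**3, 10**9])   # two hypotheses: few vs. many attempts

# Likelihood of observing at least one success under each hypothesis
likelihood = 1 - (1 - p_success) ** attempts

for prior_many in (0.5, 1e-4):        # two different priors on "many attempts"
    prior = np.array([1 - prior_many, prior_many])
    posterior = prior * likelihood
    posterior /= posterior.sum()
    print(f"prior P(many) = {prior_many}: "
          f"posterior [few, many] = {np.round(posterior, 3)}")
```

With `prior_many = 0.5` the posterior concentrates on “many attempts” (roughly [0.001, 0.999]); with `prior_many = 1e-4` it flips to roughly [0.91, 0.09]. Same observation, opposite conclusion, driven entirely by the prior.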
James Miller is the author of this top-level post.
Thanks, that underscores my difficulty finding relevant details.
I think you’re saying the relevant detail here is the applicability of anthropic reasoning to the universe we actually live in: actually using the island argument doesn’t teach us as much about the real world as looking up historical data about islands does, and the SIA doomsday argument fails similarly in the face of real-world astronomy. Is this correct?