Okay, so that’s the Doomsday Argument then: since being able to conquer the universe implies we’re one-in-10^70 special, we must not be able to conquer the universe.
Calling the converse of this an arcane meta-argument about probability hardly seems fair. You can make a case for Doomsday, but it’s not a non-arcane one.
Perhaps this is hairsplitting, but the principle I am employing is not arcane: it is that I should doubt theories which imply astronomically improbable things. The only unusual step is to realize that theories with vast future populations have such an implication.
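To see where a 10^70-ish factor comes from, here is a sketch of the standard SSA-style arithmetic, with illustrative numbers (the 10^11 count of humans born so far is the usual rough estimate; the 10^80 total is a hypothetical universe-conquest future):

```latex
% Under SSA, your birth rank r is uniform on {1, ..., N} given a
% total population N. We observe r <= 10^{11}.
\begin{align*}
P\bigl(r \le 10^{11} \mid N = 10^{11}\bigr) &= 1 \\
P\bigl(r \le 10^{11} \mid N = 10^{80}\bigr) &= \frac{10^{11}}{10^{80}} = 10^{-69}
\end{align*}
% So "we are this early" favours the small-population theory by a
% likelihood ratio of roughly 10^{69}.
```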
I am unable to state what the SIA counterargument is.
In the theory that there are astronomically large numbers of people, it is a certainty that some of them came first. The probability that YOU are one of those people is equal to the probability that YOU are any one of the other people. However, the theory does define a certain narrow equivalence class that you happen to be a member of.
It’s a bit like the difference between theorizing A) that, given that you bought a ticket, you’ll win the lottery, and B) that, given that the lottery folks gave you a large sum, you had the winning ticket.
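With toy numbers for the analogy (a 1-in-10^5 lottery is an assumption purely for illustration):

```latex
% A is a prediction that something improbable will happen to you:
P(\text{you win} \mid \text{you bought a ticket}) = \frac{1}{10^{5}}
% B is an inference from evidence that it already has:
P(\text{you held the winning ticket} \mid \text{they paid you a large sum}) \approx 1
```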
That’s not the “SIA counterargument”, which is what I want to hear (in a compact form that makes it sound straightforward). You’re just saying “accept the evidence that something ultra-improbable happened to you, because it had to happen to someone”.
I was only replying to the first paragraph, really. Even under the SSA there’s no real problem here. I don’t see how the SIA makes matters worse.
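(For reference, the usual sketch of how SIA is said to cancel the update, written out in symbols; this is the textbook algebra, not a claim made by either side above:)

```latex
% SIA: a hypothesis with N observers gets prior weight proportional to N.
P(N \mid \text{you exist}) \propto N \, P(N)
% Given N, your birth rank r is uniform: P(r \mid N) = 1/N. Combining:
P(N \mid r) \propto N \, P(N) \cdot \frac{1}{N} = P(N)
% The factors of N cancel, so your early rank carries no information
% about N and the Doomsday update disappears.
```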
Right. That’s arcane. Mundane theories have no need to measure the population of the universe.
But it’s still a simple idea once you grasp it. I was hoping you could state the counterargument with comparable simplicity. What is the counterargument, at the level of principles, that neutralizes this one?