As mentioned by others, it’s important to draw a distinction between having no information about the relative probability of two events, and saying they’re equally probable.
If you aren’t going to apply Occam’s razor here to eliminate the matrix possibility, don’t you also have to give equal weight to every other alternate possibility that is mutually exclusive with these two, no matter how complex? Each time you think of a new idea, it has to share probability mass with all the others, so the probability of each individual scenario approaches zero.
For example, what if our brains are controlling artificial bodies that exist outside of any matrix, but our sensory input is being edited to alter specific details we observe (such as making us unable to notice clues that this is happening)?
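The dilution argument can be sketched numerically. A minimal illustration (mine, not from the original comment), assuming a uniform prior over N mutually exclusive scenarios:

```python
# If N mutually exclusive hypotheses all get equal weight, each one's
# probability is 1/N, which shrinks toward zero as new ideas are added.
def equal_weight(num_hypotheses: int) -> float:
    """Probability of each of N equally weighted, mutually exclusive hypotheses."""
    return 1.0 / num_hypotheses

for n in (2, 10, 1000, 10**6):
    print(n, equal_weight(n))
```

With two hypotheses each gets 0.5; with a million, each gets one-millionth, which is the sense in which every isolated scenario becomes negligible.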