Pick your favourite philosophical claim. I'm sure there are very smart possible entities that believe it and very smart ones that don't.
If there's a philosophical claim that intelligent agents across the universe wouldn't display massive agreement on, then I don't think it's worth its salt. This principle can be used to eliminate a lot of nonsense from philosophy.
By this test, the question of which of anti-realism or weak realism is true is one we can eliminate: nothing about what we should do hangs on the answer. Whether strong realism is true seems substantive, because which answer holds does matter to our policy.