True. But if I assumed that I was correct, there really wouldn’t be much point in my being here.
Why not? I’m here to improve my rationality; I suppose you are here for the same reason. Would you dismiss Newtonian physics once you discovered that Newton had irrational mystical beliefs?
Hence rationality. I do not consider the rational/irrational line to be fuzzy. Our vision of it is certainly fuzzy, but I think the art of rationality depends on there being an Answer. Am I wrong in this thought?
I tend to agree. But as far as my understanding of Bayesianism goes, even two Bayesian superintelligences starting off with different priors will disagree on what is rational (they will arrive at different beliefs). The question then is: is there one correct prior, and if there is, how do we find it?
The relevant question is how LessWrong plans to deal with irrational people that slip through the karma system.
I didn’t notice an answer to the primary question: “How do we deal with the irrational amongst us?” Did I miss it?
Who will be the judge of what is rational/irrational? I have seen perfectly rational posts downvoted into negative numbers while irrational ones have been upvoted. From my observation, points are really given by the heuristic “does this comment conform to my point of view?” And as more people join LW, the average level of rationality here will approach the average in the general population. So hoping that people will vote correctly is wishful thinking.
If I assumed I was correct, I would be going to other people who believed the way I did and learning from them. I don’t assume I am correct, so I try learning from people who believe differently and seeing what sticks.
I thought there was a way to deal with this? I could be wrong, and I haven’t read the relevant articles; I just remember people talking about it.
I am not sure I agree with this use of “rational”. I would expect these two superintelligences to be able to explain their priors and see that the other has arrived at a rational conclusion given those priors.
What I am talking about is someone who is arriving at an obviously irrational conclusion.
Okay, let me reword the question: “How do we deal with the obviously irrational among us?” I am not talking about people near the line. I am talking about people who are clearly irrational.
It sounds like you are saying, “Not with karma because people are not using it that way.” I agree.