The fact that a lot of people here think that being religious is irrational doesn’t mean that it actually is irrational.
True. But if I assumed that I was correct, there really wouldn’t be much point in me being here.
There are people here, including some at the very top by karma, who hold beliefs that are not religious but are clearly irrational, IMHO.
I think this is relevant to the post and the subject at hand. I am not sure I see why you brought it up, however.
What one human deems rational, another might deem irrational.
Hence rationality. I do not consider the rational/irrational line to be fuzzy. Our vision of it is certainly fuzzy, but I think the art of rationality depends on there being an Answer. Am I wrong in this thought?
I don’t see any problem with you having obtained a lot of karma here. Again, a lot of people (probably all) here have irrational beliefs; why should you be different? Aumann is considered irrational with respect to his religious beliefs, but no one denies his contributions to science.
The reason I bring up karma is that I see flaws in the karma system. These flaws are not so much that I happened to get some, but rather that karma has a limit in its ability to predict rationality. As in, it can’t. The relevant question is how LessWrong plans to deal with irrational people who slip through the karma system. The answer I feel coming from the community in the short time this post has been live is that it will be handled on a case-by-case basis. (No one has explicitly said this. I am reading between the lines.) I see no problem with that until LessWrong gets ridiculously large.
Religion is now considered a great example of irrationality, but only because of the current political and social context (being atheist is “in”). There are beliefs just as irrational that are simply not pointed out as vehemently.
Religion was chosen because I am religious. I completely agree with you.
I’m not going to read all of your post, but from what I understood, I will say the following:
[...]
Enough said.
Okay. If you are curious, I didn’t notice an answer to the primary question: “How do we deal with the irrational amongst us?” Did I miss it?
True. But if I assumed that I was correct, there really wouldn’t be much point in me being here.
Why not? I’m here to improve my rationality; I suppose you are here for the same reason. Would you dismiss Newtonian physics once you discovered that Newton had irrational mystical beliefs?
Hence rationality. I do not consider the rational/irrational line to be fuzzy. Our vision of it is certainly fuzzy, but I think the art of rationality depends on there being an Answer. Am I wrong in this thought?
I tend to agree. But as far as my understanding of Bayesianism goes, even two Bayesian superintelligences starting off with different priors will disagree on what is rational (they will arrive at different beliefs). The question then is: is there one correct prior, and if there is, how do we find it?
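To make that concrete, here is a minimal sketch of my own (the numbers are made up purely for illustration, and the binary-hypothesis update rule is just the simplest case): two idealized Bayesian agents use the same likelihood model and see the same evidence, but start from different priors and so end up with different posteriors.

```python
def update(prior, likelihood_if_true, likelihood_if_false):
    """One step of Bayes' rule for a binary hypothesis H."""
    numerator = likelihood_if_true * prior
    evidence = numerator + likelihood_if_false * (1 - prior)
    return numerator / evidence

# Hypothetical starting points: agent A begins at P(H) = 0.9, agent B at P(H) = 0.1.
p_a, p_b = 0.9, 0.1

# Both agents observe the same five pieces of evidence, each of which is
# twice as likely if H is true (0.8) as if H is false (0.4).
for _ in range(5):
    p_a = update(p_a, likelihood_if_true=0.8, likelihood_if_false=0.4)
    p_b = update(p_b, likelihood_if_true=0.8, likelihood_if_false=0.4)

print(p_a, p_b)  # roughly 0.997 vs 0.78: same evidence, same updating rule, different beliefs
```

Both agents update flawlessly; the disagreement comes entirely from the priors, which is what makes the “is there one correct prior?” question bite.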
The relevant question is how LessWrong plans to deal with irrational people who slip through the karma system.
I didn’t notice an answer to the primary question: “How do we deal with the irrational amongst us?” Did I miss it?
Who will be the judge of what is rational/irrational? I have seen perfectly rational posts being downvoted into negative numbers while irrational ones have been upvoted. From my observation, points are really given more by the heuristic “does this comment conform to my point of view?”. And when more people join LW, the average level of rationality here will approximate the average in the general population. So hoping that people will be able to vote correctly is wishful thinking.
Why not? I’m here to improve my rationality; I suppose you are here for the same reason. Would you dismiss Newtonian physics once you discovered that Newton had irrational mystical beliefs?
If I assumed I was correct, I would be going to other people who believed the way I did and learning from them. I don’t assume I am correct, so I try learning from people who believe differently and seeing what sticks.
I tend to agree. But as far as my understanding of Bayesianism goes, even two Bayesian superintelligences starting off with different priors will disagree on what is rational (they will arrive at different beliefs). The question then is: is there one correct prior, and if there is, how do we find it?
I thought there was a way to deal with this? I could be wrong and I haven’t read the relevant articles. I just remember people talking about it.
I am not sure I agree with this use of “rational”. I would expect these two superintelligences to be able to explain their priors and see that the other has arrived at a rational conclusion given those priors.
What I am talking about is someone who is arriving at an obviously irrational conclusion.
Who will be the judge of what is rational/irrational? I have seen perfectly rational posts being downvoted into negative numbers while irrational ones have been upvoted. From my observation, points are really given more by the heuristic “does this comment conform to my point of view?”. And when more people join LW, the average level of rationality here will approximate the average in the general population. So hoping that people will be able to vote correctly is wishful thinking.
Okay, let me reword the question: “How do we deal with the obviously irrational among us?” I am not talking about people close to the line. I am talking about people who are clearly irrational.
It sounds like you are saying, “Not with karma because people are not using it that way.” I agree.