Let us suppose the day comes when everyone in the rationalist community is either too sceptical, disagreeing with some aspect of the story, or too confident that they have read and understood whatever grand insight the story contains.
You’ve made this argument (in the context of the story) a meta-level point, and that is a much harder question to settle with facts about people. If I write this essay as though it were about an EA project that is not well understood, that is not a valid argument for the project, because we need to be careful to test different sets of assumptions. And even if we could establish that the project’s success will one day be beyond doubt, testing would still be impossible; so I am writing this essay as though it were about an EA group.
If the EA project is successful, that suggests I would be persuaded by its success; but that is not really my view.
You’ve made a good point here, but what if you don’t believe it’s true? What are you trying to explain to your opponent, and how do you think the EA community should move forward?
I think the answer is probably to look around EA’s website, or to draw on your own personal perspective. There are a bunch of links that appear to show that whatever you’re doing is pretty good, and it does seem to be, well, good.
Was this comment actually just run through GPT-2?
It’s interesting that I got that impression from the same post I linked earlier, but I’m not sure how the post was intended to fit it. (I’m guessing it’s intended as a descriptive example, but I’m less familiar with the rest of the community here.)
The funniest part is that I have a friend who really talks like this. We often listen intently to what he has to say, trying to parse any meaning from it, but it seems he’s either too far beyond us or simply mad (I’d say both). Guess I have a new nickname for him.
I don’t think I’m particularly proud of our positive commentary on people. I think it’s a common complaint, but it’s a problem I couldn’t handle, because it comes from people who don’t understand the rationality of the rationalist community.
The first paragraph of my article seems to claim that this community doesn’t exist and isn’t worth looking into. What’s wrong with that? I know it came from someone who understood a lot about the rationality communities, but I don’t know what it’s like for them; for the claim to make sense you have to make that assertion, and a claim resting on true but unverifiable reasons has never given anyone grounds to trust it.
It seems the biggest barrier to appreciating the extent of what you’re doing to yourself, and to your interactions with your friends and family, is the cost to your ability to participate in them in such an environment.
Is the fault actually yours, or is it other people’s problem?
(I’m not sure how well I understand the issue, but this seems just as true to me, and in any case it fits my mildly cynical view.)