Hmm, I think you’re not going to get a lot of response to this as posted, for two reasons. First, it isn’t asking for something specific enough (instead just asking for people who consider themselves “the most debugged”). Second, you’ve used terminology that suggests, at least to me, that you have only a surface-level engagement with LW-style rationality: although “debugging” is sometimes a metaphor people here use for the process of clearing away their confusion, folks within the community rarely talk about degrees of debuggedness, so asking for folks who are “debugged” comes across as off-putting and outsiderish. That works against you, since you are asking people to trust that you have a project worth investing time in.
Sorry for the unsolicited advice, but I see you’re getting downvoted and wanted to try to explain why I think that’s happening.
Nod. I came here to say basically this and mostly endorse Gordon’s phrasing.
I’d add that we’ve specifically had bad experiences with people who didn’t quite grok LW-style rationality (and who used language somewhat similar to yours), so this is something many of us have more of an allergic reaction to than you might naively expect.
I wonder what those bad experiences might have been?
What could possibly happen, worst case?
I have already experienced what you call an “allergic reaction” here.
But isn’t that just the outcome of a fallacy:
“Just because the guy is not using typical LW language, he must have no clue about LW.”
The reason I was asking for debugged brains is that I had already experienced a similar negative reception to my posts here. I was hoping to find at least a few guys here who are REALLY free of fallacies, so they would take my request just for what it is, not for what they interpret or assume it is.
I admit, anticipating this certain kind of ignorance here made ME not invest too much time into this post, which turned out to be a boomerang. I will be back with more specifics.
But isn’t this kind of judgement a negative authority fallacy?
And: how do I prevent this in the future?
PS: No need to be sorry, thanks for your thoughts.
So to me this notion of finding folks “free of fallacies” sounds a bit strange, since a fallacy is not something a person has; rather, it’s a way of describing an action as failing to satisfy the requirements of some purpose. “Fallacy” is most often used in this sense for failures of (modal) logic, that is, reasoning that fails to satisfy the rules of modal logic. So someone’s reasoning may be said to be “free of fallacies”, but not their person.
I think this gets into other territory: fallacies exist mostly within the confines of modal logic, yet modal logic is insufficient to completely explain reality as we find it. In fact, one of the core ideas of LW is that traditional rationality, with modal logic (and failure to adhere to it) as its spearhead, is inadequate because it often ignores useful evidence. Hence LW-style rationality takes a more nuanced stance: fallacies are often only fallacies of logic, and may actually be expressions of correct probability updates. That is, although some fallacy would disqualify you from saying something is “wrong” or “right”, because it would invalidate the logic, it may instead be an expression of a valid probability update in response to new evidence. Or not. Fallacies are somewhat correlated with correct probabilistic reasoning, but not perfectly so (and probabilistic reasoning has its own set of fallacies not recognized within modal logic).
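To make the distinction concrete, here’s a minimal sketch (my own toy numbers and function names, not drawn from any LW post) of how a move that is fallacious in deductive logic, dismissing a claim because of its source, can still be a correct probability update:

```python
# Toy Bayes update: how much a source's assertion should move your credence.
# All probabilities here are invented purely for illustration.

def posterior(prior, p_assert_if_true, p_assert_if_false):
    """P(claim is true | source asserts it), via Bayes' theorem."""
    joint_true = prior * p_assert_if_true
    joint_false = (1 - prior) * p_assert_if_false
    return joint_true / (joint_true + joint_false)

prior = 0.5  # credence in the claim before anyone speaks

# A reliable source mostly asserts true things: their word is evidence.
print(posterior(prior, 0.9, 0.1))   # ~0.90

# An unreliable source asserts true and false things alike: no evidence.
print(posterior(prior, 0.5, 0.5))   # 0.50

# A source who mostly asserts falsehoods: lowering your credence is
# "ad hominem" as logic, but a valid update as probability.
print(posterior(prior, 0.2, 0.8))   # 0.20
```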
Sorry for the lack of links on this; it’s stuff that’s covered in the Sequences, but where exactly I’m not sure. I’ll gladly strong upvote a reply to this comment linking to relevant material on these points.
I don’t think it’s in single posts? Like, there’s the Robin Hanson post The Fallacy Fallacy, or JGWeissman’s Catchy Fallacy Name Fallacy, but those are mostly about “here are specific issues with focusing on fallacies” as opposed to “and also here’s how Bayesian epistemology works instead.” If I were to point to a single post, it might be Science Isn’t Strict Enough, which of course is about science instead of about logic, and is doing the “this is how this standard diverges from what seems to be the right standard” argument but in the opposite direction, sort of.
Firstly, let me say that I think bringing rationalism to the masses is a great idea. The best we have so far is HPMoR, so that should be the standard to try to improve upon.
Secondly, it is a very difficult task, as you are aware. That means my prior for any individual succeeding at this would be very low, even if I’d seen lots of evidence that they have the kind of skill set required. If I hadn’t read HPMoR I would have put a low expectation on Eliezer managing it; he himself says he would have put only a 10% chance on the kind of success it has achieved.
If I have yet to witness that individual’s skills then my prior is tiny, and I need a lot of evidence to suggest that they are capable. I think this is what you’re seeing when you perceive a judgment from negative authority: I’m not saying you can’t do it, only that I want more evidence before I believe that you can.
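To put rough numbers on that reasoning, here’s a minimal sketch (the priors and likelihood ratio are invented purely for illustration) of why a tiny prior needs a lot of evidence to move:

```python
# Why a tiny prior needs a lot of evidence: odds form of Bayes' rule.
# All numbers are invented for illustration.

def update(prior, likelihood_ratio):
    """Posterior probability after one piece of evidence."""
    odds = prior / (1 - prior)   # convert probability to odds
    odds *= likelihood_ratio     # multiply by the strength of the evidence
    return odds / (1 + odds)     # convert back to probability

strong_evidence = 10  # evidence 10x more likely if the person is capable

# Known skilled author: prior 5% of pulling off a very hard project.
print(update(0.05, strong_evidence))    # ~0.34

# Unknown poster: prior 0.1%. The same evidence barely moves it.
print(update(0.001, strong_evidence))   # ~0.01
```

Same evidence, very different posteriors; hence “I want more evidence before I believe that you can.”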
***
With your last post I think you were doing the right thing: putting your ideas out there and seeing what happens. Then, if you’ve got it right, people will start believing in your project more. I think where you went wrong on your last post was how you updated on the feedback you received. 2 hypotheses:
1. You are right and the community is full of people who don’t realise it
2. There are some issues which you were wrong about or stylistic choices which were unhelpful
I think the evidence favours option 2, and that you would do well to modify what you’ve done based on the feedback.
If you are still convinced of option 1 then it’s up to you to persuade the community why it is wrong. For ChristianKl’s comment you could write the page of the proposed site where you give the evidence he requests. Reading between the lines I suspect that he disagrees with what you’ve said and that is why he wants you to provide the evidence, rather than purely that this would be the norm for LW. For my or Elo’s comments you could persuade us that it really is as bad as you say.
***
In the future, my advice to you would be:
Start small—what individual bias do you think you could explain best? How would you explain just that 1 small thing as simply and engagingly as possible?
Use the site questions feature—if you want examples from the community just ask the question without any commentary on who is/isn’t debugged etc.
Bucky, man, you made my day!
Finally someone contributing!
Without your post I was about to commit some “you don’t shit where you eat” kind of post here...
I will later address your comments in detail.
For now:
Is LW the only community about rationality?
How about CFAR or EA? I couldn’t find any forums there.
The folks here seem not ready, willing, or able to APPLY their wisdom. (No, that’s not an applause light, that is sarcasm!)
BTW, rationality in the narrower LW sense is not enough.
I also deal with emotional intelligence. Basically, the reason for irrationality is emotions trumping reason.
Is there any movement with the aim of spreading rationality?
Who should I talk to? I mean, I can’t be the only guy on the planet!
Is there any step by step guide to rationality online?
I know clearerthinking.org but that’s not it yet.
Do you maybe know some psychological community / forum?
Do you know “Predictably Irrational” by Dan Ariely?
I have my own story of how I developed a rational mind, starting in the 1980s.
Who could be interested in that?
What/who is the spearhead/figurehead of any rationality and emotional intelligence movement (which are in essence about the same thing)?
I am asking because, until I find this person/organization, I feel like I am this person, and that feels weird.
Hope to stay in touch.
AFAIK there isn’t a specific movement whose core aim is the spread of rationality. I can’t speak for anyone else, but my impression is that this kind of rationality is most likely to spread organically rather than from one big project. There are lots of communities which are working on rationality-related projects and will welcome in whoever is interested. People here are more than happy to apply their rationality, just not necessarily to the project which you are prescribing. This is a rational response if they have a low expectation of success.
My issue here is that, from witnessing your interactions so far, I don’t have very high expectations of your own personal emotional intelligence. Criticism of your ideas often seems to be met with hostility, exasperation, and accusations of fallacies. Even if your ideas are correct, this seems like a great way of alienating those who you are asking for help. One of the key tenets of LW-style rationality vs traditional rationality is dealing with the world as it is, not as we think it should be, and I don’t feel like you’re doing that.
Again, I could be wrong about this but the impressions that you give are key to getting people to co-operate with you.
I can understand your excitement at finding a community which represents some of the things where you’ve previously felt that you’re on your own. However, I think you would be wiser to take stock and learn before you attempt a project as ambitious as the one you are suggesting.
To make things easy:
What’s the quickest way for me to prove myself?
I suspect you have more learning to do before you really get LW rationality as G Gordon Worley III describes, so it might be better to get a handle on all this first.
Fair enough.
Could you maybe point me to other helpful sites?
What I am looking for in particular is a forum / community dealing with not only rationality but also emotional intelligence / empathy / self-reflection / brain debugging / reason vs. emotion in a less theoretical/academic/abstract and more practical way than LW.
What I am looking for is an online step by step guide to rationality that starts from zero and is somewhat easier to access than LW / EY stuff.
I know https://yourlogicalfallacyis.com, but its focus is too narrow, and there is no community.
I finally found this:
https://forum.effectivealtruism.org/posts/Z9rLfMjTZ2X376izm/rationality-as-an-ea-cause-area
“the rationality community is still much less than what it could have been.”
This is exactly what I mean and I am looking to talk to people about that—hopefully WITHOUT first having to pass any “is this guy even worth talking to” tests… :-)
Thanks, Marcus.
I feel like we’re going over the same ground. I’m not sure there’s much more for me to add as I don’t know of any sites which I think would be the right match for you.
Why don’t you have a little more faith… :-)
I can only suggest you read my older post about Emo Climate Change.
You should be able to find at least 1-2 concepts you can’t find anywhere else...
This is the only thing I have online so far.
I can understand your impression, but just point me to anything clearly wrong that I wrote… :-)
Will take the time to give you more substantial input...
I’ve read it and commented on it already. You can refer to that comment for my thoughts.
Concepts which I can’t find elsewhere are only good if they are accurate/helpful, which I don’t believe they are.
I think in this case it is up to you to show that you’re right, rather than up to me to show you’re wrong.