The first association I have with your username is “spams Open Threads with not really interesting questions”.
Note that there are two parts in that objection. Posting a boring question in an Open Thread is not a problem per se—I don’t really want to discourage people from doing that. It’s just that when I open any Open Thread, and there are at least five boring top-level comments by the same user, instead of simply ignoring them I feel annoyed.
Many of your comments are very general debate-openers, where you expect others to entertain you, but don’t provide anything in return. Choosing your recent downvoted question as an example:
How do you estimate threats and your ability to cope; what advice can you share with others based on your experiences?
First, how do you estimate “threats and your ability to cope”? If you ask other people to provide their data, it would be polite to provide your own.
Second, what is your goal here? Are you just bored and want to start a debate that could entertain you? Or are you thinking about a specific problem you are trying to solve? Then maybe being more specific in the question could help you get a more relevant answer. But the thing is, your not being specific seems like evidence for the “I am just bored and want you to entertain me” variant.
By “transient” I mean that you mention a topic once and then never show any interest in it again. By “noise” I mean random pieces of text which neither contain useful information nor are interesting.
As I said before, I think it would be good if you get in the habit of trying to predict the votes that your posts get beforehand and then not post when you think that a post would produce negative karma.
One way to do this might be: whenever you write a post, keep it in a text file and wait a day. The next day, ask yourself whether there is anything you can do to improve it. If you feel you can improve it, do it.
Then you estimate a confidence interval for the karma you expect your post to get and take a note of it in a spreadsheet. If you think it will be positive, post your comment.
If you train that skill I would expect you to raise your karma and learn a generally valuable skill.
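The prediction-and-review workflow described above can be sketched as a small helper script. This is a hypothetical illustration: the log file, field names, and posting rule are my own choices, not anything the site provides.

```python
import csv
from datetime import date

LOG = "karma_predictions.csv"  # hypothetical running log, one row per comment

def record_prediction(draft_id, low, high, actual=""):
    """Append a predicted karma interval (and, later, the actual score) to the log."""
    with open(LOG, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), draft_id, low, high, actual])

def should_post(low):
    """Post only when even the low end of your predicted karma interval is non-negative."""
    return low >= 0

def calibration(history):
    """Fraction of (low, high, actual) records whose interval contained the actual karma."""
    if not history:
        return None
    hits = sum(1 for low, high, actual in history if low <= actual <= high)
    return hits / len(history)
```

Reviewing `calibration` over a few weeks of entries is the feedback loop: if far fewer than ~90% of your 90% intervals contain the actual karma, widen them.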
As I said before, I think it would be good if you get in the habit of trying to predict the votes that your posts get beforehand and then not post when you think that a post would produce negative karma.
This is the best advice. The trick to keeping high karma is to cultivate your discernment. Each time you write a post, assess its value, and then delete it if you don’t anticipate people appreciating it. View that deletion as a victory equal to the victory of posting a high-karma comment.
I would be concerned that you might end up posting whatever matches popular opinion rather than valuable or worthwhile ideas. (If the strategy includes the caveat that worthwhile ideas are worth posting even when they sound unpopular, then it is still a good one.)
Thank you for asking. I’ve been trying to figure out what to say to you, but couldn’t figure out quite what the issue is. One possibility in terms of karma is to bundle a number of comments into a single comment, but this doesn’t address how the comments could be better.
A possible angle to work on is being more specific. It might be like the difference between a new computer user and a more sophisticated one. The new user says “My computer doesn’t work!”, and there is no way to help that person from a distance until they say what sort of computer it is, what they were trying to do, and some detail about what happened.
Being specific doesn’t come naturally to all people on all subjects, but it’s a learnable skill, and highly valued here.
I think it’s that you post a lot of questions and not a lot of content. Less Wrong is predisposed to upvoting high-content responses. I haven’t had an account for very long, but I have lurked for ages. That’s my impression, anyways. I recognize that since I haven’t actually pulled comment karma data from the site and analyzed it, I could be totally off-base.
Maybe when you ask questions, use this form:
[This is a general response to the post]
and
[This is what is confusing me]
but
[I thought about it and I think I have the answer, is this correct?] or [I thought about it, came up with these conclusions, but rejected them for reasons listed here, I’m still confused]
EDIT: I just looked at your submitted history. You do post content in Main, apparently, but your posts seem to run counter to the popular ideas here. There is bias, and LessWrong has a lot of ideas deemed “settled.” Effective Altruism appears to be one, and you have posted arguments against it. I’ve also seen some of your posts jump to conclusions without explaining your explicit reasons. LWers seem to appreciate having concepts reduced as much as possible to make reasoning more explicit.
There is bias, and LessWrong has a lot of ideas deemed “settled.”
Any group has a lot of ideas that are settled. If you want to convince any scientifically minded group that Aristotle’s four elements are real, you have to clear a high bar to avoid getting rejected. If anything, LW allows a wide array of contrarian points.
LW’s second-highest-voted post is Holden’s post against MIRI, and it is contrarian to core ideas of this community in the same sense as a post criticizing EA is. The difference is that the post actually goes deep and makes a substantive argument.
I want to say that that’s what I was trying to imply, but that might be backwards-rationalization. I do have the impression that contrarian ideas are accepted and lauded if and only if they’re presented with the reasoning standards of the community. I’ll be honest: LW does strike me as far-fetched in some respects BUT I recognize that I haven’t done enough reading on those subjects to have an informed opinion. I’ve lurked but am not an ingrained member of the community and can’t give a detailed analysis of the standards. Only my impression.
AND I realize that this sounds defensive, and I know there’s no real reason for my ego to be wounded. I appreciate your input! I hope that my advice to Clarity wasn’t too far off the mark. I tried to be clear about my advice being based on impressions more than data.
EDIT: removed “biased,” replaced with “far-fetched.”
Obviously it has reasoning standards. They are much higher than the average person might expect, because that’s one of the goals of the community.
Bias was a poor word to use, and I retract my use of the term. I mean that as a relatively new participant, there are ideas that seem far-fetched because I have not examined the arguments for them. I admit that this is nothing more than my visceral reaction. Until I examine each issue thoroughly, I won’t be able to say anything but “that viscerally strikes me as biased.” Cryonics, for instance, is a conclusion that seems far-fetched because I have a very poor understanding of biology, and no exposure to the discussion around it. Without a better background in the science and philosophy of cryonics, I have no way of incorporating casual acceptance of the idea into my own conclusion. I recognize that, admit it, and am apparently not being clear about that fact. In trying to express empathy with a visceral reaction of disbelief, I misused the word “bias” and will be more clear in the future.
On the second point: I understand that there’s a cost to treating every post with the same rigor. Posts that are poorly reasoned, and come to potentially dangerous conclusions, should be examined more rigorously. Posts that are just as bad, but whose conclusions are less dangerous, can probably be taken less seriously. Even so...someone who makes many such arguments, with a mix of dangerous and less-dangerous conclusions, might see a lack of negative feedback as positive feedback. That’s an issue in itself, but newcomers wouldn’t be in a position to recognize that.
Cryonics is not a discussion that’s primarily about biology. A lot of outsiders want to either think that cryonics works or that it doesn’t.
On LW there is a current of thought that we don’t make binary judgements like that but instead reason with probabilities. So thinking that there is a 20% chance that cryonics works is enough for people to go out and buy cryonics insurance, because of the huge value cryonics has if it succeeds.
That’s radically different from how most people outside of LW think.
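The probabilistic reasoning in the comment above is just an expected-value calculation. A minimal sketch, with placeholder numbers rather than real credences or insurance costs:

```python
def expected_value(p_works, value_if_works, cost):
    """Expected net benefit of a bet under a point-estimate probability of success."""
    return p_works * value_if_works - cost

# Even a modest 20% credence can dominate the decision when the payoff is huge.
# Hypothetical figures: value 1,000,000 (arbitrary utility units), lifetime cost 50,000.
net = expected_value(0.2, 1_000_000, 50_000)  # positive, so the bet looks favorable
```

The binary thinker asks “does it work?”; the probabilistic thinker asks whether `p_works * value_if_works` exceeds the cost.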
Cryonics is not a discussion that’s primarily about biology.
Well, the biological aspect is “where exactly in the body is ‘me’ located”?
For example, many people on LW seem to assume that the whole ‘me’ is in the head; so you can just freeze the head, and feed the rest to the worms. Maybe that’s a wrong idea; maybe the ‘me’ is much more distributed in the body, and the head is merely a coordinating organ, plus a center of a few things that need to work really fast. Maybe if future science revives the head and connects it to some cloned/artificial average human body, we will see the original personality replaced by a more or less average personality; perhaps keeping the memories of the original, but unable to empathise with the hobbies or values of the original.
For example, many people on LW seem to assume that the whole ‘me’ is in the head; so you can just freeze the head, and feed the rest to the worms.
Whether you need to freeze the whole body or whether the head is enough is a meaningful debate, but it has little to do with why a lot of people oppose cryonics.
At this stage, I can see an argument for freezing the gut, or at least samples of the gut, so as to get the microbiome. Anyone know about reviving frozen microbes?
There is evidence for and against cryonics that I KNOW exists, but I haven’t parsed most of it yet.
If I come to the conclusion that cryonics insurance is worth betting on, I am not sure I can get my spouse on board. Since he’d ultimately be in charge of what happens to my remains, AND we have an agreement to be open about our financial decisions, him being on board is mandatory.
If I come to the conclusion that cryonics is worth betting on, I might feel morally obligated to proselytize about it. That has massive social costs for me.
I’m freaked out by the concept because very intelligent people in my life have dismissed the concept as “idiotic,” and apparently cryonics believers make researchers in the field of cryogenics very uncomfortable.
Basically, it’s a whole mess of things to come to terms with. The spouse thing is the biggest.
I think those concerns are understandable but the thing that makes LW special is that discourse here often ignores uncomfortable barriers of thought like this. That can feel weird for outsiders.
I looked at a few pages of your comment history to see if I could find a particularly horrible example to base an explanation on (entirelyuseless’s link is appropriate), but I was surprised to find that the vast majority of your comments had no karma rather than downvotes.
I’m not sure what you need to do to upgrade or edit out your typical comment. Possibly you could review your upvoted comments to see how they’re different from your usual comments.
In addition to what everyone else has said, here’s a useful article on how to ask smart questions. It’s talking about asking technical questions on support forums, but the matter generalises, especially the advice to make your best effort to answer it yourself, before asking it publicly, and when you do, to provide the context and where you have got to already.
while it isn’t necessary to already be technically competent to get attention from us, it is necessary to demonstrate the kind of attitude that leads to competence — alert, thoughtful, observant, willing to be an active partner in developing a solution.
Thanks, that article is incredible. I hope to see one that is about how to answer questions, and how to understand answers too! After reading, some contemplation on the matter, and some chance happenings upon information I feel is relevant to the issue, I believe I’ve changed a lot:
Recently a highly admired friend of mine said something along the lines of ‘I’ve never said anything that wasn’t intentional’. Whereas for me, most of what I say is unintentional, just observed. So this got me thinking pretty hard about these things. With that on my mind, I suppose I got the following sliver of personal development when I started looking up some podcasts to comfort myself the following day:
I’m vain. When I listen to things, personal development podcasts or not, I tend to look for what could be about me. I sampled the Danger and Play podcasts and like what I’ve heard. Inspired by the way he frames self-talk as interpersonal illocution, my mental landscape has changed steeply. One consequence of this has been that I’m no longer held captive to ‘believing’ the first thought or idea that comes to my head. Rather, it’s as if it’s just one mental subagent’s proposition, to be contested and such. I am now biased towards reserving my thoughts until a more complex stopping rule is met, such as concluding that a certain verbalisation would lead to a certain outcome (e.g. one that is emotionally positive, raises my anxiety to an optimal level, and/or is functional by way of interpersonal compliance), rather than letting something just spew from my mind.
Perhaps a precursor to this has been a general dampening of how seriously I’ve been taking my moral intuitions. I’ve contextualised them in terms of the fact that they are predated by evolutionary forces, context, and such. Approximately an expressivist position regarding moral language, championed sometimes by A. J. Ayer and the logical positivists, if I remember the Wikipedia page correctly... But even, say, an ingrained sense of helplessness then seems no longer to relate to entrenched circumstances, but liable to change depending on the path dependence of my memory—something influenced by the past, but continuously influenced by the ongoing present, even for older memories that are revisited and updated, reframed, etc.
Danger and Play is part of the ‘red pill’ ‘manosphere’ of content. Frequently the movement is derided as misogynistic. I can’t speak on that, since I reckon it would be heterogeneous in people’s attitudes towards women, and labelling a broad category critically is misleading (like labelling all Islamists as terrorists, by analogy). Some of my sticking points in gender and sexual relations seem to relate to underdeveloped learned optimism and growth mindset. It seems like some ‘red pill’ and related ‘seduction’ movements include elements that are antithetical to developing these:
To illustrate, the prominent RSD company often frames things in ways that don’t suggest negative things are situational and temporary, while making global judgements about negative things (e.g. ‘life is hard...’). That’s a recipe for learned helplessness, which may very well be good for their business model, combined with all the motivation they spew out. In fact, this observation probably holds for a number of motivational video channels as a way to keep people coming back. There are certainly exceptions—I remember one which started off with a quote from Albert Einstein that closely approximates a pithy summation of a growth mindset and learned optimism, but the details escape me.
One thing that really compels and reminds me to think in this reflective way is simply that a lot of my intuitions are really quite mean to myself. When that podcast instructed me to stand back and think of myself as another person, it suddenly seemed absurd to treat myself like that. I mean, if I find effective altruism compelling because its acts are nice to do, isn’t being nice to myself the most proximate, and therefore likely one of the easier and more reliable, niceties? In turn, it looks like that will lead to:
competence — alert, thoughtful, observant, willing to be an active partner in developing a solution.
The kind of attitude that makes for smarter questions...
Usually, your questions feel more suited for a general-purpose forum than the narrowly specialized set of interests commonly discussed here. (We do have “Stupid Questions” and “Instrumental Rationality” threads, but even those follow the same standards for comment quality as the rest of LW.)
Also, posting a dozen questions in succession may give users the impression that you’re trying to monopolize the discussion. Even if that’s not your intention, I would understand it if some users ended up thinking it is.
I would suggest looking for specialized forums on some of the topics that interest you, and using LW only for topics likely to be of interest to rationalists.
Many of your comments get downvoted, sometimes heavily. In every open thread you post a lot of questions, some of them completely off topic. A single good question in the open thread can give you 2-3 karma, but a single bad one can go down to −7 or less. So stop asking so many irrelevant questions and start contributing.
As a hard rule, when posting in the open thread, the ratio of your posts to posts by others should always stay below 1:3 (others might want to comment and suggest 1:4). You should post fewer than 1 in 4 of the posts in the open thread. Your posts often read like a stream of consciousness (I think you know this already), and you might be better off taking on board the idea of sitting on thoughts for a day or so and re-evaluating them for yourself before posting.
As a side note: the presentation of an idea can help its reception. We are still human, and do care about delicate wording on some topics.
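The hard ratio rule above is easy to check mechanically. A hypothetical sketch (the function name and the convention for an otherwise-empty thread are my own):

```python
def within_ratio(own_posts, others_posts, max_ratio=1/3):
    """True while your share of the thread stays at or below the suggested 1:3 line."""
    if others_posts == 0:
        return own_posts == 0  # nobody else has posted yet: hold off
    return own_posts / others_posts <= max_ratio

# e.g. 5 of your comments against 20 from others is 1:4, which is fine;
# 10 against 20 is 1:2, which exceeds the limit
```

Counting before posting, rather than after, is the point: the rule is a pre-commitment, not a post-mortem.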
Thanks. I do tend to sit on my ideas, or I like to post and update those posts or reply with reflections upon revisitations of those thoughts so that I and others can see how my thinking changes over time.
My ratio is only that high when there is a new open thread. Since I post in blocks—formulating several posts and then posting them when I next get a chance—it may appear early on that my ratio is high. But by the end of the month, I am certainly nowhere near that ratio.
I am continuously trying to improve my presentation. Unfortunately, to date I have received minimal specific feedback on how to do so. Sometimes I feel the stream-of-consciousness approach better illustrates the way I’m thinking about a certain thing.
It may well do, but illustrating the way you’re thinking about something isn’t necessarily a good goal here. Why should anyone else care how you happen to be thinking about something?
There may be special cases in which they do. If you are a world-class expert on something it could be very enlightening to see how you think about it. If you are just a world-class thinker generally, it might be fascinating to see how you think about anything. Otherwise, not so much.
It may be worth releasing the posts gradually over the course of the week so as not to make them look like a clump (and again, paying attention to that ratio). I agree that you seem to post in a chunk once a week, but it may serve better to spread out your posts.
Why is my karma so low? Is there something I’m consistently doing wrong that I can do less wrong? I’m sorry.
You use LW as a dumping ground for whatever crosses your mind at the moment, and that is usually random and transient noise.
Thanks. What counts as noise and what as signal to you, and what do you mean by transient?
If at the end of writing a post you think “I’m not sure where I was going with this anymore,” as in http://lesswrong.com/r/discussion/lw/mzx/some_thoughts_on_decentralised_prediction_markets/ , don’t publish the post. If you yourself don’t see the point in your writing, it’s unlikely that others will consider it valuable.
I second this. This is also a very important skill for work and personal emails, and anything having to do with social sites like Facebook.
Yes, LW does have reasoning standards. That’s part of what refining the art of human rationality is about.
What do you mean by “biased”? That LW is different from mainstream society in the ideas it values?
Do you think it’s a bias to treat a badly reasoned post that might result in people dying differently from a harmless badly reasoned post?
It’s not hard. IIRC people brought back to life microbes that had been frozen in permafrost for tens of thousands of years.
I understand that; I’m still not comfortable enough with the discussion about cryonics to bet on it working.
Do you have a probability in your head about cryonics working or not working, or do you feel uncomfortable assigning a probability?
A little of both, I think.
A large proportion of your comments seem very distracting and sort of off-topic for Less Wrong.
Thanks. Can I have an example which is either self-evident as distracting and off-topic or explain why it is?
This is a sufficiently evident example.
In addition to what everyone else has said, here’s a useful article on how to ask smart questions. It’s talking about asking technical questions on support forums, but the matter generalises, especially the advice to make your best effort to answer it yourself, before asking it publicly, and when you do, to provide the context and where you have got to already.
Thanks, that article is incredible. I hope to see one that is about how to answer questions, and how to understand answers too! After reading, some contemplation on the matter, and some chance happenings upon information I feel is relevant to the issue, I believe I’ve changed a lot:
Recently a highly admired friend of mine said something along the lines of ‘I’ve never said anything that wasn’t intentional’. Whereas for me, most of what I say is unintentional, just observed. So this got me thinking pretty hard about these things. It being on my mind, I suppose I got the following sliver of personal development when I started looking up some podcasts to comfort myself the following day:
I’m vain. When I listen to things, personal development podcasts or not, I tend to look for what could be about me. I sampled the Danger and Play podcasts and like what I’ve heard. Inspired by the way he frames self-talk as interpersonal illocution, my mental landscape has changed steeply. One consequence of this has been that I’m no longer held captive to ‘believing’ the first thought or idea that comes to my head. Rather, it’s as if it’s just one mental subagent’s proposition, to be contested and such. I am now biased towards reserving my thoughts until a more complex stopping rule is met, like concluding that a certain verbalisation would lead to a certain outcome (e.g. the conclusion is emotionally positive, raises my anxiety to an optimal level, and/or is functional by way of interpersonal compliance), rather than voicing something that just spews from my mind.
Perhaps a precursor to this has been a general dampening of how seriously I’ve been taking my moral intuitions. I’ve contextualised them in terms of the fact that they are predated by evolutionary forces, context, and such. Approximately an expressivist position regarding moral language, championed sometimes by A. J. Ayer and the logical positivists, if I remember the Wikipedia page correctly... but even, say, an ingrained sense of helplessness then seems no longer to relate to entrenched circumstances, but liable to change depending on the path dependence of my memory: something influenced by the past, but continuously influenced by the ongoing present, even for older memories that are revisited, updated, reframed, etc.
Danger and Play is part of the ‘red pill’ ‘manosphere’ of content. Frequently the movement is derided as misogynistic. I can’t speak to that, since I reckon it would be heterogeneous in people’s attitudes towards women, and labelling a broad category critically is misleading (like labelling all Islamists as terrorists, for analogy). Some of my sticking points in gender and sexual relations seem to relate to underdeveloped learned optimism and growth mindset. It seems like some ‘red pill’ and related ‘seduction’ movements include elements that are antithetical to developing these:
To illustrate, the prominent RSD company often frames things in ways that don’t suggest negative things are situational and temporary, while making global judgements about negative things (e.g. ‘life is hard...’). That’s a recipe for learned helplessness, which may very well be good for their business model, combined with all the motivation they spew out. In fact, this observation probably holds for a number of motivational video channels that want to keep people coming back. There are certainly exceptions: I remember one which started off with a quote from Albert Einstein that closely approximates a pithy summation of a growth mindset and learned optimism, but the details escape me.
One thing that really compels and reminds me to think in this reflective way is simply that a lot of my intuitions are really quite mean to myself. When that podcast instructed me to stand back and think of myself as another person, it just seemed absurd to treat myself like that. I mean, if I find effective altruism compelling because it’s nice to do, isn’t being nice to myself the most proximate, and therefore likely one of the easier and more reliable, niceties? In turn, it looks like that will lead to:
The kind of attitude that makes for smarter questions...
Usually, your questions feel more suited for a general-purpose forum than the narrowly specialized set of interests commonly discussed here. (We do have “Stupid Questions” and “Instrumental Rationality” threads, but even those follow the same standards for comment quality as the rest of LW.)
Also, posting a dozen questions in succession may give users the impression that you’re trying to monopolize the discussion. Even if that’s not your intention, I would understand it if some users ended up thinking it is.
I would suggest looking for specialized forums on some of the topics that interest you, and using LW only for topics likely to be of interest to rationalists.
Thanks. Do you have a suggestion for another forum you recommend I move to?
I don’t know much about topic-specific forums, but seeing as you like to ask frequent questions, Reddit and Quora come to mind.
Many of your comments get downvoted, sometimes heavily. In every open thread you post a lot of questions, some of them completely off-topic.
A single good question in the open thread can earn you 2-3 karma, but a single bad one can go down to −7 or lower. So stop asking so many irrelevant questions and start contributing.
As a hard rule: when posting in an open thread, the ratio of your posts to posts by others should always be below 1:3 (others might want to comment and suggest 1:4). You should post fewer than 1 in 4 of the posts in the open thread. Your posts often read like a stream of consciousness (I think you know this already), and you might be better off taking on board the idea of sitting on thoughts for a day or so and re-evaluating them for yourself before posting.
As a side note: the presentation of an idea can affect its reception. We are still human, and do care for delicate wording on some topics.
Thanks. I do tend to sit on my ideas, or I like to post and update those posts or reply with reflections upon revisitations of those thoughts so that I and others can see how my thinking changes over time.
My ratio is only that high when there is a new open thread. Since I post in blocks, formulating several posts and then posting them when I next get a chance, it may appear early on that my ratio is high. But by the end of the month, I am certainly nowhere near that ratio.
I am continuously trying to improve my presentation. Unfortunately, to date I have received minimal specific feedback on how to do so. Sometimes I feel the stream-of-consciousness approach better illustrates the way I’m thinking about a certain thing.
It may well do, but illustrating the way you’re thinking about something isn’t necessarily a good goal here. Why should anyone else care how you happen to be thinking about something?
There may be special cases in which they do. If you are a world-class expert on something it could be very enlightening to see how you think about it. If you are just a world-class thinker generally, it might be fascinating to see how you think about anything. Otherwise, not so much.
It may be worth releasing the posts gradually over the course of the week so as not to make them look like a clump (and again, paying attention to that ratio). I agree that you seem to post a chunk once a week, but it may serve better to spread out your posts.
Don’t buy into these comments too much. I’m glancing through them, and they’re much too critical. Listen to Nancy, if anyone.