It has recently been suggested (by yourself) that:
Perhaps a better question would be “If my mission is to save the world from UFAI, should I expend time and resources attempting to determine what stance to take on other causes?” No matter your level of potential to learn multiple subjects, investing that time and energy into FAI would, in theory, result in a better outcome with FAI. That said, I am becoming increasingly aware that there are limits to how good I can be with subjects I haven’t specialized in, and if you think about it, you may realize that you have limitations as well.
It seems to me that the relevance of economic growth to FAI chances is closer to Eliezer’s area of expertise, influence and comparative advantage than the determination of laws controlling military technology is to anyone here. Why is it worth evaluating and expressing opinions on this subject?
(Personally I am happy to spend some time talking about such things for the same reason that I spend some time talking about the details and implications of time travel in HPMoR fanfiction.)
I do make that mistake sometimes; however, this is not one of those times:
A. Whether I am knowledgeable here isn’t very important (unlike in the context in which I wrote that comment).
I am not even advising people to agree on a particular strategy; I am spreading the word and getting them to think about it. Even if I tried to advise them, I don’t expect LessWrong would take my ideas at face value and blindly follow them. In this case, evaluating and expressing opinions on this subject serves the purpose of getting people to think. Getting people to think is important here because this particular problem is likely to require that a large number of people get involved in their own fate. They’re the ones who currently provide the checks and balances on government power. If they simply let the powerful decide amongst themselves, they may find that the powerful choose to maximize their power. Unfortunately, I don’t currently know of anyone who is qualified and trustworthy enough to advise them on what’s likely to happen and which method is likely to succeed, but at least stirring up debate and discussion will get them thinking about this. The more people think about it now, the more likely they are to have a decently well-informed opinion and make functional choices later on. My knowledge level is adequate for this particular purpose.
B. Why should I specifically do this? Several reasons, actually:
Nobody else is currently doing it for us:
There are no parties of sufficient size that I know of who are taking responsibility for spreading the word on this to make sure that a critical mass is reached. I’ve scoured the internet and not found a group dedicated to this. The closest we have, to my knowledge, is Suarez. Suarez is an author, and he seems bright and dedicated to spreading the word. I’m sure he’s done research and put thought into this, and he is getting attention, but he’s not enough. This cause needs an effort much larger and far better researched than one guy can pull off.
I “get it”, but not everyone does.
My areas of knowledge are definitely not optimal for this and I have no intentions of dedicating my life to this issue, but as a person who “gets it”, I can perhaps convince a small group of relevant people (people who are likely to be interested in the subject) to seriously consider the issue. As we have seen, I have a greater understanding of this issue than some of the posters: I am explaining things like how land mines are not even comparable to killer robots in terms of their potential to win wars or wreck democracy. Somebody who “gets it” needs to be around to explain these kinds of things, or there may not be enough people in the group who “get it”. I am mildly special because I “get it” and am willing to discuss this so that other people “get it” too.
I am aware of this risk sooner than they are.
Perhaps most important: I am aware of this risk sooner. (Explained in my next point.)
C. What I am doing is actually much bigger than it looks.
I’ve seen the LessWrong Google Analytics. Some posts have accumulated 200,000+ visits over time. As I understand it, word spreads in an exponential fashion. Therefore, the more people who know about this in the beginning, the more people will know about it later. Even if this post got only 1,000 reads, entering those 1,000 reads at the beginning of the exponential growth curve is likely to result in many, many times as many people knowing about this. My post could, over the years, result in millions of people finding out about this sooner.
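To make that “seeding the curve” idea concrete, here is a rough, purely illustrative sketch in Python; the growth rate and time horizon are numbers invented for the example, not estimates of how ideas actually spread:

```python
# Toy model: if awareness grows by a fixed percentage each period, the number
# of people eventually reached scales linearly with the initial seed.
# growth_rate and periods are invented for illustration, not measured values.

def people_reached(initial_seed, growth_rate=0.1, periods=100):
    """People who have heard the idea after `periods` steps of compounding."""
    return initial_seed * (1 + growth_rate) ** periods

if __name__ == "__main__":
    for seed in (10, 1_000):
        print(f"seed {seed:>5,}: ~{people_reached(seed):,.0f} people reached")
    # With these toy numbers, 1,000 early readers end up reaching about 100x
    # as many people as 10 early readers would: final reach is proportional
    # to the size of the seed.
```

Real spread obviously saturates at some point, but the qualitative point survives: the earlier and larger the seed, the more people end up hearing about it sooner.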
It only takes a relatively small investment for me to help spread the word about this and I view the benefits as being worth that investment.
Conclusion:
If you “get it” and you care about this risk, I urge you to do the same thing. Post about this on Facebook, on Twitter, on other forums—wherever you have the ability to get a group of people to think about this. The couple of minutes it takes to tell 20 people now could mean that hundreds of people find out sooner. If any of you decide to spread the word, comment. I’d like to know.
I perceive plenty of risks regarding future military technology that are likely to result in the loss of life and liberty. Among the dangers is that people with power will no longer require the approval (or insufficient disapproval) of other human participants to maintain their power. Another is the increased ease of creating extremely destructive weapons (including killer robots) without large-scale enterprise (e.g. with the 3D printers you mentioned).
This issue is not one I expect to have any influence over. This is a high-stakes game: a national security issue and an individual ‘right to bear arms’ issue rolled into one. It is also the kind of game where belief in doomsday predictions is enough to make people (or even a cause) lose credibility. To whatever extent my actions could have an influence at all, I have no particular confidence that it would be in a desirable direction.
Evangelism is not my thing. Even if it was, this wouldn’t be the cause I chose to champion.
This issue is not one I expect to have any influence over.
I don’t expect to have a large influence over it, but for a small investment, I make a small difference. You said once yourself that if your life could make even a minuscule difference to the probability that humanity survives, it would be worth it. And if a 1⁄4,204,800 fraction of my life makes a 0.000000001% difference in the chance that humanity doesn’t lose democracy, that’s worth it to me. Looking at it that way, does my behavior make sense?
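For concreteness, here is the back-of-the-envelope version of that trade-off; the 80-year lifespan is an assumption added purely for the arithmetic, and the probability figure is simply the one quoted above, not an estimate:

```python
# Back-of-the-envelope reading of the numbers above. The 80-year lifespan is
# an assumed figure for illustration; the probability shift is the one quoted.

MINUTES_PER_LIFE = 80 * 365.25 * 24 * 60   # ~42 million minutes in 80 years
cost_fraction = 1 / 4_204_800              # the fraction of a life quoted above
cost_minutes = MINUTES_PER_LIFE * cost_fraction
prob_shift = 0.000000001 / 100             # "0.000000001%" as a probability

print(f"time spent: ~{cost_minutes:.0f} minutes of an 80-year life")
print(f"claimed probability shift: {prob_shift:.0e}")
# Roughly 10 minutes against a 1e-11 shift: the standard expected-value
# argument that a tiny probability change can justify a tiny, fixed cost
# if the outcome at stake is valued highly enough.
```

In other words, under that assumed lifespan the quoted fraction works out to roughly ten minutes of a life.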
It is also the kind of game where belief in doomsday predictions is enough to make people (or even a cause) lose credibility.
Ok. I feel like you should be saying that to yourself—you’re the one who said you thought the 3-D printer idea would result in everyone dying. I think the worst thing I said is that killer robots are a threat to democracy. Did you find something in my writing that you pattern-matched to “doomsday prediction”? If so, I will need an example.
Evangelism is not my thing. Even if it was, this wouldn’t be the cause I chose to champion.
Spending 1⁄4,204,800 of my life to spread the word about something is best categorized as “doing my part”, not “championing a cause”. Like I said in my last comment:
“I have no intentions of dedicating my life to this issue.”
After considering the amount of time I spent on this and the clear statement of my intentions (or lack of intentions), do you agree that I was never trying to champion this cause and was simply doing my part, wedrifid?
Looking at it that way, does my behavior make sense?
I suggested that Eliezer’s analysis of economic growth and FAI is more relevant to Eliezer (in terms of his expertise, influence and comparative advantage) than military robot politics is to all of us (on each of the same metrics). To resolve the ambiguity there, I do not take the position that talk of robot killers is completely worthless. Instead I take the position that Eliezer spending a day or so analysing economic growth impacts on his life’s work is entirely sensible. So instead of criticising your behavior I am criticising your criticism of another behaviour that is somewhat similar.
Ok. I feel like you should be saying that to yourself—you’re the one who said you thought the 3-D printer idea would result in everyone dying.
I perceive a difference between the social consequences of replying with a criticism of a “right to bear automated-killer-robot arms” proposal in a comment and the social consequences of spreading the word to people I know (on Facebook, etc.) about some issue of my choosing.
I think the worst thing I said is that killer robots are a threat to democracy.
Yes. My use of ‘doomsday’ to describe that scenario is lax. Please imagine that I found a more precise term and expressed approximately the same point.
After considering the amount of time I spent on this and the clear statement of my intentions (or lack of intentions), do you agree that I was never trying to champion this cause and was simply doing my part, wedrifid?
Please note that the quote that mentions ‘championing a cause’ was explicitly about myself. It was not made as a criticism of your behavior. It was made as a direct, quote-denoted reply to your call for readers (made in response to myself) to evangelise to people we know on ‘Facebook, Twitter and other forums’. I was explaining why I do not choose to do as you request even though by my judgement I do, in fact, “get it”.
Taking a stance and expressing concern about something that isn’t a mainstream issue comes with a cost. Someone who is mainstream in all ways but one tends to be more influential when it comes to that one issue than someone who has eccentric beliefs in all areas.
So instead of criticising your behavior I am criticising your criticism of another behaviour that is somewhat similar.
Oh okay.
I perceive a difference between the social consequences of replying …
I see. I thought you were making some different comparison.
Yes. My use of ‘doomsday’ to describe that scenario is lax. Please imagine that I found a more precise term and expressed approximately the same point.
Okay. (:
Please note that the quote that mentions ‘championing a cause’ was explicitly about myself.
Okay, noted.
I was explaining why I do not choose to do as you request even though by my judgement I do, in fact, “get it”.
I’m glad that you “get it” enough to see the potential benefit of spreading the word, even though you choose not to because you anticipate unwanted social consequences.
Taking a stance and expressing concern about something that isn’t a mainstream issue comes with a cost. Someone who is mainstream in all ways but one tends to be more influential when it comes to that one issue than someone who has eccentric beliefs in all areas.
Hahaha! Yeah, I can see that. Though this really depends on who your friends are, or which friend group you choose to spread the idea to.
At this stage, it is probably best to spread the word only to those whom Seth Godin calls “early adopters” (defined as people who want to know everything about their subject of interest, a.k.a. nerds).
This would be why I told LessWrong as opposed to some other group.