Do you think it might also be tied up with this sort of thing? Most of the ethical content on this site seems heavily related to the approach Eliezer takes to FAI, which isn’t surprising.
Part of the mission of this site is to proselytize the idea that FAI is a dire issue that isn’t getting anywhere near enough attention. I tend to agree with that idea.
Existential risk aversion is really the backbone of this site. It drives the flow of conversation, and you see its influence everywhere. The point of being rational in the Lesswrongian sense is to avoid rationalizing away the problems we face every day, and to escape the human tendency to put off difficult problems until we are forced to confront them.
In any event, my main interest in this site is inextricably tied to existential risk aversion. I want to work on AGI, but I’m now convinced that FAI is a necessity. Even if you disagree with that, there are still going to be many ethical dilemmas coming down the pike as we gain more and more power to change our environment and ourselves through technology. There are many more ways to get it wrong than to get it right.
The core of it is this: someone is going to be making some very hard decisions in the relatively near future, and there will be serious roadblocks to progress if we don’t equip people with the tools they need to sort out new, bizarre, and disorienting ethical dilemmas. We have radical anti-aging, nanotech, and AGI to look forward to, to name only a few, and the ethical issues that come hand in hand with such technologies are immense and difficult to untangle. Very few people take these issues seriously; even fewer are actually trying to tackle them, and those who are don’t seem to be doing a good enough job. As I understand it, changing this state of affairs is a big motive behind lesswrong. Maybe lesswrong isn’t all that it should be, but it’s a valiant attempt, in my estimation.