[...] just rational things to do. The “heroic” part must stand for something, no?
I had always assumed it was intended to stand for doing things that are rational even if they’re really hard or scary and unanticipated.
If you do a careful cost-benefit calculation and conclude (depending on your values and beliefs) that …
… the biggest risk facing humanity in the nearish future is that of a runaway AI doing things we really don’t want but are powerless to stop, and preventing this requires serious hard work in mathematics and philosophy and engineering that no one seems to be doing; or
… most of the world’s population is going to spend eternity in unimaginable torment because they don’t know how to please the gods; or
… there are billions of people much, much worse off than you, and giving away almost everything you have and almost everything you earn will make the world a substantially better place than keeping it in order to have a nicer house, better food, more confidence of not starving when you get old, etc.
and if you are a normal person then you shrug your shoulders, say “damn, that’s too bad”, and get on with your life; but if you are infused with a sense of heroic responsibility then you devote your life to researching AI safety (and propagandizing to get other people thinking about it too), or become a missionary, or live in poverty while doing lucrative but miserable work in order to save lives in Africa.
If it turns out that you picked as good a cause as you think you did, and if you do your heroic job well and get lucky, then you can end up transforming the world for the better. If you picked a bad cause (saving Germany from the Jewish menace, let’s say) and do your job well and get lucky, you can (deservedly) go down in history as an evil genocidal tyrant and one of the worst people who ever lived. And if you turn out not to have the skill and luck you need, you can waste your life failing to solve the problem you took aim at, and end up neither accomplishing anything of importance nor having a comfortable life.
So there are reasons why most people don’t embrace “heroic responsibility”. But the premise for the whole thing—without which there’s nothing to be heroically responsible about—is, it seems to me, that you really think that this thing needs doing and you need to do it and that’s what’s best for the world.
(“Heroic responsibility” isn’t only about tasks so big that they consume your entire life. You can take heroic responsibility for smaller-scale things too, if they present themselves and seem important enough. But, again, I think what makes them opportunities for heroic responsibility is that combination of importantly worth doing and really intimidating.)
and if you are a normal person then you shrug your shoulders, say “damn, that’s too bad”, and get on with your life; but if you are infused with a sense of heroic responsibility then you devote your life to...
If you’re a normal person, shrugging your shoulders when faced with such things is beneficial for two reasons: shrugging instead of trying to be heroic about the destruction of civilization serves as immunity against crazy ideas, and, since you’re running on corrupted hardware, you probably aren’t as good at figuring out how to avoid the destruction of civilization as you think.
Just saying “I’m not going to shrug my shoulders; I’m going to be heroic instead” removes checks and balances that are themselves irrational but that protect you against other kinds of bad rationality, leaving you worse off overall.
I am inclined to agree; I am not a fan of the idea of “heroic responsibility”. (Though I think most of us could stand to be a notch or two more heroic than we currently are.)
Well, here is a counter-example. I can’t imagine that was too intimidating :-/