Good write-up; I’ll definitely use this to introduce others to LW. Maybe one more numbered point stressing the scope of this issue would help explain the inherent danger. I tried to read this post from the perspective of someone new to the topic, and for me it leaves the door open to the ‘not my problem’ argument, or to the position that ‘this will never affect my town/city/country, so why should I care?’
A hypothetical point #5 could stress the uncertainty between an isolated and a global disaster, and/or explain that, unlike with other technologies, we only get one chance to get this right. I’m sure there’s a more elegant way to word it; the possibility of a ‘rival species’ mentioned at the end should be enough to make this point, but someone new to this type of thinking could easily overlook the implications.