I would suggest that this is a useful thing to do on an individual level (to adjust for scope insensitivity and so forth) but a terrible thing to do on a group level (because it’s mind-killing). Smells too much like the Yellow Peril for my taste.
The Anthropomorphization Cannon is a powerful weapon, and if it were to fall into the wrong hands…
I feel that this position could be equally argued if the scopes were switched, given the following motivation.
...if we mentally anthropomorphised certain risks, then we’d be more likely to give them the attention they deserved.
—OP
For example, here's a harmless :-) play on your comment, all the while keeping the above motivation in mind.
I would suggest that this is a useful thing to do on a group level (because it’s mind-killing; take Yellow Peril for example) but a terrible thing to do on an individual level (to adjust for scope insensitivity and so forth).