There’s a debate between Tyler Cowen and philosopher Agnes Callard around valuing human lives with a number. Tyler Cowen starts by saying that it’s actually a complicated issue but that having some bounds that depend on circumstances is useful. Agnes Callard then says that you don’t need to put any value on human lives at all to make practical tradeoffs because you can think about obligations.
After hearing that exchange, the position that you must put monetary values on human lives to make good decisions seems questionable to me, and naive in its apparent unawareness of alternative ways to make such decisions.
Thinking about social actors making promises to each other, and then having obligations to deliver on those promises, is a valid model for thinking about who makes the effort to save people's lives.
It seems like “agent X puts a particular dollar value on human life” might be ambiguous between “agent X acts as though human lives are worth exactly N dollars each” and “agent X’s internal thoughts explicitly assign a dollar value of N to a human life”. I wonder if that’s causing some confusion surrounding this topic. (I didn’t watch the linked video.)