And yet, from a consequentialist standpoint, there shouldn’t be.
Only if your reasoning is extremely reliable at estimating the consequences of your action or inaction. Otherwise you may end up doing more harm by acting than you would by not acting, which happens all the time. I am guessing that this is part of what keeps people from acting.