I hadn’t considered the opposite danger, of CFAR immunizing people against other forms of self-improvement. It seems like that effect wouldn’t be very strong, and it would be much less bad in any case?
Yeah, I guess it depends on what you’re trying to measure. From an individual perspective, getting sucked into a cult for a few years is much worse; but from a collective perspective, a smart EA or rationalist operating at only 50% of their counterfactual impact is probably the bigger loss.