First point: by “really want to do good” (the really is important here) I mean someone who would be fundamentally altruistic and would not have any status/power desire, even subconsciously.
I don’t think Conjecture is an “AGI company”: everyone I’ve met there cares deeply about alignment, and their alignment team is a decent fraction of the entire company. Plus, they’re funding the incubator.
I think it’s also a misconception that it’s a unilateralist intervention. They talked to other people in the community before starting it; it was not a secret.
First point: by “really want to do good” (the really is important here) I mean someone who would be fundamentally altruistic and would not have any status/power desire, even subconsciously.
Then I’d argue the dichotomy is vacuously true, i.e. it does not generally apply to humans. Humans are the result of evolution, and it’s likely that having a brain that (unconsciously) optimizes for status/power has been very adaptive.
Regarding the rest of your comment, this thread seems relevant.