I’d really like to see Eliezer engage with this comment, because to me it looks like the well-foundedness of the following sentence is rightly being questioned:
> it’s naked mathematical truth that the task GPTs are being trained on is harder than being an actual human.
While I generally agree that powerful optimizers are dangerous, the fact that the GPT task and the “being an actual human” task are somewhat different has nothing to do with it.