Do you mean that he actively seeks to encourage young people to try to slow Moore’s Law, or that this is an unintentional consequence of his writings on AI risk topics?
I’m pretty sure that Roko means the second. If this idea were mentioned to Eliezer, I’m pretty sure he’d point out the minimal impact that any single human can have on this, even before one gets to whether or not it is a good idea.
Personally trying to slow Moore’s Law down is the kind of foolishness that Eliezer seems to inspire in young people...