I was all prepared to vote this up from “If Eliezer had simply been obsessed by saving the world from asteroids, would they all be focused on that?”. But then you had to go and be wrong—suggesting some sort of “lazy ideas only” search process that makes no sense historically, and conflating LW and SI.
Can you point to something I said that you think is wrong?
My understanding of the history (from reading an interview with Eliezer) is that Eliezer concluded the singularity was the most important thing to work on and then decided the best way to get other people to work on it was to improve their general rationality. But whether that’s true or not, I don’t see how that’s inconsistent with the notion that Eliezer and a bunch of people similar to him are suffering from motivated reasoning.
I also don’t see how I conflated LW and SI. I said many LW readers worry about UFAI and that SI has taken the position that the best way to address this worry is to do philosophy.
You’re right that you can interpret FAI as motivated reasoning. I guess I should have considered alternate interpretations more.
Eliezer concluded the singularity was the most important thing to work on and then decided the best way to get other people to work on it was to improve their general rationality.
Well, kinda. Eliezer concluded the singularity was the most important thing to work on and then decided the best way to work on it was to code an AI as fast as possible, with no particular regard for safety.
I also don’t see how I conflated LW and SI
“[...] arguing about ideas on the internet” is what I was thinking of. It’s a LW-describing sentence in a non-LW-related area. Oh, and “Why rationalists worry about FAI” rather than “Why SI worries about FAI.”
I was all prepared to vote this up from “If Eliezer had simply been obsessed by saving the world from asteroids, would they all be focused on that?”.
Maybe. So far as I know, averting asteroids doesn’t have as good a writer to inspire people.
Two people have been confused by the “arguing about ideas” phrase, so I changed it to “thinking about ideas”.
It’s more polite, and usually more accurate, to say “I sent a message I didn’t want to, so I changed X to Y.”
Most accurate would be “feedback indicates that a message was received that I didn’t intend to send, so...”