You’re right that you can interpret FAI as motivated reasoning. I guess I should have considered alternate interpretations more.
Eliezer concluded the singularity was the most important thing to work on and then decided the best way to get other people to work on it was to improve their general rationality.
Well, kinda. Eliezer concluded the singularity was the most important thing to work on and then decided the best way to work on it was to code an AI as fast as possible, with no particular regard for safety.
I also don’t see how I conflated LW and SI.
“[...] arguing about ideas on the internet” is what I was thinking of. It’s a LW-describing sentence in a non-LW-related area. Oh, and “Why rationalists worry about FAI” rather than “Why SI worries about FAI.”
Two people were confused by the “arguing about ideas” phrase, so I changed it to “thinking about ideas”.
It’s more polite, and usually more accurate, to say “I sent a message I didn’t want to, so I changed X to Y.”
Most accurate would be “feedback indicates that a message was received that I didn’t intend to send, so...”