The funnest one off the top of my head is how Yudkowsky used to think that the best thing for altruists to do was build AGI as soon as possible, because that's the quickest way to solve poverty, disease, etc. and achieve a glorious transhuman future. In fact, when MIRI was founded, its mission was to build AGI as soon as possible. Then he thought more (and talked to Bostrom, I was told) and realized that that's pretty much the exact opposite of what we should be doing.
(Disclaimer: This is the story as I remember it being told, it’s entirely possible I’m wrong)
He recounts this story in the Sequences.