I don’t think I have anything unique to add to this discussion. Basically I defer to Eliezer and Nick Bostrom for written arguments, since they are largely the ones who provided the arguments that led me to strongly believe we live in a world with hard takeoff via recursive self-improvement, one that will lead to a “singularity” in the sense that we pass some threshold of intelligence/capabilities beyond which we cannot meaningfully reason about or control what happens after the fact, though we may be able to influence how it happens in ways that don’t cut off the possibilities of outcomes we would be happy with.