Re “It’s costly for AI to leave humans alive”, I think the best thing written on this is Paul’s comment here, the most relevant part of which is:
First, I think you should talk quantitatively. How many more resources can an AI get by killing humans? I’d guess the answer is something like 1 in a billion to 1 in a trillion.
If you develop as fast as possible you will wreck the human habitat and incidentally kill a lot of people. It’s pretty complicated to figure out exactly how much “keep Earth livable enough for human survival” will slow you down, since it depends a lot on the dynamics of the singularity. I would guess more like a month than a year, which results in a minuscule reduction in available resources. I think that (IMO implausible) MIRI-style views would suggest more like hours or days than months.
Incidentally, I think “byproducts of rapid industrialization trash Earth’s climate” is both much more important than the Dyson sphere and much more intuitively plausible.
You can get energy from harvesting the biosphere, and you can use it to develop slightly faster. This is a rounding error compared to the last factor though.
Killing most humans might be the easiest way to prevail in conflict. I think this is especially plausible for weak AI. For very powerful AI, it also seems like a rounding error. Even a moderately advanced civilization could spend much less than 1 in a trillion of its resources to have much less than 1 in a billion chance of being seriously inconvenienced by humanity.
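To see why a delay measured in months is such a tiny cost, here is a rough back-of-the-envelope sketch. It assumes (my assumptions for illustration, not Paul’s) that available resources scale with the volume reachable at roughly lightspeed, and that the window before cosmic acceleration puts new galaxies out of reach is on the order of ten billion years; under those assumptions a delay of d costs roughly 3d divided by that window.

```python
# Rough back-of-the-envelope: what fraction of eventually-reachable
# resources does a short delay in expansion cost?
#
# Assumptions (mine, for illustration; not from Paul's comment):
#   - resources scale with the volume reachable at ~lightspeed, i.e. ~ t^3
#   - the window before cosmic acceleration puts new galaxies out of
#     reach is on the order of 10 billion years

HORIZON_YEARS = 10e9  # assumed window for reaching new galaxies


def fraction_lost(delay_years: float, horizon_years: float = HORIZON_YEARS) -> float:
    """Fraction of reachable volume lost by delaying expansion.

    Reachable volume goes from horizon^3 to (horizon - delay)^3, so for
    small delays the loss is approximately 3 * delay / horizon.
    """
    return 1.0 - ((horizon_years - delay_years) / horizon_years) ** 3


for label, delay in [("one day", 1 / 365), ("one month", 1 / 12), ("one year", 1.0)]:
    print(f"{label:>9}: ~{fraction_lost(delay):.1e} of reachable resources")

# Prints roughly:
#   one day: ~8.2e-13 of reachable resources
# one month: ~2.5e-11 of reachable resources
#  one year: ~3.0e-10 of reachable resources
```

On those assumptions a month’s delay costs roughly 2.5e-11 of the reachable resources, about 1 in 40 billion, which lands inside the “1 in a billion to 1 in a trillion” range quoted above.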