I agree with most of this.
I would be modestly surprised, but not very surprised, if an A.G.I. could build a Dyson sphere that dims the sun by >20% in less than a couple of decades (I think a few percent isn’t enough to cause crop failure); within a century is plausible to me.
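As a rough illustration of what those dimming figures mean (a sketch using the approximate solar constant of ~1361 W/m², not a claim from the original comment):

```python
# Illustrative arithmetic: surface-of-atmosphere solar flux remaining
# after the Sun is dimmed by a given fraction. The solar constant value
# is an approximate, assumed figure (~1361 W/m^2 mean top-of-atmosphere flux).
SOLAR_CONSTANT = 1361.0  # W/m^2, approximate

def dimmed_flux(dim_fraction: float) -> float:
    """Flux remaining after the Sun is dimmed by dim_fraction (0.0-1.0)."""
    return SOLAR_CONSTANT * (1.0 - dim_fraction)

print(dimmed_flux(0.20))  # 20% dimming -> 1088.8 W/m^2
print(dimmed_flux(0.03))  # "a few percent" -> ~1320 W/m^2
```

The point of the contrast in the comment is that a few-percent loss leaves flux close to baseline, while a >20% loss is a qualitatively larger cut.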
I don’t think we would be squashed for our potential to build a competitor; once an A.G.I. had seized all available compute, a competitor would no longer be a serious threat.
I give a little more credence to various “unknown unknowns” about the laws of physics and the priorities of superintelligences, which might mean an A.G.I. would not care to exploit the resources we need.
Overall, rationalists are right to worry about being killed by A.G.I.