Why not simply push to acquire financial resources and hire brilliant people to do the work you think is necessary?
The point is to obtain an insurmountable lead in WBE tech; otherwise you’ll just spur competition and probably end up with Robin Hanson’s Malthusian scenario. (If intelligence explosion were possible, you could win the WBE race by a small margin and translate that into a big win, but for this post I’m assuming that intelligence explosion isn’t possible, so you need to win the race by a large margin.)
[comment from the heart, rather than from the head: your description of MSI-1 sounds kind of, well, totalitarian. Don’t you think that’s a little peculiar?]
Since exploiting intelligence explosion still requires FAI, and FAI could be very difficult, you might still need a large enough margin to perform all the necessary FAI research before your competition stumbles on an AGI.
In that case you’re in for a surprise when you find out what I was referring to by “WBE-enabled institutional controls” for MSI-2. Read Carl Shulman’s Whole Brain Emulation and the Evolution of Superorganisms.