Wouldn’t the UFAI’s possible amorality give it an advantage over a morally fettered FAI?
Probably not enough to overcome much of a head start, especially since a consequentialist FAI could and would do anything necessary to win without fear of being corrupted by power in the process.
It would come down to whichever one was written to self-improve faster, not necessarily whichever launched first, if the second can quickly catch up.
True, to a limited extent. Still, if the theory about foom is correct, the time-scales involved may be very short, to the point where, barring an unlikely coincidence of development, the first one will take over the world before the second one is even fully coded. Even if that’s not the case, there will always be some cut-off ‘launch before this time or lose’ point. You always have to weigh up the chance that that cut-off is in the near future, bearing in mind that the amount of cleverness and effort needed to build an AGI will be decreasing all the time.