But perhaps the proportion of FAIs that ‘kill all humans’ is large.
Maybe the probability you estimate for that to happen is high, but “proportion” doesn’t make sense, since an FAI is defined as an agent acting on a specific preference, so FAIs have to agree on what to do.
OK, I’m new to this.