Has MIRI put any thought into attempting to convince specifically Larry Page or Sergey Brin that AGI is dangerous and we need to rush FAI first? If they succeeded at that (soon), I would no longer have to assign >50% probability to a paperclipping.
I work at Google and have given it some serious thought. Without going into too much detail, the timing is not yet right. However, steps are being taken in that direction.