Yeah, I think he assigns ~5% chance to FOOM, if I had to make a tentative guess. 10% seems too high to me. In general, my first impression of Hanson’s credences on a topic won’t be accurate unless I really scrutinize his claims. So it’s not weird to me that someone might wind up thinking Hanson believes there’s a <1% chance of AI x-risk.
Do you mean hard takeoff, or Yudkowsky’s worry that foom causes rapid value drift and destroys all value? I think Hanson puts maybe 5% on the latter and a much larger number, 10 or 20%, on hard takeoff.
Really? My impression was the opposite. He’s said things to the effect of “there’s nothing you can do to prevent value drift”, and he seems to think that whether we create EMs or not, our successors will hold values quite different from our own. See all the stuff about the current era being a dreamtime, the values of grabby aliens, etc.