Nice post; so many hidden assumptions lurk behind the words we use.
I wonder what some concrete examples of this in alignment discussions are, like your example about the probability that God exists.
One that comes to mind is a recent comment thread on one of the Late 2021 MIRI Conversations posts where we were assigning probabilities to “soft takeoff” and “hard takeoff” scenarios. Then Daniel Kokotajlo realized that “soft takeoff” had to be disambiguated, because in that context some people were using it to mean any kind of gradual advancement in AI capabilities, whereas others were using it to mean specifically “GDP doubling in 4 years, then doubling in 1 year”.