Anna, it takes very little effort to rattle off a numerical probability, and most readers then come away with a (usually false) impression of precision of thought.
At the start of Causality, Judea Pearl explains why humans should (and usually do) use “causal” concepts rather than “statistical” ones. Although I do not recall whether he comes right out and says it, I definitely took away from Pearl the heuristic that stating your probability on some question is basically useless unless you also state the calculation that led to the number. I do recall that a bare number is what Pearl classifies as a statistical statement rather than a causal one. What you should usually do instead of stating a probability estimate is to share with your readers the parts of your causal graph that most directly impinge on the question under discussion.
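To make that concrete, here is a minimal sketch in Python of the difference I have in mind. The graph, the node names, and all of the numbers are hypothetical placeholders of my own, not anything from Pearl or from the present discussion; the point is only that the headline probability falls out of explicit parent factors that a reader can inspect and dispute, rather than being asserted bare.

    # Toy causal graph: reflection -> conversion. Every name and number
    # here is a hypothetical placeholder chosen purely for illustration.
    p_reflect = 0.4                   # P(person seriously reflects on the goal system)
    p_convert_given_reflect = 0.6     # P(conversion | serious reflection)
    p_convert_given_no_reflect = 0.1  # P(conversion | no serious reflection)

    # The headline number is derived by marginalizing over the parent
    # node, so every edge in the graph is open to criticism.
    p_convert = (p_reflect * p_convert_given_reflect
                 + (1 - p_reflect) * p_convert_given_no_reflect)
    print("P(conversion) = %.2f" % p_convert)  # prints 0.30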
So, unless Eliezer goes on to list one or more factors that he believes would cause a human to convert to or away from my system of valuing things (namely, goal system zero or GSZ), or one or more factors that he believes would tend to prevent other factors from causing such a conversion, I am going to go on believing that Eliezer has probably not reflected enough on the question for his numbers to be worth anything and that he is just blowing me off.
In summary, I tend to think that most uses of numerical probabilities on these pages have been useless. On this particular question I am especially sceptical because Eliezer has exhibited signs (which I am prepared to describe if asked) that he has not reflected on goal system zero enough to understand it well enough to make any numerical probability estimate about it.
I am busy with an urgent matter today, so I might take 24 hours to reply to replies to this.
Eliezer’s novella provides a vivid illustration of the danger of promoting what should have stayed an instrumental value to the status of a terminal value. Eliezer likes to refer to this all-too-common mistake as losing purpose. I like to refer to it as adding a false terminal value.
For example, eating babies was a valid instrumental goal when the Babyeaters were at an early stage of technological development. It is not, IMHO, evil to eat babies when the only alternative is chronic severe population pressure that will eventually lead either to your extinction or to the disintegration of your agricultural civilization, with a reversion to a more primitive existence in which technological advancement is slow, uncertain, and easily reversed by things like natural disasters.
But then babyeating became an end in itself.
By clinging to the false terminal value of babyeating, the Babyeaters caused their own extinction even though, at the time of their extinction, they had an alternative means of preventing a population explosion (namely, editing their own genome so that fewer babies are born; if they lacked the tech to do that, they could have asked the humans or the Superhappies for it).
In the same way, the humans in the novella and the Superhappies are the victims of a false terminal value, which we might call “hedonic altruism”: the goal of extinguishing suffering wherever it exists in the universe. In Sympathetic Minds, in the passage that starts with “Who is the most formidable, among the human kind?”, Eliezer explains some of the reasons why becoming motivated by the suffering of others has great instrumental value. Again, the fact that something has great instrumental value is no reason to promote it to a terminal value: when circumstances change, it may lose its instrumental value, and a terminal value, once created, tends to persist indefinitely, because by definition there is no criterion by which to judge a system of terminal values.
I hope that human civilization will abandon the false terminal value of hedonic altruism before it spreads to the stars. In other words, I hope that the dystopian human future portrayed in the novella can be averted.