I do believe “Google wins” is a likely scenario—more likely than “X wins” for any other single value of X.
This is something of a non-sequitur. “Google wins” might be more likely than “X wins” for any other X we can name today, and still very unlikely in absolute terms. Like a lottery with a thousand tickets, where one person has bought ten of them and each remaining ticket goes to a different person: the ten-ticket holder is the likeliest single winner, yet still loses 99% of the time. Let us be precise: conditional on there being a singularity before 2040, what is your probability that Google initiates it?
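The arithmetic behind the lottery analogy can be sketched in a few lines (the specific numbers are the analogy's, not a claim about Google):

```python
# Toy lottery: 1,000 tickets total; one player holds 10 tickets,
# the other 990 players hold 1 ticket each.
tickets = {"big_player": 10}
for i in range(990):
    tickets[f"player_{i}"] = 1

total = sum(tickets.values())  # 1,000 tickets
p_win = {name: n / total for name, n in tickets.items()}

# The big player is the single most likely winner...
assert max(p_win, key=p_win.get) == "big_player"
# ...yet still wins only 1% of the time.
print(p_win["big_player"])  # 0.01
```

So “most likely single winner” and “likely to win” come apart whenever the probability mass is spread across many competitors.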
First, take this 2-question quiz:
Before reading this post, had you already recognized the impact that the size of the organization that builds the first AI has on its probability of FOOMing, formed an estimate of the likelihood of the first AI coming from a big or a small organization, and factored that into your estimate for the expected speed of a FOOM?
Did you upvote this post?
If your answer to the first question is yes, then I might try to answer a variant of your question.
If your answer to the first question is no, and your answer to the second question is no because you think this is not something you need to factor into that probability, then it would be counterproductive for me to answer questions on tangential issues.
If your answer to the first question is no, and your answer to the second question is no because you’re waiting for more specifics, why should I bother, since you’ve already decided my answer would be worth nothing to you?
A relevant quote from Site Mechanics:
Please do not vote solely based on how much you agree or disagree with someone’s conclusions. A better heuristic is to vote based on how much a comment or post improves the accuracy of your map. For example, a comment you agree with that doesn’t add to the discussion should be voted down or left alone. A comment you disagree with that raises important points should be voted up.
So the specific probability I assign to Google in particular winning is irrelevant to voting, and IMHO something of a digression from the main point of the post. If you care, make up your own probability. That’s what you should be doing anyway. I’ve given you many relevant facts.
A more important question is: what is the probability distribution over “capital investment that will produce the first average-human-level AI”? I expect the probabilities to be dominated by large investments, because the probability distribution over “capital investment that will produce the first X” appears to me to be dominated by large investments for similarly ambitious X, such as “spaceflight to the moon” or “sequence of the human genome”. A very clever person could have invented low-cost genome sequencing in the 1990s and sequenced the genome him/herself. But no very clever person did.