And I will not, if at all possible, give any other human being the least cause to think that someone else might spark a better Singularity. I can make no promises upon the future, but I will at least not close off desirable avenues through my own actions.
A possible problem here is that your demanding entry requirements may well allow others with lower standards to create a superintelligence before you do.
So: since you seem to think that would be pretty bad, and since you say you are a consequentialist who believes in the greater good, you should probably act to stop them, for example by bringing the target nearer to you and stepping up your own efforts to get there first.