On the other hand, you can also interpret it as “I’m pretty sure (on the basis of various intuitions etc.) that the vast majority of possible superintelligences aren’t conscious”. This isn’t an objective statement of what will happen
What do you mean by saying that this is not an objective statement or a prediction?
Are you saying that you think there’s no underlying truth to consciousness?
We know it’s measurable, because that’s basically ‘I think, therefore I am.’ It’s not impossible that someday we could come up with a machine or algorithm that can measure consciousness, so it’s not impossible that this ‘non-prediction’ or ‘subjective statement’ could be proved objectively wrong.
My most charitable reading of your comment is that you’re saying the post is highly speculative and based on ‘subjective’ (read: arbitrary) judgements. That is my position; it’s what I just said. It’s fanfiction.
I think even if you were to put at the start “this is just speculation, and highly uncertain,” it would still be inappropriate content for a site about thinking rationally, for a variety of reasons, one of which is that people will base their own beliefs on your subjective judgements or otherwise be biased by them.
And even when you speculate, you should never be assigning 90% probability to a prediction about CONSCIOUSNESS and SUPERINTELLIGENT AI.
God, it just hit me again how insane that is.
“I think that [property we cannot currently objectively measure] will not be present in [agent we have not observed], and I think that I could make 10 predictions of similar uncertainty and be wrong only once.”