I thought it was generally agreed that, as a participant, the rational thing to do is to examine one's intuition and logic, make sure one's System 1 and System 2 thinking match up, and then reexamine them for possible biases. Hence it seems highly irrational to bring up something like
I have a strong intuition that OpenCog can be made into a human-level general intelligence, and that if this intelligence is raised properly it will turn out benevolent and help us launch a positive Singularity. However, I can’t fully rationally substantiate this intuition
as a supporting argument.
Except that in a lot of cases your intuition is better than your conscious thinking. See Daniel Kahneman's Thinking, Fast and Slow.
That’s gotta be the best Cupertino I’ve seen in a while.
Thanks, fixed.
My point was that you would be irrational to seriously expect your opponent to share your intuition, not that your intuition is wrong.