I’ve been here a while. Your account is a few days old. Why are you here?
That’s not an answer. That’s an evasion.
Whether the world is burning or not is an interesting discussion, but I’m quite sure that better epistemology isn’t going to put out the fire.
Epistemology tells you how to think. Moral philosophy tells you how to live. You cannot even fight the fire without better epistemology and better moral philosophy.
Writing voluminous amounts of text on a vanity website isn’t going to do it either.
Why do you desire so much to impute bad motives to curi?
The question is ill-posed. Without context it’s too open-ended to have any meaning. But let me say that I’m here not to save the world. Is that sufficient?
Epistemology tells you how to think.
No, it doesn’t. It deals with acquiring knowledge. There are other things—like logic—which are quite important to thinking.
impute bad motives to curi?
I don’t impute bad motives to him. I just think that he is full of himself and has… delusions about his importance and relationship to truth.
No, it doesn’t. It deals with acquiring knowledge. There are other things—like logic—which are quite important to thinking.
Human knowledge acquisition happens by learning. It involves coming up with guesses and error-correcting those guesses via criticism in an evolutionary process. This is going on in your mind all the time, consciously and subconsciously. It is how we are able to think. And knowing how this works enables us to think better. This is epistemology. And the breakthrough in AGI will come from epistemology. At a very high level, we already know what is going on.
And knowing how this works enables us to think better.
Sure, but that’s not sufficient. You need to show that the effect will be significant, suitable for the task at hand, and the best use of the available resources.
Drinking CNS stimulants (such as coffee) in the morning also enables us to think better. So what?
And the breakthrough in AGI will come from epistemology.
The question is ill-posed. Without context it’s too open-ended to have any meaning.
This is just more evasion.
But let me say that I’m here not to save the world. Is that sufficient?
You know Yudkowsky also wants to save the world, right? That Less Wrong is ultimately about saving the world? If you do not want to save the world, you’re in the wrong place.
I don’t impute bad motives to him. I just think that he is full of himself and has… delusions about his importance and relationship to truth.
Hypothetically, suppose you came across a great man who knew he was great and honestly said so. Suppose also that this great man had some true new ideas you were unfamiliar with, but that contradicted many ideas you thought were important and true. In what way would your response to him be different from your response to curi?
Fail to ask a clear question, and you will fail to get a clear answer.
You know Yudkowsky also wants to save the world, right?
Not quite save—EY wants to lessen the chance that the humans will be screwed over by off-the-rails AI.
That Less Wrong is ultimately about saving the world?
Oh grasshopper, maybe you will eventually learn that not all things are what they look like, and even fewer are what they say they are.
you’re in the wrong place
I am disinclined to accept your judgement in this matter :-P
Hypothetically, suppose you came across a great man … In what way would your response to him be different to your response to curi?
Obviously it depends on the way he presented his new ideas. curi’s ideas are not new and were presented quite badly.
There are two additional points here. One is that knowledge is uncertain (fallible, if you wish), and knowledge about the future (= forecasts) is much more so. Great men rarely know they are great; they may guess at their role in history, but should properly be very hesitant about it.
Two, I’m much more likely to meet someone who knew he was Napoleon, the rightful Emperor of France, and honestly said so rather than a truly great man who goes around proclaiming his greatness. I’m sure Napoleon has some great ideas that I’m unfamiliar with—what should my response be?
And the breakthrough in AGI will come from epistemology.
How do you know that?