Nope. There is no composition fallacy where there is no composition. I am replying to your position, not to mine.
I do care about tomorrow, which is not the long run.
I don’t think we should assume that AIs will have any goals at all, and I rather suspect they will not, in the same way that humans do not, only more so.
Not really. I don’t care if that happens in the long run, and many people wouldn’t.
I considered submitting an entry basically saying this, but decided that it would be pointless since obviously it would not get any prize. Human beings do not have coherent goals even individually. Much less does humanity.
Right. Utilitarianism is false, but Eliezer was still right about torture and dust specks.
Can we agree that I am not trying to proselytize anyone?
No, I do not agree. You have been trying to proselytize people from the beginning and are still trying.
(2) Claiming authority or pointing skyward to an authority is not a road to truth.
This is why you need to stop pointing to “Critical Rationalism” etc. as the road to truth.
I also think claims to truth should not be watered down for social reasons. That is to disrespect the truth. People can mistake not watering down the truth for religious fervour and arrogance.
First, you are wrong. You should not mention truths that it is harmful to mention in situations where it is harmful to mention them. Second, you are not “not watering down the truth”. You are making many nonsensical and erroneous claims and presenting them as though they were a unified system of absolute truth. This is quite definitely proselytism.
I basically agree with this, although 1) you are expressing it badly, 2) you are incorporating a true fact about the world into part of a nonsensical system, and 3) you should not be attempting to proselytize people.
Nothing to see here; just another boring iteration of the absurd idea of “shifting goalposts.”
There really is a difference between a general learning algorithm and specifically focused ones, and indeed, anything that can generate and test and run experiments will have the theoretical capability to control pianist robots and scuba dive and run a nail salon.
Do you not think the TCS parent has also heard this scenario over and over? Do you think you’re like the first one ever to have mentioned it?
Do you not think that I am aware that people who believe in extremist ideologies are capable of making excuses for not following the extreme consequences of their extremist ideologies?
But this is just the same as a religious person giving excuses for why the empirical consequences of his beliefs are the same whether his beliefs are true or false.
You have two options:
1) Embrace the extreme consequences of your extreme beliefs.
2) Make excuses for not accepting the extreme consequences. But then you will do the same things that other people do, like using baby gates, and then you have nothing to teach other people.
I should also have said that the stair-falling scenario and other similar scenarios are just excuses for people not to think about TCS.
You are the one making excuses, for not accepting the extreme consequences of your extremist beliefs.
I suppose you’re going to tell me that pushing or pulling my spouse out of the way of a car
Yes, it is.
Secondly, it is quite different from the stairway case, because your spouse would do the same thing on purpose if they saw the car, but the child will not move away when they see the stairs.
At that point I’ll wonder what types of “force” you advocate using against children that you do not think should be used on adults.
Who said I advocate using force against children that we would not use against adults? We use force against adults, e.g. putting criminals in prison. It is an extremist ideology to say that you should never use force against adults, and it is equally an extremist ideology to say that you should never use force against children.
I ignored you because your definition of force was wrong. That is not what the word means in English. If you pick someone up and take them away from a set of stairs, that is force if they were trying to move toward the stairs, even if they would not want to fall down them.
a baby gate
We were talking about force before, not violence. A baby gate is using force.
Children don’t want to fall down stairs.
They do, however, want to move in the direction of the stairs, and you cannot “help them not fall down stairs” without forcing them not to move in the direction of the stairs.
Saying it is “extremist” and then rejecting it, without giving arguments that can be criticised, would be rejecting rationality.
Nonsense. I say it is extremist because it is. The fact that I did not give arguments does not mean I am rejecting rationality. It simply means I am not interested in giving you arguments about it.
You don’t just get to use Bayes’ Theorem here without explaining the epistemological framework you used to judge the correctness of Bayes
I certainly do. I said that induction is not impossible, and that inductive reasoning is Bayesian. If you think that Bayesian reasoning is also impossible, you are free to establish that. You have not done so.
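For concreteness, here is a minimal sketch of what “inductive reasoning is Bayesian” can mean in practice. Everything in it (the candidate hypotheses, the uniform prior, the sunrise data) is an illustrative assumption of mine, not anything from the original exchange:

```python
# Toy illustration of Bayesian inductive reasoning (hypothetical example).
# Hypotheses: "the sun rises on any given day with probability p" for three candidate p's.
# We start with a uniform prior and update on repeated observed sunrises.

posterior = {0.5: 1 / 3, 0.9: 1 / 3, 0.99: 1 / 3}  # uniform prior over candidate p

def update(dist, observed_rise):
    """One application of Bayes' theorem to a single observation."""
    # Likelihood of the observation under each hypothesis p.
    unnormalized = {p: (p if observed_rise else 1 - p) * w for p, w in dist.items()}
    total = sum(unnormalized.values())
    return {p: w / total for p, w in unnormalized.items()}

for _ in range(100):  # a hundred consecutive observed sunrises
    posterior = update(posterior, observed_rise=True)

print(posterior)  # almost all weight is now on p = 0.99, but none of it is certainty
```

The point of the sketch is only that repeated observations shift the posterior toward the generalization without ever making it certain, which is the standard Bayesian reading of induction.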
Critical Rationalism can be used to improve Critical Rationalism and, consistently, to refute it (though no one has done so).
If this is possible, it would be equally possible to refute induction (if it were impossible) by using induction. For example, if, every time something had always happened up to a point, it never happened after that, then induction would be refuted by induction.
If you think that is inconsistent (which it is), it would be equally inconsistent to refute CR with CR, since if it were refuted, it could not validly be used to refute anything, including itself.
not initiating force against children as most parents currently do
Exactly. This is an extremist ideology. To give a couple of examples, parents should use force to prevent their children from falling down stairs, or from hurting themselves with knives.
I reject this extremist ideology, and that does not mean I reject rationality.
I said the thinking process used to judge the epistemology of induction is Bayesian, and my link explains how it is. I did not say it is an exhaustive explanation of epistemology.
What is the thinking process you are using to judge the epistemology of induction?
The thinking process is Bayesian, and uses a prior. I have a discussion of it here.
If you are doing induction all the time then you are using induction to judge the epistemology of induction. How is that supposed to work? … Critical Rationalism does not have this problem. The epistemology of Critical Rationalism can be judged entirely within the framework of Critical Rationalism.
Little problem there.
“[I]deas on this website” is referring to a set of positions. These are positions held by Yudkowsky and others responsible for Less Wrong.
This does not make it reasonable to call contradicting those ideas “contradicting Less Wrong.” In any case, I am quite aware of the things I disagree with Yudkowsky and others about. I do not have a problem with that. Unlike you, I am not a cult member.
Taking Children Seriously says you should always, without exception, be rational when raising your children. If you reject TCS, you reject rationality.
So it says nothing at all except that you should be rational when you raise children? In that case, no one disagrees with it, and it has nothing to teach anyone, including me. If it says anything else, it can still be an extremist ideology, and I can reject it without rejecting rationality.
“But of course the claims are separate, and shouldn’t influence each other.”
No, they are not separate, and they should influence each other.
Suppose your terminal value is squaring the circle using Euclidean geometry. When you find out that this is impossible, you should stop trying. You should go and do something else. You should even stop wanting to square the circle with Euclidean geometry.
What is possible directly influences what you ought to do and what you ought to desire.
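As background for the squaring-the-circle example (my addition, stating only the settled mathematics, not anything from the original comment), the impossibility follows from the transcendence of π:

```latex
% A unit circle has area \pi, so the desired square must have side \sqrt{\pi}.
% Compass-and-straightedge constructions yield only algebraic numbers,
% and Lindemann (1882) proved that \pi is transcendental.
\pi \notin \overline{\mathbb{Q}}
\;\Rightarrow\; \sqrt{\pi} \notin \overline{\mathbb{Q}}
\;\Rightarrow\; \sqrt{\pi} \text{ is not constructible}
\;\Rightarrow\; \text{the circle cannot be squared in Euclidean geometry.}
```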