EDIT: Downvoting this post sort of confirms my point that it’s all about signaling tribal affiliations.
If people downvoting you is evidence that you are right, then would people upvoting you have been evidence that you were wrong? Or does this kind of ‘confirmation’ not get conserved the way that evidence does?
And the evidence that donating to SIAI does anything other than signal affiliation is...?
… not required to refute your claim. That’s a goalpost shift. In fact, I explicitly allowed for the SIAI being utterly useless or worse than useless in the comment to which you replied. The claim I rejected is this:
Donating to SIAI is pure display of tribal affiliation
For that to be true it would require that there is nobody who believes that the SIAI does something useful and whose donating behaviour is best modelled as at least somewhat influenced by the desire to achieve the overt goal.
You also require that there are no other causal influences behind the decision, including forms of signalling other than tribal affiliation. I have already mentioned “reciprocation” as a non-“tribal affiliation” motivating influence. Even if I decided that the SIAI were completely unworthy of my affiliation, I would find it difficult to suppress the instinct to pay back at least some of what they gave me.
The SIAI has received anonymous donations. (The relevance should be obvious.)
Beliefs based on little evidence that people outside the tribe find extremely weird are one of the main forms of signaling tribal affiliation. Taking the Jesus story seriously is how people signal belonging to one of the Christian tribes, and taking the unfriendly AI story seriously is how people signal belonging to the lesswrong tribe.
No goalposts are being shifted here. Donating to SIAI because one believes lesswrong tribal stories is signaling that you hold these tribal-marker beliefs, and it still counts as pure, 100% tribal-affiliation signaling.
My reference here would be a fund to build the world’s largest Jesus statue. There seems to be a largest-Jesus contest ongoing: the record was broken twice in just a year, first in Poland and then in Peru, and now some Croatian group is trying to outdo them both. People who donate to these efforts might honestly believe this is a good idea. The details of why they believe so are highly complex, but this is a tribal-marker belief and nothing more.
Virtually nobody who’s not a local Catholic considers it one, just as virtually nobody who doesn’t share the “lesswrongian meme complex” considers what SIAI is doing a particularly good idea. I’m sure these funds got plenty of anonymous donations from local Catholics, and maybe some small amount of money from off-tribal people (e.g. “screw religion, but a huge Jesus will be great for tourism here” / “friendly AI is almost certainly bullshit, but the weirdos are worth funding by Pascal’s wager”), but this doesn’t really change anything.
tl;dr Actions signaling beliefs that correlate with tribal affiliation are actions signaling tribal affiliation, regardless of how conscious this is.
tl;dr Actions signaling beliefs that correlate with tribal affiliation are actions [solely for] signaling tribal affiliation, regardless of how conscious this is.
(Edit based on context.)
This statement is either false or useless.
There are other reasons why someone could downvote your post. Your immediately assuming that it’s about tribal affiliations rather demonstrates the problem with your claim that it’s all about tribal affiliations.
They’ve published papers. Presumably, if we didn’t donate anything, they couldn’t publish papers. They also hand out paychecks to Eliezer. Eliezer is a tribal leader, so we want him to succeed! Between those two, we have proof that they’re doing more than just signalling affiliation.
The far better question is whether they’re doing something useful with that money, and whether it would be better spent elsewhere. That, I do not feel qualified to answer. I think even GiveWell gave up on that one.
Really? I thought we wanted the tribal leader to fail in a way that allowed ourselves or someone we have more influence over to take his place.
Or we want the tribal leader to be conveniently martyred at their moment of greatest impact. You know, for the good of the cause.
I think that depends on how we perceive the size of the tribe, our position within it, and the security of its status in the outside world...