Okay, I’m about 6 boxes into the flowchart of their plan and already making “gaaah” noises. I’ll add a few further updates, but I may not have the willpower to make it through the whole thing.
Okay, finished. Wasn’t as bad as I’d expected given the beginning.
Short summary: spend decades developing a particularly powerful way to understand people, and then use that to do as much good as possible, e.g. by working like a grant-awards agency that really understands who deserves the money, or like a think tank that really understands what messages people will remember, etc.
If you sort of squint your eyes, it makes sense. On the other hand, I won’t hold my breath. For example, their plan only works if they beat everyone else to this understanding by more than the time it takes to recruit all the donors and prestige they’ll need (~5 years?). For another example, they’re sort of hamstrung by “Connection Theory” already.
If you write a longer comment or discussion post explaining what you found, e.g. how “Connection Theory” hamstrings them, I will upvote it.
Well, I don’t feel like it, but it might be fun to try and figure out why I’d say that from this summary.
Those linked basic claims look well falsified already.
People always believe that all of their intrinsic goods will be achieved... This is, according to Connection Theory, an inviolable law of the mind.
Wishful thinking is not THAT ubiquitous and unbeatable. Lots of people expect to die without an afterlife and wish it wasn’t so.
According to Connection Theory, the sole source of a person’s irrationality is that person’s need to believe that all of his or her intrinsic goods will be fulfilled. This need is a constraint; given this constraint, everyone forms the most reasonable beliefs that they can on the basis of the evidence they encounter.
Falsified all over the place, by most of the heuristics and biases literature for one, unless “that they can” is interpreted in a slippery fashion to describe whatever people in fact do.
According to Connection Theory, every action that every person takes is part of an implicit plan for achieving all of that person’s intrinsic goods. A person may pursue some intrinsic goods first and others later, but none can be permanently sacrificed.
This looks like it denies that people ever make real tradeoffs, but they do.
Got it in one.