creating exercises does not make our present lack of robust measures worse than it already is (...) they seemed interested, and started noticing sunk cost fallacy examples in their lives
Martial arts masters and psychotherapy gurus could say the same. Instead of sunk costs you could teach newbies to notice post-colonial alienation or intelligent design, and sure enough they’d get better at noticing that thing in their lives. I hear Scientologists do lots of exercises too. Maybe creating exercises before measures is a positive-expected-value decision, but I wouldn’t bet on that.
“Sunk cost” is a pretty well-defined idea: we can reliably figure out whether something is a sunk cost, and whether a decision commits the sunk cost fallacy, by checking whether the decision controls the amount of lost value and whether the (immutable) amount of lost value controls the decision. Skill at noticing the sunk cost fallacy would then be the ability to parse such situations quickly and automatically.
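To make that criterion concrete, here is a toy sketch (the function names, numbers, and decision procedures below are invented for illustration, not anything from the thread): give a decision procedure the same future payoffs under different sunk costs and check whether its choice changes.

```python
# Toy sketch of the criterion above. A decision commits the sunk cost fallacy
# when the immutable amount of lost value controls the decision, even though
# the decision cannot control that lost value. One operational test: hold the
# future payoffs fixed, vary the sunk cost, and see whether the choice changes.

def decision_depends_on_sunk_cost(decide, future_values, sunk_costs):
    """decide(future_values, sunk_cost) -> index of the chosen option.
    Returns True if varying the unrecoverable sunk cost changes the choice
    while future payoffs stay fixed, i.e. the lost value controls the decision."""
    choices = {decide(future_values, s) for s in sunk_costs}
    return len(choices) > 1

def stubborn(future_values, sunk_cost):
    # Fallacious: insists on "getting its money's worth" once a lot is sunk.
    if sunk_cost > 100:
        return 0  # keep going with option 0 ("finish the project")
    return max(range(len(future_values)), key=lambda i: future_values[i])

def forward_looking(future_values, sunk_cost):
    # Rational: only looks at what can still be gained.
    return max(range(len(future_values)), key=lambda i: future_values[i])

future = [2.0, 10.0]  # option 0: finish the project, option 1: switch to something better
print(decision_depends_on_sunk_cost(stubborn, future, sunk_costs=[0, 500]))         # True
print(decision_depends_on_sunk_cost(forward_looking, future, sunk_costs=[0, 500]))  # False
```

The stubborn procedure fails the test because the unrecoverable 500 flips its choice; the forward-looking one passes because only the still-obtainable payoffs matter to it.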
Testing the effectiveness of training a skill is easier than testing the usefulness of that skill, and I think figuring out how to train people to avoid a list of fallacies, or to reach correct decisions of standard kinds faster and more reliably, is a reasonable goal even if the practical usefulness of having those skills remains uncertain.
How do you think we should proceed?
The first task of your full-time hire should be coming up with rationality-measuring tools that are better than human intuition.
If Anna and I can’t think of a simple way, you seem to have a rather exaggerated idea of what the full-time hire needs to be able to do. I don’t understand why people are reading this ad and thinking, “Hm, they want Superperson!” But it clearly needs to be rewritten.
I would be very, very surprised if you and Anna literally came up with nothing of value on measuring rationality; I expect there’s some raw material for a full-time employee to test, tweak and build on. This just seems to me like a higher priority than curriculum-building, and achieving a measure that’s better than subjective impressions doesn’t even seem impossible to me.
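As one hypothetical example of such a measure (the code below is a toy sketch, not anything SI has proposed): elicit probability estimates for checkable yes/no questions and score them with a proper scoring rule such as the Brier score, which rewards calibrated, honest reporting and is harder to game than a subjective impression.

```python
# Hypothetical sketch of one crude quantitative measure: elicit probability
# estimates for yes/no questions, then compute the Brier score -- the mean
# squared error between the stated probability and the actual outcome.
# Lower is better; always answering "50%" scores 0.25.

def brier_score(predictions):
    """predictions: list of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened and 0 otherwise."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Example: three forecasts and what actually happened.
forecasts = [(0.9, 1), (0.7, 0), (0.2, 0)]
print(round(brier_score(forecasts), 2))  # 0.18
```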
Here’s how typical people read typical job ads (typically), especially ones that are this long: Read the title. Scan for a dollar sign or the words “salary” or “salary range”. If both are good enough, scan for the first bulleted list of qualifications. Most ads call these “required qualifications”. If the reader meets enough of these, they scan for the second bulleted list of qualifications which is usually called “preferred qualifications”. Then, if they meet enough of both of these, they’ll go back and start reading in detail to understand the position better before they consider sending in an application or contacting the hiring entity for more information.
I suspect that most people expected your job ad to follow this form, since it almost does. Your sections are labeled, in effect, “needed” and “bonus”. It’s not until you read the now-bolded details that you find out that not all of the “needed” stuff is actually required of the applicant, and that essentially any one of the needed qualifications will suffice. Basically, you don’t have any required qualifications; you have a general description of the sort of person you’re interested in and a list of preferred qualifications. In this regard the ad is defective: it fails to comport with the usual format.
Non-standard forms get experienced people’s hackles up; they often indicate that there’s something unprofessional about the organization.
It’s a project that has people such as you and lukeprog involved in it. (Luke wasn’t mentioned, but he was running the rationality camps etc., so people are going to associate him with this regardless of whether his name is actually mentioned.) You two can, with good reason, be considered Superpeople. I expect that many people will automatically assume that for a cause as important as this, you will only accept folks who are themselves Superpeople as well.
Don’t proceed. Stay at the drawing board until you figure out a viable attack. Stay there until you die, if you have to.
This seems like a rather extreme position to me. I’d be curious to hear you explain your thinking.
There isn’t much to explain. I just think that taking steps towards cultishness has lower expected utility than doing nothing.
To the extent that irrationality is a result of compartmentalization, creating a rationality measure may be the same thing as creating a way to measure how effectively you are accomplishing your goals, which will vary from person to person depending on what those goals are.
For most interesting goals I can think of, creating a rigorous quantitative measure is next to impossible. However, a few goals, like running a mile in under four minutes, lend themselves well to this approach. Perhaps SI could find a group of people pursuing such a goal and offer its services as rationality consultants?