It’s worth noting that inventing good ways of measuring issues is as important for developing the field as developing interventions.
I’d love to see posts on LessWrong that focus purely on how to measure a particular issue within rationality. We had some of that in the past with credence calibration, but it would be great if talking about how to measure what we care about were a larger part of rationalist discourse.
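For concreteness, here is a minimal sketch (in Python, with made-up forecasts) of the kind of metric credence calibration work tends to use, a Brier score over a set of probability estimates. The function and the example data are my own illustration, not taken from any particular post.

```python
# Minimal sketch: scoring credence calibration with a Brier score.
# The function name and example forecasts below are illustrative assumptions.
from typing import Sequence

def brier_score(credences: Sequence[float], outcomes: Sequence[int]) -> float:
    """Mean squared difference between stated credences and 0/1 outcomes.

    Lower is better; guessing 50% on every question scores 0.25.
    """
    if len(credences) != len(outcomes):
        raise ValueError("credences and outcomes must have the same length")
    return sum((c - o) ** 2 for c, o in zip(credences, outcomes)) / len(credences)

# Example: three stated probabilities and whether each event actually happened.
print(brier_score([0.9, 0.6, 0.2], [1, 1, 0]))  # -> 0.07
```

Even something this simple makes the measurement question concrete: you need a bank of resolvable questions, a way to record credences before resolution, and a scoring rule you trust, and each of those choices could be its own post.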
This conversation reminds me of measurement theory. I haven’t taken the course yet, but I hear it’s fairly abstract and applicable to social science; it starts from measuring physical dimensions and expands to probability. From a quick look at the big picture, though, it seems very mathematical, and I have no clue how it would measure bias, haha.